Python: 1500% memory overhead from loading a binary file


I am trying to load a binary file containing 120MB of data using the following routine:

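(The code listing itself did not survive in this copy of the post. Judging from the answer below, which assumes 12-byte records unpacked into 3-tuples and collected in a list, it presumably looked something like this sketch; the function name load_data and the '<ffi' record layout are assumptions, not the poster's actual code.)

    import struct

    def load_data(path):
        # Hypothetical reconstruction of the lost routine: read one
        # 12-byte record at a time (assumed layout: two 4-byte floats
        # and one 4-byte int) and append the unpacked 3-tuple to a list.
        record = struct.Struct('<ffi')
        data = []
        with open(path, 'rb') as f:
            while True:
                chunk = f.read(record.size)
                if len(chunk) < record.size:
                    break
                data.append(record.unpack(chunk))
        return data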

However, when I execute it, my Python process ends up using 2.2 GB of RAM, which is... well, very wrong. Are there any obvious errors that could explain this behavior? Am I misusing some Python feature?

One more thing: I do not want to use a generator function for this, I really need all the data in memory.

Answer

I would try to unpack more data at a time to cut the per-object memory overhead. Your data takes around 12 bytes per 3-tuple, so for a 120 MB file you have 10 million tuples.
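One way to act on that, while still keeping everything in memory, is to unpack the whole buffer in one pass and store the three fields in flat array.array columns instead of 10 million boxed tuples. A sketch, again assuming the '<ffi' record layout from above (array.array and struct.iter_unpack are standard library, but nothing here is quoted from the original answer):

    import struct
    from array import array

    def load_columns(path):
        # Flat, unboxed storage: about 12 bytes per record in total,
        # instead of ~135 bytes for a (float, float, int) tuple.
        xs, ys, ns = array('f'), array('f'), array('i')
        with open(path, 'rb') as f:
            raw = f.read()
        # struct.iter_unpack (Python 3.4+) walks the buffer record by
        # record; the temporary tuples it yields are freed immediately.
        for x, y, n in struct.iter_unpack('<ffi', raw):
            xs.append(x)
            ys.append(y)
            ns.append(n)
        return xs, ys, ns

The 120 MB raw buffer is still resident during the loop, so peak usage is roughly twice the file size rather than eighteen times.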

If you look at the sizes of the individual objects, you can see that:

  • An integer has an overhead of 24 bytes.
  • A float also does.
  • The 3-tuple wrapping them (plus its slot in the list) adds roughly another 63 bytes.

10 million tuples at (2 * 24 + 24 + 63) bytes each weigh in at 1.35 GB, and on top of that the growing list probably adds extra overhead of its own.
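The arithmetic is easy to reproduce on a 64-bit CPython; the snippet below is illustrative (exact sizes vary a little across interpreter versions and are not quoted from the original answer):

    import sys

    # Typical per-object sizes on 64-bit CPython 3:
    print(sys.getsizeof(1.0))            # float object: 24 bytes
    print(sys.getsizeof(123456789))      # int object: 28 bytes here
    print(sys.getsizeof((1.0, 2.0, 3)))  # the 3-tuple shell: 80 bytes

    # The answer's estimate: two floats + one int + ~63 bytes of
    # tuple/list overhead per record, times 10 million records.
    per_record = 2 * 24 + 24 + 63
    print(10_000_000 * per_record / 1e9)  # -> 1.35, i.e. about 1.35 GB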

