Find out how much memory is being used by an object in Python

Python | Performance | Memory Profiling

Python Problem Overview


How would you go about finding out how much memory is being used by an object? I know it is possible to find out how much is used by a block of code, but not by an instantiated object (anytime during its life), which is what I want.

Python Solutions


Solution 1 - Python

Try this:

import sys

sys.getsizeof(object)

getsizeof() returns the size of an object in bytes. It calls the object's __sizeof__ method and adds additional garbage collector overhead if the object is managed by the garbage collector.

A recursive recipe
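
The recipe linked above walks container objects recursively so that referenced objects are counted too. A minimal sketch of the same idea (the function name total_size and the set of handled container types are my own choices, not taken from the recipe):

    import sys

    def total_size(obj, seen=None):
        """Roughly estimate the footprint of obj plus everything it references."""
        if seen is None:
            seen = set()
        if id(obj) in seen:        # already counted (shared or cyclic reference)
            return 0
        seen.add(id(obj))

        size = sys.getsizeof(obj)

        if isinstance(obj, dict):
            size += sum(total_size(k, seen) + total_size(v, seen)
                        for k, v in obj.items())
        elif isinstance(obj, (list, tuple, set, frozenset)):
            size += sum(total_size(item, seen) for item in obj)

        # Instance attributes live in __dict__, so include it as well.
        if hasattr(obj, '__dict__'):
            size += total_size(vars(obj), seen)

        return size

For example, total_size({'a': [1, 2, 3]}) counts the dict, the key, the list and its three integers, whereas sys.getsizeof on the same dict counts only the dict's own structure.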

Solution 2 - Python

There's no easy way to find out the memory size of a Python object. One of the problems you may find is that Python objects - like lists and dicts - may have references to other Python objects (in this case, what would its size be? The size including the size of each referenced object, or not?). There is some pointer overhead, plus internal structures related to object types and garbage collection. Finally, some Python objects have non-obvious behaviors. For instance, lists reserve space for more objects than they contain, most of the time; dicts are even more complicated, since they can operate in different ways (they have a different implementation for a small number of keys, and they sometimes over-allocate entries).
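
As a small illustration (my own snippet, not part of the original answer): the size reported for a list covers only the list's own structure and its element pointers, not the elements themselves, and the over-allocation shows up as the size growing in steps rather than per element:

    import sys

    outer = [[0] * 1000 for _ in range(10)]
    print(sys.getsizeof(outer))      # only the outer list: a handful of pointers
    print(sys.getsizeof(outer[0]))   # one inner list; the ints themselves still aren't counted

    growing = []
    for i in range(5):
        growing.append(i)
        # CPython over-allocates so repeated appends stay amortized O(1),
        # so the reported size jumps occasionally instead of on every append.
        print(i, sys.getsizeof(growing))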

There is a big chunk of code (and an updated big chunk of code) out there that tries to approximate the size of a Python object in memory as closely as possible.

You may also want to check an old description of PyObject (the internal C struct that represents virtually all Python objects).

Solution 3 - Python

I don't have any personal experience with either of the following, but a simple search for a "Python [memory] profiler" yields:

  • PySizer, "a memory profiler for Python," found at <http://pysizer.8325.org/>. However, the page seems to indicate that the project hasn't been updated for a while, and refers to...

  • Heapy, "support[ing] debugging and optimization regarding memory related issues in Python programs," found at <http://guppy-pe.sourceforge.net/#Heapy> (a minimal usage sketch follows this list).
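
If you try Heapy, the usual entry point is guppy's hpy() object (the package is published as guppy3 for Python 3); a minimal sketch, assuming the package is installed:

    from guppy import hpy   # pip install guppy3 on Python 3

    hp = hpy()
    heap = hp.heap()   # snapshot of every object currently on the heap
    print(heap)        # summary grouped by type, with counts and total sizes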

Hope that helps.

Solution 4 - Python

This must be used with care, because an override of the object's __sizeof__ method might be misleading.

Using the bregman.suite, some tests with sys.getsizeof report a copy of an array attribute (data) of an object instance as being bigger than the object itself (mfcc).

>>> mfcc = MelFrequencyCepstrum(filepath, params)
>>> data = mfcc.X[:]
>>> sys.getsizeof(mfcc)
64
>>> sys.getsizeof(mfcc.X)
80
>>> sys.getsizeof(data)
80
>>> mfcc
<bregman.features.MelFrequencyCepstrum object at 0x104ad3e90>
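
For context: mfcc.X here is presumably a NumPy array (my assumption, not stated in the answer). For arrays, and especially for views or slices that don't own their data buffer, sys.getsizeof can report little more than the object header, so nbytes is usually the more informative number:

    import sys
    import numpy as np

    a = np.zeros(100_000)    # array that owns its 800,000-byte data buffer
    view = a[:]              # a view: shares the buffer, owns no data of its own

    print(sys.getsizeof(view))   # roughly just the array header
    print(view.nbytes)           # size of the element data the view refers to
    print(a.nbytes)              # same underlying buffer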

Solution 5 - Python

For big objects you may use a somewhat crude but effective method: check how much memory your Python process occupies in the system, then delete the object and compare.

This method has many drawbacks, but it gives you a very fast estimate for very big objects.
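
A rough sketch of that approach using the third-party psutil package (the helper name rss_bytes is mine; on Unix, resource.getrusage can serve the same purpose):

    import gc
    import os

    import psutil   # third-party: pip install psutil

    def rss_bytes():
        """Resident set size of the current process, in bytes."""
        return psutil.Process(os.getpid()).memory_info().rss

    big = [bytes(1024) for _ in range(100_000)]   # the "big object" under test
    with_object = rss_bytes()

    del big
    gc.collect()               # make sure the memory is actually released
    without_object = rss_bytes()

    print("approx. size:", with_object - without_object, "bytes")

The estimate is only as good as the operating system's accounting: CPython may keep freed memory in its own allocator instead of returning it to the OS, so treat the number as a ballpark figure.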

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

| Content Type        | Original Author | Original Content on Stackoverflow |
|---------------------|-----------------|-----------------------------------|
| Question            | dwestbrook      | View Question on Stackoverflow    |
| Solution 1 - Python | Uzer            | View Answer on Stackoverflow      |
| Solution 2 - Python | fserb           | View Answer on Stackoverflow      |
| Solution 3 - Python | jcsalterego     | View Answer on Stackoverflow      |
| Solution 4 - Python | rafaelvalle     | View Answer on Stackoverflow      |
| Solution 5 - Python | user3791116     | View Answer on Stackoverflow      |