SoFunction
Updated on 2024-11-18

Python tracemalloc tracks memory allocation issues


The tracemalloc module is a debugging tool for tracing the memory blocks that Python has allocated.

It provides the following information:

  • Locate where an object allocated its memory
  • Produce statistics on Python's memory-block allocations, grouped by file or by line: total size, number of blocks, and average block size
  • Compare two memory snapshots to troubleshoot memory leaks
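In addition to snapshots, the module can report the current and peak traced memory directly via `tracemalloc.get_traced_memory()`; a minimal sketch (the allocation is just illustrative):

```python
import tracemalloc

tracemalloc.start()                          # begin tracing allocations
data = [bytes(1000) for _ in range(1000)]    # allocate roughly 1 MB
current, peak = tracemalloc.get_traced_memory()
print(f'current: {current} B, peak: {peak} B')
tracemalloc.stop()                           # stop tracing and discard trace data
```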

Showing the top 10 entries

Display the 10 source lines with the largest memory allocations:

import tracemalloc
 
tracemalloc.start()  # Start tracing memory allocations
# --- operational code start ---
n = 10000000
s = 0
for i in range(1, n):
    s += i
# --- operational code end ---
snapshot = tracemalloc.take_snapshot()  # Memory "camera"
top_stats = snapshot.statistics('lineno')  # Group allocation statistics by line
 
print('[Top 10]')
for stat in top_stats[:10]:  # Print the 10 entries that allocate the most memory
    print(stat)
 
# [Top 10]
# D:/MyPython/tracemalloc/:5: size=576 B, count=1, average=576 B
# D:/MyPython/tracemalloc/:7: size=28 B, count=1, average=28 B

Top 1: line 5 of the code occupies 576 B of memory.
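A snapshot can also be filtered before the statistics are computed, for example to exclude tracemalloc's own allocations and the frozen import machinery; a sketch using `Snapshot.filter_traces` (the dict allocations are just illustrative):

```python
import tracemalloc

tracemalloc.start()
data = [dict(x=i) for i in range(1000)]      # illustrative allocations
snapshot = tracemalloc.take_snapshot()
# Exclude frames from tracemalloc itself and the frozen import machinery
snapshot = snapshot.filter_traces((
    tracemalloc.Filter(False, tracemalloc.__file__),
    tracemalloc.Filter(False, '<frozen importlib._bootstrap>'),
))
for stat in snapshot.statistics('lineno')[:3]:
    print(stat)
```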

Comparing snapshots

Take two snapshots and display the differences between them:

import tracemalloc
 
tracemalloc.start()  # Start tracing memory allocations
snapshot0 = tracemalloc.take_snapshot()  # First snapshot
# --- operational code start ---
n = 10000000
s = 0
for i in range(1, n):
    s += i
# --- operational code end ---
snapshot1 = tracemalloc.take_snapshot()  # Second snapshot
top_stats = snapshot1.compare_to(snapshot0, 'lineno')  # Snapshot comparison
 
print('[Top 10 differences]')
for stat in top_stats[:10]:
    print(stat)
 
# [Top 10 differences]
# D:/MyPython/tracemalloc/:27: size=576 B (+576 B), count=1 (+1), average=576 B
# D:\Program Files\anaconda3\lib\:397: size=88 B (+88 B), count=2 (+2), average=44 B
# D:\Program Files\anaconda3\lib\:534: size=48 B (+48 B), count=1 (+1), average=48 B
# D:\Program Files\anaconda3\lib\:291: size=40 B (+40 B), count=1 (+1), average=40 B
# D:/MyPython/tracemalloc/:31: size=28 B (+28 B), count=1 (+1), average=28 B

Top 1: memory usage increased by 576 B at line 27 of the code.
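Each record returned by `compare_to` is a `StatisticDiff` whose fields can be read programmatically instead of printed; a sketch:

```python
import tracemalloc

tracemalloc.start()
snapshot0 = tracemalloc.take_snapshot()
data = [bytes(100) for _ in range(1000)]     # allocations between the snapshots
snapshot1 = tracemalloc.take_snapshot()

for stat in snapshot1.compare_to(snapshot0, 'lineno')[:3]:
    frame = stat.traceback[0]
    # size/count are totals; size_diff/count_diff are deltas between snapshots
    print(f'{frame.filename}:{frame.lineno} '
          f'+{stat.size_diff} B, +{stat.count_diff} blocks')
```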

Analyzing memory usage and leaks with tracemalloc

Overview

Python's memory management is driven by reference counting: once all references to an object are gone, the object can be cleared from memory to make room for other data.
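The reference count of any object can be observed with `sys.getrefcount` (the reported value includes the temporary reference created by the call itself); a small sketch:

```python
import sys

obj = object()
before = sys.getrefcount(obj)   # obj itself plus the call's own argument
holders = [obj, obj]            # add two more references
after = sys.getrefcount(obj)
print(after - before)           # → 2
```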

In theory, Python developers do not need to worry about how a program allocates and frees memory, because the Python language and the CPython runtime handle this automatically.

In practice, however, programs run out of memory when they fail to release references to data they no longer need. Below are some ways to inspect memory usage.

Viewing the total number of objects tracked by gc

Below is the code under test (saved as waste_memory.py); it creates objects and holds references to them, all of which gc tracks.

import os

class MyObject:
    def __init__(self):
        self.data = os.urandom(100)

def get_data():
    values = []
    for _ in range(100):
        obj = MyObject()
        values.append(obj)
    return values

def run():
    deep_values = []
    for _ in range(100):
        deep_values.append(get_data())
    return deep_values

The following code outputs the number of objects currently tracked by gc:

import gc

# Get the number of gc references before running
found_objects = gc.get_objects()
print('Before:', len(found_objects))

# Import the module to be tested
import waste_memory

# Functions to run the code to be tested
hold_reference = waste_memory.run()

# Get the number of objects referenced by gc after running the code
found_objects = gc.get_objects()
print('After: ', len(found_objects))
for obj in found_objects[:5]:
    print(repr(obj)[:100])

print('...')

Running the above code prints the total number of objects tracked by gc:

Before: 28834
After:  28923
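A raw total says little on its own; grouping the objects returned by `gc.get_objects()` by type shows which classes dominate. A sketch using only the standard library:

```python
import gc
from collections import Counter

# Tally the objects currently tracked by the collector, by type name
counts = Counter(type(o).__name__ for o in gc.get_objects())
for name, n in counts.most_common(5):
    print(f'{name}: {n}')
```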

Using tracemalloc to view memory allocations

1. Inspecting memory allocations

The gc count above is only a total and offers little guidance for analyzing memory allocation. The tracemalloc module can trace each allocation back to the location where it was made, so we can take a snapshot of memory usage before and after running the module above and analyze the difference between the two snapshots.

Here is the driver code:

import tracemalloc

tracemalloc.start(10)                      # Set stack depth
time1 = tracemalloc.take_snapshot()        # Before snapshot

import waste_memory

x = waste_memory.run()                     # Usage to debug
time2 = tracemalloc.take_snapshot()        # After snapshot

stats = time2.compare_to(time1, 'lineno')  # Compare snapshots
for stat in stats[:3]:
    print(stat)

Running the above code, you can see that each record carries size and count fields, which indicate how much memory the objects allocated by that line of code occupy and how many objects there are. Comparing the records shows which lines of code allocate the most memory.

/waste_memory.py:11: size=5120 B (+5120 B), count=80 (+80), average=64 B
/waste_memory.py:14: size=4424 B (+4424 B), count=79 (+79), average=56 B
/waste_memory.py:9: size=1704 B (+1704 B), count=8 (+8), average=213 B

2. Viewing stack traces

tracemalloc can also record stack trace information. Below we print the stack trace for the line of code that allocates the most memory in the program, to see along which path the program reaches that line.

import tracemalloc

tracemalloc.start(10)
time1 = tracemalloc.take_snapshot()

import waste_memory

x = waste_memory.run()
time2 = tracemalloc.take_snapshot()

stats = time2.compare_to(time1, 'traceback')
top = stats[0]
print('Biggest offender is:')
# Print stack information
print('\n'.join(top.traceback.format()))

Running the code above prints:

Biggest offender is:
  File "/with_trace.py", line 14
    x = waste_memory.run()
  File "/waste_memory.py", line 23
    deep_values.append(get_data())
  File "/waste_memory.py", line 16
    obj = MyObject()
  File "/waste_memory.py", line 11
    self.data = os.urandom(100)
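When a single suspicious object is already in hand, `tracemalloc.get_object_traceback` reports where that object was allocated directly, without comparing snapshots; a sketch (the bytearray stands in for the object being investigated):

```python
import tracemalloc

tracemalloc.start(10)              # keep up to 10 frames per allocation
data = bytearray(100_000)          # the object whose origin we want
tb = tracemalloc.get_object_traceback(data)
print('\n'.join(tb.format()))      # file/line chain that allocated `data`
tracemalloc.stop()
```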

Summary

The above reflects my personal experience; I hope it serves as a useful reference.