CPython Memory Insights
Memory profiling infrastructure for CPython core development
How Memory Insights Works
Community members running worker processes monitor the CPython repository for new commits. When a commit is pushed, workers can queue it for memory profiling across multiple configurations and environments.
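A worker's commit-watching loop can be sketched roughly as follows; the repository path, polling interval, and the queue_for_profiling helper are illustrative assumptions, not the actual service API:

```python
import subprocess
import time

REPO = "/srv/cpython"   # local clone of python/cpython (assumed path)
POLL_SECONDS = 300      # illustrative polling interval

def new_commits(last_seen: str) -> list[str]:
    """Return the SHAs on origin/main that are newer than last_seen (newest first)."""
    subprocess.run(["git", "-C", REPO, "fetch", "origin", "main"], check=True)
    out = subprocess.run(
        ["git", "-C", REPO, "rev-list", f"{last_seen}..origin/main"],
        check=True, capture_output=True, text=True,
    )
    return out.stdout.split()

def queue_for_profiling(sha: str) -> None:
    """Hypothetical stand-in for submitting a commit to the profiling queue."""
    print(f"queueing {sha} for memory profiling")

def watch(last_seen: str) -> None:
    while True:
        for sha in reversed(new_commits(last_seen)):  # process oldest first
            queue_for_profiling(sha)
            last_seen = sha
        time.sleep(POLL_SECONDS)
```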
Worker processes (which anyone can run with an API token) compile CPython with various optimization flags, then run a comprehensive suite of memory benchmarks using Memray. Each benchmark captures detailed allocation data, call stacks, and memory usage patterns.
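In broad strokes, a worker's build-and-profile step might look like the sketch below. The source path, configure flags, benchmark script, and capture file name are placeholders, and the sketch assumes Memray has been installed into the freshly built interpreter:

```python
import subprocess

def build_cpython(src_dir: str, configure_flags: list[str]) -> str:
    """Configure and build CPython, returning the path of the built interpreter
    (named `python` on Linux; platform differences are glossed over here)."""
    subprocess.run(["./configure", *configure_flags], cwd=src_dir, check=True)
    subprocess.run(["make", "-j"], cwd=src_dir, check=True)
    return f"{src_dir}/python"

def run_benchmark(python_bin: str, benchmark: str, capture_file: str) -> None:
    """Run one benchmark under Memray, writing allocation data to capture_file.
    Assumes Memray is installed into the freshly built interpreter."""
    subprocess.run(
        [python_bin, "-m", "memray", "run",
         "--native",          # capture C-level frames from the interpreter itself
         "-o", capture_file,
         benchmark],
        check=True,
    )

if __name__ == "__main__":
    python_bin = build_cpython("/srv/cpython", ["--enable-optimizations", "--with-lto"])
    run_benchmark(python_bin, "benchmarks/dict_growth.py", "dict_growth.bin")
```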
Results are processed and stored with rich metadata. The web interface provides interactive visualizations, trend analysis, and detailed flamegraphs to help developers understand memory behavior and identify regressions.
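The metadata attached to one stored result might resemble the following hypothetical schema (the real storage format is not specified here):

```python
from dataclasses import dataclass

@dataclass
class BenchmarkResult:
    """Illustrative shape of one stored profiling result (assumed, not the actual schema)."""
    commit_sha: str            # CPython commit that was profiled
    build_flags: list[str]     # e.g. ["--enable-optimizations", "--with-lto"]
    benchmark: str             # name of the memory benchmark
    peak_memory_bytes: int     # high-water mark reported by Memray
    total_allocations: int     # number of allocations recorded
    capture_file: str          # location of the Memray capture used for flamegraphs
    worker_id: str             # which community worker produced the result
```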
What you can do with this
CPython commits are profiled by community workers across multiple build configurations. Track memory usage trends over time, identify sudden regressions, and validate that memory optimizations are working as expected.
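One simple way to flag a sudden regression in such a series is to compare each commit's peak memory against its predecessor. A minimal sketch, assuming per-commit peak measurements have already been collected:

```python
def find_regressions(series: list[tuple[str, int]], threshold: float = 0.05) -> list[str]:
    """Flag commits whose peak memory grew by more than `threshold` (5% by default)
    relative to the immediately preceding commit.
    `series` is a chronological list of (commit_sha, peak_memory_bytes) pairs."""
    flagged = []
    for (_, prev_peak), (sha, peak) in zip(series, series[1:]):
        if prev_peak > 0 and (peak - prev_peak) / prev_peak > threshold:
            flagged.append(sha)
    return flagged

# Made-up numbers: the jump at commit "c3" is flagged.
history = [("c1", 100_000_000), ("c2", 101_000_000), ("c3", 120_000_000)]
print(find_regressions(history))  # ['c3']
```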
Compare memory usage across different build flags (--enable-optimizations, --with-lto, debug builds), Python versions, and hardware environments. Understand how different configurations affect memory consumption.
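Comparing configurations then amounts to grouping measurements by build flags; a small sketch, assuming result records shaped like the hypothetical BenchmarkResult above:

```python
from collections import defaultdict
from statistics import mean

def peak_by_configuration(results):
    """Average peak memory per build configuration.
    `results` is any iterable of records with .build_flags and .peak_memory_bytes,
    such as the hypothetical BenchmarkResult above."""
    grouped = defaultdict(list)
    for r in results:
        grouped[tuple(r.build_flags)].append(r.peak_memory_bytes)
    return {flags: mean(peaks) for flags, peaks in grouped.items()}
```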
Flamegraphs generated by Memray let you explore exactly where memory is allocated in the CPython codebase. Click through call stacks, zoom into specific functions, and identify memory hotspots.
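The same capture files can also be rendered locally with Memray's flamegraph reporter; a minimal sketch (file names are placeholders):

```python
import subprocess

def render_flamegraph(capture_file: str, output_html: str) -> None:
    """Turn a Memray capture into an interactive HTML flamegraph,
    the same kind of view the web interface exposes."""
    subprocess.run(
        ["python", "-m", "memray", "flamegraph", "-o", output_html, capture_file],
        check=True,
    )

render_flamegraph("dict_growth.bin", "dict_growth_flamegraph.html")
```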
Access memory profiling data for any commit in CPython's history. Compare commits across weeks, months, or years to understand long-term memory usage trends and validate that optimizations have lasting impact.
CPython Memory Insights is built on top of Memray, Bloomberg's powerful memory profiler for Python. Memray provides the core profiling capabilities that make it possible to track memory allocations with minimal overhead and generate detailed flamegraphs for analysis.
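For reference, Memray's tracking can also be driven directly from Python code; a minimal example using memray.Tracker (the output path and workload are placeholders):

```python
import memray

# Record every allocation made inside the block into a capture file; the file can
# later be rendered with `python -m memray flamegraph allocations.bin`.
with memray.Tracker("allocations.bin"):
    data = [bytes(1024) for _ in range(10_000)]  # placeholder workload to profile
```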
The CPython Memory Insights maintainers are here to help.
Reach out via GitHub or the CPython Discord channel.