⚡️ GPU Memory Calculator for LLMs ⚡️


Disclaimer: I built this tool because I couldn't find an easy way to calculate GPU memory requirements for large language models. Every time I wanted to run a model locally, I had to look up each model's specifications manually or get lost in endless Reddit threads. This tool provides an approximate calculation of memory requirements; actual requirements may vary depending on additional factors such as the model architecture, GPU teraflops, memory bandwidth, and other implementation details.
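
For context, the kind of estimate the tool produces follows a widely used rule of thumb: memory ≈ parameter count × bytes per parameter × an overhead factor (commonly ~1.2) to account for things like the KV cache and activations. Below is a minimal Python sketch of that estimate; the function name, the 1.2 overhead factor, and the example values are illustrative assumptions, not the tool's exact implementation:

```python
def estimate_gpu_memory_gb(params_billion: float, bits_per_param: int,
                           overhead: float = 1.2) -> float:
    """Rough GPU memory estimate (in GB) for LLM inference.

    params_billion: model size in billions of parameters (e.g. 7 for a 7B model)
    bits_per_param: precision (16 = FP16, 8 = INT8, 4 = 4-bit quantization)
    overhead: multiplier covering KV cache, activations, and framework overhead
              (1.2 is a common rule-of-thumb value, not a measured constant)
    """
    bytes_per_param = bits_per_param / 8
    return params_billion * bytes_per_param * overhead

# Example: a 7B model in FP16 is roughly 7 * 2 * 1.2 ≈ 16.8 GB
print(f"{estimate_gpu_memory_gb(7, 16):.1f} GB")  # ~16.8 GB
print(f"{estimate_gpu_memory_gb(7, 4):.1f} GB")   # ~4.2 GB with 4-bit quantization
```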

If you found this tool helpful, feel free to star it and share it with others!

Thank you for your support!