AirLLM enables inference of 70B-parameter large language models on a single 4GB GPU, making large-model inference accessible without expensive hardware.
Tue, 10 Mar 2026 11:43:16 UTC
There was a problem with this listing's funding.json manifest. If it is not fixed, the listing will be removed from the portal.
Crawl error: https://github.com/lyogavin/airllm/blob/main/funding.json?raw=true returned HTTP 502 (Bad Gateway).
The funding manifest has not provided wellKnown proof that this link is associated with it.
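The wellKnown error arises because the manifest is served from github.com, while the listing's associated links may live on other domains. A minimal sketch of the check, assuming the funding.json convention that any declared URL whose host differs from the manifest's host must be backed by a proof file at `https://<host>/.well-known/funding-manifest-urls` listing the manifest URL (the helper and the example URLs below are hypothetical, for illustration only):

```python
from urllib.parse import urlparse

MANIFEST_URL = "https://github.com/lyogavin/airllm/blob/main/funding.json?raw=true"

def hosts_needing_proof(manifest_url, declared_urls):
    """Return the hosts that would need to serve a wellKnown proof file.

    Assumption: any declared URL whose host differs from the manifest's
    host requires a proof at /.well-known/funding-manifest-urls on that
    host, containing the manifest URL.
    """
    manifest_host = urlparse(manifest_url).hostname
    return sorted({
        urlparse(u).hostname
        for u in declared_urls
        if urlparse(u).hostname != manifest_host
    })

# Hypothetical declared URLs for illustration:
urls = ["https://github.com/lyogavin/airllm", "https://example-project-site.dev"]
print(hosts_needing_proof(MANIFEST_URL, urls))  # → ['example-project-site.dev']
```

URLs on the manifest's own host (here, github.com) need no separate proof; any other host would have to publish the wellKnown file before the portal can verify the association.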