gpt-oss-120b
-
First AI Test Drive with the Framework Desktop

For those of us experimenting with local Large Language Models (LLMs), the hardware ceiling is almost always defined by video memory. My current workstations, equipped with a 3060 Ti (8GB) and a 5070 Ti OC (16GB), are excellent for daily tasks and gaming, but they hit a hard wall when attempting to load anything beyond… Continue reading
-
Unlocking Large Language Models at Home

Large language models have traditionally been something you associate with data centers packed full of GPUs. Models with over 100 billion parameters usually seem far beyond the reach of a home PC, even a well-equipped “gaming” PC. Having enough memory to load such a model is probably the largest roadblock. Thankfully, between my professional work… Continue reading
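To see why memory is the main roadblock, a back-of-the-envelope sketch helps: weight storage scales with parameter count times bits per parameter. The figures below are illustrative assumptions (120B parameters, a few common quantization widths), not measurements of any particular runtime, which also needs headroom for KV cache and activations.

```python
def weight_memory_gb(num_params: float, bits_per_param: int) -> float:
    """Rough memory needed just for the weights, in gigabytes.

    Ignores KV cache, activations, and runtime overhead, so treat
    this as a lower bound on what a model actually requires.
    """
    return num_params * bits_per_param / 8 / 1e9

# Illustrative: a 120B-parameter model at common quantization widths.
params_120b = 120e9
for bits in (16, 8, 4):
    print(f"{bits}-bit: {weight_memory_gb(params_120b, bits):.0f} GB")
# → 16-bit: 240 GB, 8-bit: 120 GB, 4-bit: 60 GB
```

Even at 4-bit quantization, 60 GB of weights dwarfs the 8 GB or 16 GB of VRAM on a typical gaming GPU, which is why unified-memory machines are so appealing for this class of model.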
