AI language model runs on a Windows 98 system with Pentium II and 128MB RAM
(www.tomshardware.com)
This is a most excellent place for technology news and articles.
Imagine how much better it would run on a similar-era version of Red Hat, Gentoo, or BeOS.
They just proved that the hardware was perfectly capable; it's the absolute garbage middle layer, the operating system, that matters for propelling the hardware's potential forward into a usable form.
Many people may not remember, but there were a few Linux distributions around at the time. They certainly would have been able to make better use of the hardware had enough developers worked on them.
but the hardware is not capable. it's running a minuscule custom 260k LLM, and the "claim to fame" is that it wasn't slow. great? we already know tiny models are fast; they're just less accurate and perform worse than larger models. all they did was make an even smaller-than-normal model. this is akin to getting Doom to run on anything with a CPU: while cool and impressive, it doesn't do much for anyone beyond being an exercise in doing something because you can.
With your first sentence, I can say you're wrong. My 1997-era DX4-75 MHz ran Red Hat wonderfully. And SUSE, and Gentoo.
As for the rest? You don't know what an AI/LLM would've looked like on a processor from that era. No one even thought of it then. That doesn't mean it can't run one. It just means you can't imagine it.
Fortunately, I do not lack imagination for what could be possible.
except i'm not wrong. the model they ran is roughly 4 orders of magnitude smaller than even the smallest "mini" models that are generally available; see TinyLlama 1.1B [1] or Phi-3 mini 3.8B [2] for comparison. most "mini" models range from 1 to about 10 billion parameters, which makes running them incredibly inefficient on older devices.
but I can imagine it. in fact, I could have told you it would have needed a significantly smaller model to run at an adequate pace on older hardware. it's not at all a mystery; it's a known factor. I think it's absolutely cool that they did it, but let's not pretend it's more than what it is: a modern version of running Doom on non-standard hardware.
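a quick back-of-envelope sketch of the size gap (not from the article; parameter counts are from [1] and [2], and 4 bytes/param assumes plain fp32 weights, so quantized builds would be smaller):

```python
import math

# Parameter counts: the ~260k custom model from the article vs. two
# commonly cited "mini" models (TinyLlama 1.1B, Phi-3 mini 3.8B).
models = {
    "custom (article)": 260_000,
    "TinyLlama 1.1B":   1_100_000_000,
    "Phi-3 mini 3.8B":  3_800_000_000,
}

BYTES_PER_PARAM = 4  # assumption: fp32 weights, no quantization

for name, params in models.items():
    ratio = params / models["custom (article)"]
    mem_mb = params * BYTES_PER_PARAM / 1024**2
    orders = math.log10(ratio) if ratio > 1 else 0
    print(f"{name:18s} {params:>13,d} params  {ratio:>7,.0f}x  "
          f"~{mem_mb:>8,.0f} MB weights  ({orders:.1f} orders of magnitude)")
```

the point the numbers make: the 260k model's weights are about 1 MB and fit easily in a 128 MB Pentium II box, while even TinyLlama's unquantized weights (~4,200 MB) are tens of times the machine's total RAM, and log10(1.1e9 / 2.6e5) ≈ 3.6, i.e. close to four orders of magnitude.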
[1] https://huggingface.co/TinyLlama/TinyLlama-1.1B-step-50K-105b
[2] https://ollama.com/library/phi3:3.8b-mini-128k-instruct-q5_0
[3] https://www.thirtythreeforty.net/posts/2019/12/my-business-card-runs-linux/
We can all imagine it; it wouldn't run very well, as shown in the article...