Can You Run the Llama 2 LLM on DOS?
Monday, April 21, 2025, 02:34, by Slashdot
He's now sharing his latest project: installing Llama 2 on DOS. 'Conventional wisdom states that running LLMs locally will require computers with high-performance specifications, especially GPUs with lots of VRAM. But is this actually true? Thanks to an open-source llama2.c project [originally created by Andrej Karpathy], I ported it so that vintage machines running DOS can actually run inference with Llama 2 LLM models. Of course there are severe limitations, but the results will surprise you.'

'Everything is open-sourced, with the executable available here,' according to the blog post. (They even addressed an early 'gotcha' with DOS filenames being limited to eight characters.) 'As expected, the more modern the system, the faster the inference speed...' it adds. 'Still, I'm amazed what can still be accomplished with vintage systems.'

Read more of this story at Slashdot.
https://tech.slashdot.org/story/25/04/21/0026255/can-you-run-the-llama-2-llm-on-dos?utm_source=rss1....
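As a side note on that eight-character filename 'gotcha': below is a minimal C sketch of how a long model file name (for example stories15M.bin, a common llama2.c test checkpoint) could be truncated to DOS's 8.3 format before being opened. This is purely illustrative and not code from the actual port; the example file name and the to_dos_83 helper are assumptions.

/* Illustrative sketch only: shorten a long file name to the DOS 8.3 limit
 * (8-character name, 3-character extension, uppercase) before passing it
 * to a llama2.c-style checkpoint loader. */
#include <stdio.h>
#include <string.h>
#include <ctype.h>

/* Convert e.g. "stories15M.bin" -> "STORIES1.BIN"; out must hold 13 bytes. */
static void to_dos_83(const char *in, char *out)
{
    const char *dot = strrchr(in, '.');
    size_t name_len = dot ? (size_t)(dot - in) : strlen(in);
    size_t i, o = 0;

    for (i = 0; i < name_len && o < 8; i++)          /* at most 8 name chars */
        out[o++] = (char)toupper((unsigned char)in[i]);
    if (dot && dot[1] != '\0') {
        out[o++] = '.';
        for (i = 1; dot[i] != '\0' && i <= 3; i++)   /* at most 3 extension chars */
            out[o++] = (char)toupper((unsigned char)dot[i]);
    }
    out[o] = '\0';
}

int main(void)
{
    char dos_name[13];
    to_dos_83("stories15M.bin", dos_name);
    printf("%s\n", dos_name);   /* prints STORIES1.BIN */
    return 0;
}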