That’s one reason why new Surface Pro devices have ARM Snapdragon processors usually found in phones.
There are many reasons to want local AI features.
Giving your laptop the smarts of something on the same continuum as ChatGPT and its contemporaries is definitely exciting!
Even Small LLMs Are Huge
The thing is, these AI “models” are pretty large.
As the model gains parameters and becomes more capable, it needs more room.
Then we have LLMs small enough to run on a smartphone, such as Phi-3-mini, which clocks in at 2.4GB.
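As a rough illustration of why even a “mini” model takes gigabytes, a model file’s size is roughly its parameter count times the storage used per parameter. The figures below (3.8 billion parameters for Phi-3-mini, 4-bit quantized storage) are our assumptions for the sketch, not numbers from this article:

```python
# Back-of-envelope estimate: file size ≈ parameters × bytes per parameter.
# Assumptions: Phi-3-mini has ~3.8 billion parameters, stored with 4-bit
# quantization (half a byte each), ignoring metadata overhead.
params = 3.8e9
bytes_per_param = 0.5  # 4-bit quantization

size_gb = params * bytes_per_param / 1e9
print(f"~{size_gb:.1f} GB")  # → ~1.9 GB
```

That lands in the same ballpark as the 2.4GB download, with the gap covered by extra weights stored at higher precision and file overhead.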
Those NPUs are specialized parallel processors that crunch the numbers for these virtual neural networks with billions of “parameters” at the same time.
That’s why Microsoft has made 16GB of RAM the absolute minimum for Copilot+ PCs.
Just disable or ignore the AI part of the equation and you’ll be just fine.