
Dec 7, 2024

Large language models can be squeezed onto your phone, rather than needing thousands of servers to run, after compression breakthrough

Posted by in categories: information science, mobile phones, robotics/AI

Running massive AI models locally on smartphones or laptops may be possible after a new compression algorithm trims down their size — meaning your data never leaves your device. The catch is that it might drain your battery in an hour.
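To give a rough sense of why compression makes on-device inference feasible, the sketch below shows plain low-bit weight quantization, a common way to shrink model size. This is only an illustrative example, not the specific algorithm the article refers to; the function names (`quantize_block`, `dequantize_block`) and the block size are my own choices for the demo.

```python
import numpy as np

# Illustrative sketch only -- not the algorithm from the article.
# Low-bit quantization maps each high-precision weight to a small signed
# integer plus one shared scale per block, cutting storage to roughly
# bits/16 of an fp16 model (real implementations pack two 4-bit values per byte).

def quantize_block(weights: np.ndarray, bits: int = 4):
    """Quantize a block of float weights to signed integers with one shared scale."""
    qmax = 2 ** (bits - 1) - 1                      # e.g. 7 for 4-bit signed
    scale = max(np.abs(weights).max() / qmax, 1e-12)
    q = np.clip(np.round(weights / scale), -qmax - 1, qmax).astype(np.int8)
    return q, scale

def dequantize_block(q: np.ndarray, scale: float) -> np.ndarray:
    """Reconstruct approximate float weights from the quantized block."""
    return q.astype(np.float32) * scale

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w = rng.normal(scale=0.02, size=4096).astype(np.float32)   # one weight block
    q, s = quantize_block(w, bits=4)
    w_hat = dequantize_block(q, s)
    print("max reconstruction error:", np.abs(w - w_hat).max())
```

Running it shows the trade-off in miniature: storage drops by roughly 4x versus fp16, at the cost of a small reconstruction error in each weight.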
