# AI will turn smartphones into ‘proactive assistants’

Arm outlined this plan after the release of Llama 3.2, Meta’s first open-source models that process both images and text. The chip designer said the models run “seamlessly” on its compute platforms.
The smaller, text-based LLMs — Llama 3.2 1B and 3B — are optimised for Arm-based mobile chips. Consequently, the models can deliver faster user experiences on smartphones. Processing more AI at the edge can also save energy and costs.
These enhancements offer new opportunities to scale. By making LLMs more efficient, Arm can run more AI directly on smartphones. For developers, that could mean faster innovation.
Arm expects a wave of new mobile apps to emerge as a result.
LLMs will perform tasks on your behalf by understanding your location, schedule, and preferences. Routine tasks will be automated and recommendations personalised on-device. Your phone will evolve from a command-and-control tool to a “proactive assistant.”
Arm aims to accelerate this evolution. The UK-based business wants its CPUs to provide “the foundation for AI everywhere.”
Arm has an ambitious timetable for this strategy. By 2025, the chip giant wants more than 100 billion Arm-based devices to be “AI ready.”