AAPL and Apple

Jan 27, 2025

The iPhone maker is instead focusing on how powerful a model it can run directly on its devices, sparing it the need to spend heavily on AI servers.

In that context, DeepSeek has some promising news for Apple. It demonstrated a series of smaller versions of its new R1 model with parameter counts—a measure of an AI model’s size and complexity—ranging from 1.5 billion to 70 billion, as opposed to the original’s 671 billion parameters. So far, iPhones have been shown capable of running models of around 3 billion parameters on the device itself.
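
As a rough illustration of why parameter count is the binding constraint for on-device AI, the short Python sketch below estimates how much memory a model's weights alone would need at a few common precision levels. The bytes-per-parameter figures and the choice of formats are illustrative assumptions, not numbers from Apple or DeepSeek, and the estimates ignore everything beyond the weights themselves.

```python
# Back-of-the-envelope weight-memory estimates for running a language model
# on-device. Illustrative assumptions only: counts model weights and ignores
# activations, the KV cache, and runtime overhead.

BYTES_PER_PARAM = {
    "fp16": 2.0,   # 16-bit floating-point weights
    "int8": 1.0,   # 8-bit quantized weights
    "int4": 0.5,   # 4-bit quantized weights
}

# Parameter counts mentioned in the article, in billions.
MODEL_SIZES_B = [1.5, 3, 70, 671]

for params_b in MODEL_SIZES_B:
    # billions of parameters x bytes per parameter = gigabytes of weights
    estimates = ", ".join(
        f"{fmt}: {params_b * bytes_per:.1f} GB"
        for fmt, bytes_per in BYTES_PER_PARAM.items()
    )
    print(f"{params_b:>6}B params -> {estimates}")
```

At 4-bit precision, a roughly 3-billion-parameter model needs on the order of 1.5 GB just for its weights, which is plausible for a phone, while the 70-billion-parameter variant would need around 35 GB and the full 671-billion-parameter model far more. That gap is why the larger models remain server territory.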

The more AI computing that can be done at the “edge,” meaning directly on a user’s device rather than by connecting to a server, the better it should be for Apple.