Or how the most powerful pocket computer ever made is mostly used to check the weather
The iPhone has become an extraordinary object. Not extraordinary in the marketing sense of “thinner glass” or “more cinematic bokeh”, but in the deeply boring, engineer-approved sense of absurdly efficient silicon, terrifying performance per watt, and a level of integration that would have been considered science fiction a decade ago. The irony is that, for all this power, most of the time my iPhone 17 Pro is busy doing things a €150 phone could already do perfectly well: notifications, messages, maps, and the occasional weather check to confirm that yes, it is still raining.
This is not a hardware complaint. On the contrary, Apple Silicon is a small miracle. A fanless MacBook Air can now run local LLMs, letting me iterate quickly on ideas and do real computational work that previously required a loud desktop (or a cloud bill). The iPhone Pro carries a chip that could probably guide a small satellite (or not… but hey, it’s powerful), yet its primary “Pro” identity remains firmly attached to its camera. An excellent camera, to be fair, but still a camera. In my own case, when I actually do sports, fly, ski, or move at any speed that could damage a thin glass slab, the iPhone stays safely in a pocket while an Insta360, duct-taped to a helmet, does the real work. So much for “Pro”.
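To make that concrete, here is roughly what “local LLMs on a fanless laptop” looks like in practice today. This is a minimal sketch using the llama-cpp-python bindings; the model path and generation parameters are placeholders, not a recommendation.

```python
# Minimal local LLM inference on a laptop-class chip.
# Requires: pip install llama-cpp-python, plus a quantized GGUF model
# downloaded locally (the path below is a placeholder).
from llama_cpp import Llama

llm = Llama(
    model_path="./models/small-model-q4.gguf",  # placeholder path
    n_ctx=2048,       # context window
    n_gpu_layers=-1,  # offload all layers to the GPU (Metal on Apple Silicon)
)

out = llm(
    "Explain performance per watt in one sentence.",
    max_tokens=64,
    temperature=0.7,
)
print(out["choices"][0]["text"])
```

No fan spins up, no cloud account is billed, and the whole loop from idea to output lives on the machine in front of you. That is the baseline the rest of this piece measures the iPhone against.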
This is where the thought experiment begins.
Imagine, for a moment, a device that does not exist and almost certainly never will: the iPhone DEV. No triple-lens camera island. No cinematic modes. No ProRes. Possibly no camera at all, or just a symbolic one, included for decorum. What it would keep, however, is the same CPU, GPU, and NPU as the iPhone Pro. The same memory. The same absurd performance per watt. The difference would not be in the hardware, but in what you are allowed to do with it.
The iPhone DEV would greet you with a warning screen so aggressive it would make nuclear power plants look casual. Red text. Multiple confirmations. Legal language implying that the cosmos itself may collapse if you proceed. Then, once you had accepted it all, it would grant something heretical: the ability to treat the iPhone like a computer. A terminal. Local execution. Python. Local ML experiments. Explicitly unsupported, completely optional, and very clearly not meant for everyone. A device for people who know what they are doing and accept responsibility for it.
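What would that heresy actually look like? Nothing exotic. Since the device does not exist, this is purely illustrative: the kind of thirty-second experiment an iPhone DEV would make trivial, run from a local Python prompt.

```python
# Purely illustrative: a throwaway benchmark of whatever silicon
# happens to be in your pocket. Times a large matrix multiply and
# estimates sustained floating-point throughput.
import time
import numpy as np

N = 2048
a = np.random.rand(N, N).astype(np.float32)
b = np.random.rand(N, N).astype(np.float32)

start = time.perf_counter()
c = a @ b
elapsed = time.perf_counter() - start

# Multiplying two N x N matrices costs roughly 2 * N^3 floating-point ops.
gflops = (2 * N**3) / elapsed / 1e9
print(f"{elapsed:.3f} s, ~{gflops:.1f} GFLOP/s")
```

Trivial on a Mac; on the iPhone, anything like it means working around the platform rather than with it.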
This is, of course, a fully utopian idea. Not slightly utopian. Deeply, recursively, multi-dimensionally utopian. Nothing about this is blocked by technology, and security is not the real issue: Apple already sandboxes everything, and does so better than anyone else. The core constraint is philosophical and economic. The iPhone is the crown jewel of Apple’s ecosystem, the Queen on the chessboard. Open it too much, and the App Store toll booth starts to look less inevitable. Once people can compute freely, even a little, the question “why do I need permission?” becomes very expensive.
Apple already solved this tension on the Mac. macOS offers freedom, with warnings, notarization, and a gentle but persistent suggestion to consider using the App Store instead. It works because the Mac is not the center of Apple’s revenue universe. The iPhone is. That is why iPadOS cautiously inches toward macOS, while iOS remains locked with religious devotion. This is not an accident or a failure; it is an optimization.
And yet, the mismatch grows. We are entering an age in which local AI, edge computing, and private on-device inference actually matter. The hardware is ready, more than ready. What lags is not software capability but permission. The keynote slides that list faster chips and prettier effects feel increasingly like the old console wars, when “more bits” was the headline long after it stopped meaning anything. Today, the interesting question is not how powerful the device is, but how much possibility it gives its user.
The iPhone DEV will never exist. Of that I am quite certain. It would sell poorly, confuse the narrative, and threaten a business model that works extraordinarily well. But imagining it is useful. It exposes the real bottleneck in modern mobile computing, and that bottleneck has nothing to do with transistors, batteries, or thermals. It is about trust, control, and what we believe users should be allowed to do with the machines they carry in their pockets.
Until then, I will continue to carry a supercomputer disguised as a camera, use an Android phone when I need an embedded computer in one of my demonstrators, run real experiments on my MacBook Air, and occasionally smile at the irrationality of it all. The future of portable computing is already here. It is just carefully locked behind very good marketing.