(Slowly) Running Microsoft’s Phi-3-Mini on iPhone 11

Emanuele
3 min read · Apr 23, 2024

Running a Large Language Model is computationally expensive and requires a lot of memory, but thanks to ongoing research and optimizations these limitations are starting to fade, to the point where we can run an LLM on a device that is five years old.
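One of the optimizations that makes this possible is weight quantization, which shrinks the model enough to fit in a phone's memory. As a rough sketch of the idea (not necessarily the exact pipeline used for the iPhone build in this article), here is how a 4-bit quantized Phi-3-mini GGUF file can be loaded and queried with the llama-cpp-python bindings on a desktop; the model path is a placeholder you would point at your own download.

```python
from llama_cpp import Llama

# Load a 4-bit quantized Phi-3-mini checkpoint (GGUF format).
# The file name is an assumption: replace it with the path to the
# quantized model you downloaded (e.g. from Hugging Face).
llm = Llama(
    model_path="./Phi-3-mini-4k-instruct-q4.gguf",
    n_ctx=2048,  # context window; smaller values use less RAM
)

# Run a short completion to confirm the model works.
output = llm("Explain in one sentence why small language models matter:", max_tokens=64)
print(output["choices"][0]["text"])
```

On an iPhone the same quantized weights are typically driven through a llama.cpp-based app rather than Python, but the memory math is the same: at 4 bits per weight, a ~3.8B-parameter model fits in roughly 2 GB.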

Size and quality

Microsoft’s Phi models are an example of these advancements, which Microsoft defines as…
