Microsoft strikes back — Phi-3 may change the entire game.

A Highly Capable Language Model Locally on Your Phone! The GGUF binaries are already available, so let’s do it!

Fabio Matricardi
The AI Explorer

--

phi-3-mini is out!

This is exactly the title of the official paper that introduces Microsoft’s latest creation to the world.

Just one week after the shocking release of the Llama-3 series, Microsoft announced a family of so-called Small Language Models designed to run on mobile devices.

4-bit quantized phi-3-mini running natively on an iPhone with the A16 Bionic chip, generating over 12 tokens per second. Image from the official paper.

Is it going to work though? Can we test it?

Stick with the article: we’ll unveil the details and the easiest way to run it right now on your laptop, too.

Phi-3-mini

We introduce phi-3-mini, a 3.8 billion parameter language model trained on 3.3 trillion tokens, whose overall performance, as measured by both academic benchmarks and internal testing, rivals that of models such as Mixtral 8x7B and GPT-3.5 (e.g., phi-3-mini achieves 69% on MMLU and 8.38 on MT-bench), despite being small enough to be deployed on a phone.

--

Fabio Matricardi
The AI Explorer

Passionate educator and curious industrial automation engineer. Learning leadership and how to build my own AI. Contact me at fabio.matricardi@gmail.com