Phi-3.5, Microsoft's small language model (SLM) released in August 2024, is an open source, state-of-the-art model. Phi-3.5 comes in two variants: Phi-3.5 Mini and Phi-3.5-MoE.
Compared to other SLMs released in 2024, Phi-3.5 Mini is by far the smallest. Despite its size, it still scores above average on the MMLU benchmark. Phi-3.5-MoE is even stronger on the benchmark, outperforming much larger models such as Mistral.
Phi-3.5 is licensed under an MIT license.
You can use Roboflow Inference to deploy a Phi-3.5 API on your own hardware. You can deploy the model on CPU devices (e.g., Raspberry Pi, AI PCs) and GPU devices (e.g., NVIDIA Jetson, NVIDIA T4).
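If you want to sanity-check the model before standing up a dedicated API, the snippet below is a minimal sketch of running Phi-3.5 Mini locally with Hugging Face Transformers rather than Roboflow Inference. It assumes the microsoft/Phi-3.5-mini-instruct checkpoint and a machine with enough memory (a GPU is recommended); adjust the model ID, prompt, and generation parameters to suit your setup.

```python
# Minimal local sketch for trying Phi-3.5 Mini with Hugging Face Transformers.
# Assumes the "microsoft/Phi-3.5-mini-instruct" checkpoint and that the
# transformers and accelerate packages are installed
# (pip install transformers accelerate).
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

model_id = "microsoft/Phi-3.5-mini-instruct"

# Load the model and tokenizer; device_map="auto" places weights on a GPU if one is available.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",
    device_map="auto",
    trust_remote_code=True,
)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Build a text-generation pipeline that accepts chat-style messages.
generator = pipeline("text-generation", model=model, tokenizer=tokenizer)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "In one paragraph, explain what a mixture-of-experts model is."},
]

# Generate a deterministic response and print only the newly generated text.
output = generator(
    messages,
    max_new_tokens=256,
    do_sample=False,
    return_full_text=False,
)
print(output[0]["generated_text"])
```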
Below are instructions on how to deploy your own model API.