
What is Phi-3-mini?

26-04-2024

11:30 AM

1 min read

Overview:

Recently, Microsoft unveiled the latest version of its ‘lightweight’ AI model, the Phi-3-mini.

About Phi-3-mini: 

  • It is believed to be the first of three small models that Microsoft is planning to release.
  • It has reportedly outperformed models of the same size and the next size up across a variety of benchmarks, in areas like language, reasoning, coding, and maths.
  • It is the first model in its class to support a context window of up to 128K tokens, with little impact on quality.
  • The context window is the amount of conversation that an AI model can read and write at any given time, and it is measured in tokens.
  • It is a 3.8-billion-parameter language model, available on AI development platforms such as Microsoft Azure AI Studio, Hugging Face, and Ollama (a brief loading sketch follows this list).
  • Phi-3-mini is available in two variants: one with a 4K-token context length and another with a 128K-token context length.
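
Because the model is published on Hugging Face, the short Python sketch below shows one way it could be loaded and queried with the transformers library. The model ID microsoft/Phi-3-mini-4k-instruct, the prompt, and the generation settings are illustrative assumptions, not details from the article.

from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed Hugging Face model ID for the 4K-context variant.
model_id = "microsoft/Phi-3-mini-4k-instruct"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# Older transformers releases may additionally need trust_remote_code=True here.
model = AutoModelForCausalLM.from_pretrained(model_id)

# Format a single chat turn the way an instruct-tuned model expects.
messages = [{"role": "user", "content": "Explain the context window of a language model in one sentence."}]
input_ids = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt")

# Generate a short reply and print only the newly generated tokens.
output_ids = model.generate(input_ids, max_new_tokens=64)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))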

How is Phi-3-mini different from Large Language Models (LLMs)?

  • Phi-3-mini is a Small Language Model (SLM).
  • SLMs are more streamlined versions of large language models. Compared to LLMs, they are more cost-effective to develop and operate, and they perform better on smaller devices such as laptops and smartphones.
  • SLMs are well suited to resource-constrained environments, including on-device and offline inference scenarios. They are also a good fit where fast response times are critical, such as chatbots or virtual assistants.
  • SLMs can be customised for specific tasks and achieve accuracy and efficiency in doing them. Most SLMs undergo targeted training, demanding considerably less computing power and energy compared to LLMs.
  • SLMs also differ in inference speed and latency: their compact size allows for quicker processing, and their lower cost makes them appealing to smaller organisations and research groups.

Q1: What are Small language models?

Small language models are designed to perform well on simpler tasks. They are more accessible and easier to use for organisations with limited resources, and they can be more easily fine-tuned to meet specific needs.

Source: Microsoft unveils Phi-3-mini, its smallest AI model yet: How it compares to bigger models