Phi-2 is a 2.7-billion-parameter Transformer-based language model developed by Microsoft Research and trained on 1.4 trillion tokens of curated, "textbook-quality" data. It performs strongly on reasoning, language understanding, mathematics, and code generation tasks, often matching or outperforming much larger models; its efficiency stems from a compact architecture and a training approach centered on high-quality educational content.