AI models are becoming increasingly large, with some large language models (LLMs) comprising more than 100 billion parameters. This growth has drawbacks: the models become more unwieldy and energy-hungry. To address these issues, AI developers have begun exploring smaller models and datasets. Microsoft researchers recently released a technical report on a new language model, phi-1.5, which has just 1.3 billion parameters yet has demonstrated abilities comparable to those of models five to ten times its size.
