

Distilling AI Models Costs Less, Raises Revenue Questions

Distillation lets developers access AI model capabilities at a fraction of the cost, making it practical to run models quickly on devices such as laptops and smartphones. The technique uses a large "teacher" LLM to train smaller "student" systems, and companies including OpenAI and IBM Research have adopted it to produce cheaper models. Experts caution, however, that distilled models give up some capability compared with their larger teachers.
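Conceptually, distillation trains the small student model to imitate the teacher's output distribution rather than learning only from hard labels. The sketch below illustrates the basic recipe with toy PyTorch models; the architectures, temperature, and loss weighting are illustrative assumptions, not any company's production setup.

```python
# Minimal knowledge-distillation sketch (illustrative only).
# A larger "teacher" network's softened outputs supervise a much
# smaller "student"; all model sizes and hyperparameters here are
# hypothetical choices for demonstration.
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Toy models: the teacher is wide, the student is small enough to
# run comfortably on a laptop or phone.
teacher = nn.Sequential(nn.Linear(32, 512), nn.ReLU(), nn.Linear(512, 10))
student = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))

optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
temperature = 2.0   # softens the teacher's distribution
alpha = 0.5         # balance between distillation loss and hard-label loss

# Synthetic stand-in data; in practice this would be real training data.
x = torch.randn(256, 32)
y = torch.randint(0, 10, (256,))

teacher.eval()
for step in range(200):
    with torch.no_grad():
        teacher_logits = teacher(x)          # teacher stays frozen
    student_logits = student(x)

    # KL divergence between softened teacher and student distributions.
    distill_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2

    # Standard cross-entropy against the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, y)

    loss = alpha * distill_loss + (1 - alpha) * hard_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

After training, only the small student needs to ship to the device, which is where the cost and latency savings come from.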
