What is AI Distillation?

Distillation, also known as model or knowledge distillation, is a process in which knowledge is transferred from a large, complex AI ‘teacher’ model to a smaller, more efficient ‘student’ model. Doing so lets the student approximate the teacher's performance at a fraction of the computational cost.
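To make the mechanism concrete, here is a minimal sketch of the classic approach, assuming PyTorch and a generic classification task; the `student_logits` and `teacher_logits` come from hypothetical models, and this is an illustration of the technique, not any company's actual training code.

```python
# A minimal knowledge-distillation loss, assuming PyTorch and a generic
# classification setup; the teacher and student are hypothetical models,
# not any specific production system.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    # Soft loss: push the student's softened output distribution toward
    # the teacher's. The temperature smooths both distributions, and the
    # T**2 factor keeps gradient magnitudes comparable across settings.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    # Hard loss: ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    # Blend the two; alpha controls how much the student imitates the
    # teacher versus fitting the labels directly.
    return alpha * soft + (1 - alpha) * hard
```

In a typical training loop the teacher runs frozen (in eval mode, under `torch.no_grad()`), so only the student's much smaller set of weights is updated.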
David Sacks, U.S. President Donald Trump's AI and crypto czar, says OpenAI has evidence that the Chinese company DeepSeek used a technique called "distillation" to build a rival model.
Since Chinese artificial intelligence (AI) start-up DeepSeek rattled Silicon Valley and Wall Street with its cost-effective models, the company has been accused of data theft through a practice known as distillation.
OpenAI believes outputs from its artificial intelligence models may have been used by Chinese startup DeepSeek to train its new open-source model, which impressed many observers and shook U.S. financial markets.
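As a rough illustration of what "training on another model's outputs" can mean in practice, the sketch below collects a teacher model's text responses as supervised fine-tuning targets for a student. The `query_teacher` function is a hypothetical placeholder, not any real provider's API client.

```python
# A hedged sketch of output-based ("black-box") distillation: record a
# teacher model's text responses and store them as fine-tuning targets.
# `query_teacher` is a hypothetical placeholder, not a real API client.
import json

def query_teacher(prompt: str) -> str:
    # Placeholder: in practice this would call a large teacher model or
    # a hosted API and return its generated text.
    return f"<teacher response to: {prompt}>"

def build_distillation_set(prompts, out_path="distill.jsonl"):
    # Write (prompt, completion) pairs in the JSONL layout commonly
    # used for supervised fine-tuning of a smaller student model.
    with open(out_path, "w", encoding="utf-8") as f:
        for prompt in prompts:
            pair = {"prompt": prompt, "completion": query_teacher(prompt)}
            f.write(json.dumps(pair, ensure_ascii=False) + "\n")

if __name__ == "__main__":
    build_distillation_set(["What is knowledge distillation?"])
```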
Pierre Bienaimé: Welcome to Tech News Briefing. It's Thursday, February 6th. I'm Pierre Bienaimé.
Artificial intelligence companies like OpenAI, Microsoft (MSFT), and Meta (META) are using a technique called ‘distillation’ to make cheaper and more efficient AI models.
Chinese artificial intelligence lab DeepSeek roiled markets in January, setting off a massive tech and semiconductor selloff after unveiling AI models that it said were cheaper and more efficient than those of its U.S. rivals.
This is Atlantic Intelligence, a newsletter in which our writers help you wrap your mind around artificial intelligence and a new machine age. If DeepSeek did indeed rip off OpenAI, it ...