NVIDIA and the National Science Foundation are joining forces to support the development of several open AI models designed for use in science. In an announcement Thursday, NSF said it would ...
The original version of this story appeared in Quanta Magazine. Large language models work well because they’re so large. The latest models from OpenAI, Meta, and DeepSeek use hundreds of billions of ...
The artificial intelligence (AI) race is heating up: the number and quality of high-performing Chinese AI models is rising to challenge the US lead, and the performance edge between top models is ...
When Liquid AI, a startup founded by MIT computer scientists in 2023, introduced its Liquid Foundation Models series 2 (LFM2) in July 2025, the pitch was straightforward: Deliver the fastest on-device ...
Enterprises that have been juggling separate models for reasoning, multimodal tasks, and agentic coding may be able to simplify their stack: Mistral’s new Small 4 brings all three into a single ...
OpenAI releases GPT-5.4 mini and nano compact models
OpenAI has released two compact models in its GPT‑5.4 family, branded GPT‑5.4 mini and GPT‑5.4 nano, and is promoting them as small but capable options for developers and device makers. The company ...
Small language models shine for domain-specific or specialized use cases, while making it easier for enterprises to balance performance, cost, and security concerns. Since ChatGPT arrived in late 2022 ...
Large language models (LLMs) use vast amounts of data and computing power to create answers to queries that look and sometimes even feel “human”. LLMs can also generate music, images or video, write ...
As artificial intelligence (AI) tools shake up the scientific workflow, Sam Rodriques dreams of a more systemic transformation. His start-up company, FutureHouse in San Francisco, California, aims to ...