Liquid AI Announces Generative AI Liquid Foundation Models With Smaller Memory Footprint

Liquid AI, a Massachusetts-based artificial intelligence (AI) startup, announced its first generative AI models, which are not built on the existing transformer architecture. Dubbed Liquid Foundation Models (LFMs), the new architecture moves away from Generative Pre-trained Transformers (GPTs), which form the foundation for popular AI models such as OpenAI's GPT series, Gemini, Copilot, and more. The startup claims that the new AI models were built from first principles and that they outperform large language models (LLMs) in the comparable size bracket.

Liquid AI’s New Liquid Foundation Models

The startup was co-founded in 2023 by researchers at the Massachusetts Institute of Technology (MIT)'s Computer Science and Artificial Intelligence Laboratory (CSAIL), with the aim of building new architectures for AI models that can match or surpass GPT-based models.

The new LFMs are available in three parameter sizes: 1.3B, 3.1B, and 40.3B. The largest is a Mixture of Experts (MoE) model, meaning it combines several smaller expert models and is aimed at tackling more complex tasks. The LFMs are now available on the company's Liquid Playground, Lambda (Chat UI and API), and Perplexity Labs, and will soon be added to Cerebras Inference. Further, the AI models are being optimised for Nvidia, AMD, Qualcomm, Cerebras, and Apple hardware, the company stated.

LFMs also differ significantly from GPT technology. The company highlighted that these models were built from first principles. First-principles thinking is essentially a problem-solving approach in which a complex technology is broken down to its fundamentals and then rebuilt from there.

According to the startup, the new AI models are built on what it calls computational units: a redesign of the token system, which the company refers to as the Liquid system. These units contain condensed information with a focus on maximising knowledge capacity and reasoning. The startup claims this new design helps reduce memory costs during inference and increases performance across video, audio, text, time series, and signal data.

The company further claims that an advantage of the Liquid-based AI models is that their architecture can be automatically optimised for a specific platform based on its requirements and inference cache size.

While the startup's claims are bold, the models' performance and efficiency can only be gauged once developers and enterprises begin using them in their AI workflows. The startup did not reveal the sources of its training datasets or any safety measures built into the AI models.
