
Latest Breaking News On - Cerebras - Page 3 : comparemela.com

Open-Source GPT Model Trained at Wafer Scale

Source: eetimes.com

Cerebras releases 7 new GPT models trained on CS-2 wafer-scale systems

Cerebras Systems has trained a series of seven GPT-based large language models (LLMs) and is releasing them for open use by the research community. According to the company, this is the first time LLMs of up to 13 billion parameters have been trained on non-GPU AI systems and then shared, with models, weights, and training recipe released under the industry-standard Apache 2.0 license. All seven models were trained on the 16 CS-2 systems in the Cerebras Andromeda AI supercomputer.
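The snippet above mentions a shared "training recipe" for the seven model sizes. Cerebras has publicly described training these models compute-optimally at roughly 20 training tokens per parameter (the Chinchilla ratio). As a hedged sketch of what that ratio implies, the model sizes and the 20:1 ratio below are assumptions drawn from Cerebras's own release description, not stated on this page:

```python
# Sketch: estimated compute-optimal training tokens for the seven
# Cerebras-GPT model sizes, assuming a Chinchilla-style ratio of
# ~20 training tokens per parameter (assumption, per Cerebras's
# public description of the release).
SIZES = {
    "111M": 111e6, "256M": 256e6, "590M": 590e6,
    "1.3B": 1.3e9, "2.7B": 2.7e9, "6.7B": 6.7e9, "13B": 13e9,
}
TOKENS_PER_PARAM = 20  # assumed Chinchilla-optimal ratio

def optimal_tokens(params: float) -> float:
    """Return the approximate compute-optimal token budget for a model."""
    return TOKENS_PER_PARAM * params

for name, params in SIZES.items():
    print(f"{name} params -> ~{optimal_tokens(params) / 1e9:.0f}B training tokens")
```

Under these assumptions, the largest 13B-parameter model would call for a training budget of roughly 260 billion tokens.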

Cerebras-GPT vs LLaMA AI Model Comparison

Comparing the real-world performance of GPT-like AI models with public weights

© 2024 Vimarsana