Cerebras: AI Chip Startup Releases Open-Source ChatGPT-Style Models

Cerebras Systems, an AI chip startup based in Sunnyvale, California, has released open-source ChatGPT-like models for free use by businesses and academic researchers. The move is intended to encourage closer collaboration between industry and academia.

The Silicon Valley company unveiled seven models trained on its artificial intelligence supercomputer, Andromeda, ranging from a 111 million parameter language model up to a 13 billion parameter version.

Andrew Feldman, the founder and CEO of Cerebras, says:

“There is a big movement to close what has been open sourced in AI…it’s not surprising as there’s now huge money in it,”

“The excitement in the community, the progress we’ve made, has been in large part because it’s been so open.”

Models with more parameters are generally able to perform more complex generative tasks.

ChatGPT, the chatbot OpenAI released in late 2022, has been a major draw for investment in AI technology thanks to its large underlying model, reported at around 175 billion parameters, and its ability to generate poetry and assist with research.

The Cerebras models span a range of sizes and uses: the smaller ones can run on phones or smart speakers, while the larger ones require PCs or servers. The largest models are best suited to complex tasks such as summarizing long passages. As an illustration of how a smaller checkpoint might be tried out, see the sketch below.
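The following is a minimal sketch of loading one of the smaller released models with the Hugging Face transformers library and generating a short completion. The repo ID "cerebras/Cerebras-GPT-111M" is an assumption for illustration, not something stated in the article; check the hub for the actual model names Cerebras published.

```python
# Sketch: text generation with a small released checkpoint.
# The repo ID below is an assumption; substitute the actual published name.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "cerebras/Cerebras-GPT-111M"  # assumed repo ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Open-source AI models matter because"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```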

Karl Freund, a chip consultant at Cambrian AI, cautions that bigger is not always better; in his view, model size is not the most important factor in how well a model performs.

Freund says:

“There’s been some interesting papers published that show that (a smaller model) can be accurate if you train it more,”

“So there’s a trade off between bigger and better trained.”

Feldman said that thanks to Cerebras's architecture, built around an AI training chip the size of a dinner plate, its largest model took only about a week to train, a process that typically takes months.

Most AI models today are trained on Nvidia Corp's (NVDA.O) chips, and startups like Cerebras aim to win a share of that market.

Feldman said that models trained on Cerebras machines can also be used on Nvidia systems for further training or customization, giving users the freedom to adapt and refine them, as in the sketch below.
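As a rough illustration of that workflow, the sketch below continues training a released checkpoint on an Nvidia GPU using PyTorch and transformers. The repo ID and the two-sentence "corpus" are placeholders for illustration only; they are assumptions, not details from the article.

```python
# Sketch: continuing training (fine-tuning) of a released checkpoint on an Nvidia GPU.
# The repo ID and training texts are placeholders; use a real model name and corpus.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "cerebras/Cerebras-GPT-111M"  # assumed repo ID; pick a larger model if resources allow
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

device = "cuda" if torch.cuda.is_available() else "cpu"
model.to(device)
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)

# Placeholder fine-tuning data: replace with a domain-specific corpus.
texts = [
    "Example sentence from a custom corpus.",
    "Another domain-specific sentence.",
]

model.train()
for text in texts:
    batch = tokenizer(text, return_tensors="pt").to(device)
    outputs = model(**batch, labels=batch["input_ids"])  # causal LM loss
    outputs.loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```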

This move by Cerebras highlights the importance of open-source development in AI and the potential benefits it can bring. As more companies and researchers share their technology and knowledge, we can expect faster progress in the field, leading to new applications and advancements that benefit everyone.

Source: Reuters
