Cerebras
Cerebras is an AI company building computer systems designed to accelerate AI workflows.

Website
cerebras.net
Is a
Company
Organization

Company attributes

Industry
Technology
Artificial Intelligence (AI)
High-performance computing
Electronic design automation
Microprocessor
Hardware
Location
Toronto
Bangalore
Sunnyvale, California
San Diego
B2X
B2B
CEO
Andrew Feldman
Founder
Gary Lauterbach
Andrew Feldman
Jean-Philippe Fricker
Sean Lie
Michael James
Pitchbook URL
pitchbook.com/profiles...163733-59
Legal Name
Cerebras Systems Inc.
Legal classification
Corporation
Number of Employees (Ranges)
201 – 5000
Email Address
info@cerebras.net
pr@zmcommunications.com
Full Address
1237 E. Arques Ave, Sunnyvale, CA 94085
10188 Telesis Ct, San Diego, CA 92121
150 King St. West, Suite 701, Toronto, ON M5H 1J9
Prestige Tech Park, 6th Floor, Valence Block, Marthahalli-Sarjapur Outer Ring Road, Bangalore 560 103
Investors
Group 42
Sequoia Capital
In-Q-Tel
Sam Altman
Alpha Wave Global
Benchmark (venture capital)
Empede Capital
Foundation Capital
...
Founded Date
2016
Total Funding Amount (USD)
722,000,000
Latest Funding Round Date
November 10, 2021
Competitors
Semron
Rain AI
CTO
Gary Lauterbach
Latest Funding Type
Series F
Latest Postmoney Valuation
4,000,000,000
Patents Assigned (Count)
24
Hugging Face ID
cerebras
Headquarters
Sunnyvale, California

Other attributes

Blog
cerebras.net/blogs/
Company Operating Status
Active
Contact Page URL
cerebras.net/contact/
Postmoney Valuation
4,300,000,000
Latest Funding Round Amount (USD)
250,000,000
Wikidata ID
Q107203509
Overview

Cerebras is an AI company building computer systems designed to accelerate AI workflows. Its team of computer architects, computer scientists, deep learning researchers, business experts, and engineers aims to build a new class of computers for AI work. Cerebras states that its wafer-scale clusters can train AI models in record time; its HPC (high-performance computing) accelerator, the CS-2, has 850,000 cores and 40 GB of on-chip memory.

Cerebras's technology has applications in multiple AI fields, such as NLP (natural language processing), CV (computer vision), and HPC, and across a range of industries, including health & pharma, energy, government, scientific computing, financial services, and web & social media. The Cerebras platform has trained a number of models, from multilingual LLMs to healthcare chatbots, and the company helps customers train their own foundation models or fine-tune open-source models such as Llama 2. In March 2023, Cerebras open-sourced seven GPT models, known as Cerebras-GPT, the first family of models trained using its CS-2 hardware.
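
As an illustration of how the open-sourced Cerebras-GPT checkpoints can be used, the sketch below loads one of the smaller models from the company's Hugging Face organization (ID "cerebras", listed above) with the Hugging Face transformers library. The specific model name cerebras/Cerebras-GPT-111M and the generation settings are assumptions made for this example, not a description of Cerebras's own tooling.

# Minimal sketch (Python): load a Cerebras-GPT checkpoint from the Hugging Face
# Hub and generate a short completion. Assumes the transformers and torch
# packages are installed; the model ID is an assumption based on the "cerebras"
# Hugging Face organization listed in the attributes above.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "cerebras/Cerebras-GPT-111M"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("Wafer-scale computing is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))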

Cerebras was founded in 2016 by Andrew Feldman (CEO), Jean-Philippe Fricker (Chief System Architect), Michael James (Chief Architect, Advanced Technologies), Gary Lauterbach (CTO), and Sean Lie (Chief Hardware Architect). Feldman previously cofounded and was CEO of SeaMicro, where he worked with the other four Cerebras cofounders. SeaMicro developed energy-efficient, high-bandwidth microservers and was acquired by AMD in 2012 for $357 million. The company is headquartered in Sunnyvale, California, with locations in San Diego, Toronto, and Bangalore.

Cerebras has raised over $700 million in funding from Alpha Wave, Benchmark, Foundation Capital, Eclipse, Coatue, VY Capital, Altimeter, and angel investors including Fred Weber, Ilya Sutskever, Sam Altman, Andy Bechtolsheim, Greg Brockman, Adam D'Angelo, Mark Leslie, Nick McKeown, David "Dadi" Perlmutter, Syed Atiq Raza, Jeff Rothschild, Pradeep Sindhu, and Lip-Bu Tan. This total includes a $250 million Series F round from November 2021, which valued the company at around $4 billion.

Products
WSE

First announced in August 2019, the Cerebras Wafer-Scale Engine (WSE) is built from a single silicon wafer supplied by TSMC. It measures 46,225 mm² and contains 1.2 trillion transistors, 400,000 cores, and 18 gigabytes of on-chip memory (SRAM). The chip's size is meant to work around a limitation of AI systems built from multiple graphics processing units (GPUs): such systems lose time to communication bottlenecks between chips, whereas the WSE's 400,000 cores communicate on-wafer, reducing AI training time. Speaking to Fortune, CEO and cofounder Andrew Feldman said of the WSE:

Every time there has been a shift in the computing workload, the underlying machine has had to change.

Before Cerebras's chip, manufacturers were reluctant to build chips this large. Silicon wafers often have imperfections, so flawed dies are binned as lower-grade chips or discarded; Cerebras works around defects by building redundant circuits into the design. The company uses TSMC's 16 nm node and a 300 mm wafer, out of which it cuts the largest possible square. From that, it makes its 400,000 sparse linear algebra (SLA) cores, which are designed for AI deep learning workloads. The 18 GB of on-chip SRAM delivers an aggregate memory bandwidth of 9 petabytes per second.
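
To put these aggregate figures in per-core terms, the short sketch below simply divides the published totals by the core count. Assuming an even split across cores is purely an illustration device; the inputs are only the numbers quoted above.

# Back-of-the-envelope sketch (Python): divide the WSE's published aggregate
# figures evenly across its 400,000 cores. The even split is an assumption
# made for illustration.
cores = 400_000
sram_bytes = 18e9          # 18 GB of on-chip SRAM
bandwidth_bytes_s = 9e15   # 9 PB/s aggregate SRAM bandwidth
area_mm2 = 46_225          # total die area in mm^2
transistors = 1.2e12

print(f"SRAM per core:        {sram_bytes / cores / 1e3:.0f} KB")          # ~45 KB
print(f"Bandwidth per core:   {bandwidth_bytes_s / cores / 1e9:.1f} GB/s")  # ~22.5 GB/s
print(f"Area per core:        {area_mm2 / cores:.3f} mm^2")                 # ~0.116 mm^2
print(f"Transistors per core: {transistors / cores / 1e6:.0f} million")     # ~3 million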

Together with TSMC, Cerebras developed a technique for laying thousands of communication links across the scribe lines between dies. As a result, the wafer behaves not as 84 separate processing tiles but as a single chip with 400,000 cores. Combined with the built-in redundancy, this technique means Cerebras might reach 100% yield in its manufacturing process.

A chip this large also raises issues of thermal expansion, connectivity, and cooling. Cerebras developed a new connector that delivers power through the chip's PCB rather than across it. To cool the chip and prevent the thermal throttling or thermal expansion that could crack it, the company developed a water-cooling solution that pumps water onto a copper cold plate with a micro-fin array to pull heat off the chip; the heated water is then air-cooled in a radiator.

WSE-2

Unveiled in April 2021, the second-generation WSE (WSE-2) powers Cerebras's CS-2 system. The company states it is the largest chip ever built, consisting of 2.6 trillion transistors, 850,000 cores, and 40 gigabytes of on-wafer memory.

CS-2

The Cerebras CS-2 is a purpose-built deep learning system for fast and scalable AI workloads. Cerebras states that the CS-2 significantly reduces model training time and inference latency. It is powered by the Wafer-Scale Engine 2 (WSE-2), with 850,000 compute cores, 40 GB of on-chip SRAM, 20 PB/s of memory bandwidth, and 220 Pb/s of interconnect bandwidth, housed in purpose-built packaging that provides cooling and power delivery.
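
The generational step from the original WSE to the WSE-2 inside the CS-2 can be summarized from the figures already quoted in this article; the sketch below only computes the ratios and is not an official comparison.

# Sketch (Python) comparing the first-generation WSE with the WSE-2 that
# powers the CS-2, using only figures quoted in this article.
wse  = {"transistors": 1.2e12, "cores": 400_000, "sram_gb": 18}
wse2 = {"transistors": 2.6e12, "cores": 850_000, "sram_gb": 40}

for key in wse:
    ratio = wse2[key] / wse[key]
    print(f"{key}: {wse[key]:g} -> {wse2[key]:g} ({ratio:.2f}x)")
# Each headline spec roughly doubles from WSE to WSE-2 (about 2.1-2.2x).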

CG-1

In July 2023, Cerebras introduced the Condor Galaxy 1 (CG-1), a 4 exaFLOPS supercomputer for generative AI. Built in partnership with G42, the CG-1 is the first in a planned series of nine supercomputers. Cerebras states that all nine will be completed in 2024 and that, once interconnected, they will provide 36 exaFLOPS of AI compute. CG-1 is located in Santa Clara, California, at the Colovore data center.
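
The stated aggregate for the full Condor Galaxy series follows directly from the per-system figure; the small check below uses only the numbers in the paragraph above.

# Sanity check (Python): nine interconnected 4-exaFLOPS systems are stated to
# provide 36 exaFLOPS of AI compute in aggregate.
systems = 9
exaflops_per_system = 4
print(systems * exaflops_per_system, "exaFLOPS")  # 36 exaFLOPS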

Cerebras AI Model Studio

The Cerebras AI Model Studio is a pay-by-the-model computing service powered by clusters of CS-2s and hosted by Cirrascale Cloud Services. It is a purpose-built platform for training and fine-tuning large language models on dedicated clusters of millions of cores.
