Mistral AI

Mistral AI is an AI-driven large language model (LLM) platform founded in 2023 by Arthur Mensch, Timothée Lacroix, and Guillaume Lample.

mistral.ai
Is a: Company, Organization

Company attributes

Industry: Generative AI; Software as a service (SaaS); Artificial Intelligence (AI); Machine learning
Location: Paris
B2X: B2B
CEO: Arthur Mensch
Founders: Arthur Mensch, Timothée Lacroix, Guillaume Lample
Pitchbook URL: pitchbook.com/profiles...527294-17
Number of Employees (Ranges): 11 – 500
Email Address: contact@mistral.ai
Investors: General Catalyst; JCDecaux; Sofina; Lightspeed Venture Partners; Xavier Niel; La Famiglia VC; Motier Ventures; Index Ventures; ...
Founded Date: May 2023
Total Funding Amount (USD): 5,818,000,000
Latest Funding Round Date: September 3, 2025
Competitors: OpenAI; Adept AI Labs; Google DeepMind
CTO: Timothée Lacroix
Latest Funding Type: Series B; Series D
Latest Postmoney Valuation: 2,000,000,000
Hugging Face ID: mistralai
Country: France
Headquarters: Paris

Other attributes

Contact Page URL: mistral.ai/contact/
Latest Funding Round Amount (USD): 2,200,000,000
Overview

Mistral AI is a developer of generative AI models and open-source alternatives to large language model (LLM) platforms. It offers text-based models for applications in art generation, content creation, chatbots, virtual assistants, language translation, and customer service. The company serves business clients, helping them improve processes around research and development, customer care, and marketing through new AI-powered tools. Mistral AI positions its platform as a rival to OpenAI's ChatGPT and as an answer to the public-misuse and security concerns surrounding it; the founders believe that an open-source approach mitigates that misuse potential.

Mistral AI's generative AI platform is available in early access, serving the company's open models. Mistral AI's models include the following:

  • Mistral 7B—A 7B dense transformer for a variety of use cases. Supports English and code, and an 8k context window.
  • Mixtral 8x7B—A sparse mixture-of-experts model with stronger capabilities than Mistral 7B. Uses 12B active parameters out of 45B total. Supports multiple languages, code, and a 32k context window.

Mistral AI provides two types of access to its LLMs: a pay-as-you-go API, and open weights under the Apache 2.0 license, available directly from its documentation and on Hugging Face. The Mistral AI API is in beta. Users can join the waiting list through the company's platform and gain immediate access to the chat endpoint once their subscription is active.

History

Headquartered in Paris, Mistral was founded in May 2023 by Arthur Mensch (CEO), Timothée Lacroix (CTO), and Guillaume Lample (chief science officer)—alums of Google DeepMind and Meta. Mistral AI has stated that French investment bank Bpifrance and former Google CEO Eric Schmidt are shareholders in the company.

Four weeks after its founding, in June 2023, Mistral AI raised a $113 million seed round, leading some to speculate that there was an "AI bubble," especially as the funding was raised before Mistral AI had a product or customers. The round was led by Lightspeed Venture Partners with participation from Redpoint, Index Ventures, Xavier Niel, JCDecaux Holding, Rodolphe Saadé and Motier Ventures in France, La Famiglia and Headline in Germany, Exor Ventures in Italy, Sofina in Belgium, and First Minute Capital and LocalGlobe in the UK. Sources close to the company stated the funding values Mistral AI at $260 million.

On September 27, 2023, Mistral AI released its first model, Mistral 7B, a 7.3 billion parameter model that the company stated is the most powerful model released for its size. Mistral 7B is released under the Apache 2.0 license and can be used without restrictions. Alongside the release of Mistral 7B, the company released a blog post stating its commitment to open source AI development, including the following statements:

At Mistral AI, we believe that an open approach to generative AI is necessary. Community-backed model development is the surest path to fight censorship and bias in a technology shaping our future. We strongly believe that by training our own models, releasing them openly, and fostering community contributions, we can build a credible alternative to the emerging AI oligopoly. Open-weight generative models will play a pivotal role in the upcoming AI revolution.

On December 11, 2023, Mistral AI released its new model, Mixtral 8x7B, opened beta access to its first platform services, and announced $415 million in series A funding. Mixtral 8x7B is a sparse mixture of experts model (SMoE) with open weights and licensed under Apache 2.0. Mistral AI states the model outperforms Llama 2 70B on many benchmarks with 6x faster inference and matches or outperforms GPT3.5 on most standard benchmarks. "La Plateforme," the company's commercial platform, allows developers to access, deploy, and customize Mistral AI models for production. It serves three chat endpoints for generating text following textual instructions and an embedding endpoint, with each endpoint having a different performance/price tradeoff.

The $415 million series A round was led by Andreessen Horowitz (a16z), with Lightspeed Venture Partners investing in the company again. Other investors participating in the round include Salesforce, BNP Paribas, CMA-CGM, General Catalyst, Elad Gil, and Conviction. Reports state the new funding values the company at roughly $2 billion. Other reports state the funding was closer to $450 million, with $200 million from Andreessen Horowitz and $130 million from Nvidia and Salesforce in convertible debt.

Models
Mistral 7B

Mistral 7B is a 7.3B parameter model that supports English and code, with an 8k context window. It uses a sliding window attention (SWA) mechanism, in which each layer attends to the previous 4,096 hidden states. SWA exploits the stacked layers of a transformer to attend further into the past than the window size: higher layers have access to information older than the attention pattern alone suggests. Because the attention span is fixed, the model can limit its cache using rotating buffers, halving cache memory for inference at a sequence length of 8,192 without impacting model quality.
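The two ideas above—a windowed attention mask and a fixed-size rotating cache—can be sketched in a few lines. This is a toy illustration, not Mistral's implementation: the window is shrunk to 4 for readability, and the "KV entries" are plain strings.

```python
# Minimal sketch of sliding window attention masking and a rotating
# ("rolling buffer") KV cache. Illustrative only, not Mistral's code.

def sliding_window_mask(seq_len: int, window: int) -> list[list[bool]]:
    """mask[q][k] is True when query position q may attend to key position k.
    Each position sees itself and at most `window - 1` previous positions."""
    return [
        [q - window < k <= q for k in range(seq_len)]
        for q in range(seq_len)
    ]

class RollingKVCache:
    """Fixed-size cache: position i is stored in slot i % window, so entries
    are overwritten exactly when they fall outside the attention span."""
    def __init__(self, window: int):
        self.window = window
        self.slots = [None] * window

    def put(self, position: int, kv):
        self.slots[position % self.window] = kv

    def visible(self):
        return [kv for kv in self.slots if kv is not None]

mask = sliding_window_mask(seq_len=6, window=4)
# Position 5 attends only to positions 2..5 (a window of 4).
print([k for k in range(6) if mask[5][k]])  # -> [2, 3, 4, 5]

cache = RollingKVCache(window=4)
for pos in range(6):
    cache.put(pos, f"kv{pos}")
print(len(cache.visible()))  # -> 4; memory never grows past the window
```

Because higher layers attend over lower layers' already-windowed outputs, information can still propagate further back than one window, which is the effect the paragraph above describes.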

Mistral AI states that Mistral 7B:

  • outperforms Llama 2 13B on all benchmarks,
  • outperforms Llama 1 34B on many benchmarks, and
  • approaches CodeLlama 7B performance on code while remaining good at English tasks.

The model was released under the Apache 2.0 license and can be used without restrictions. Users can download and run the model anywhere, including locally. It can be deployed on any cloud using the vLLM inference server and SkyPilot, and it is available on Hugging Face. Mistral 7B can also be fine-tuned for any task.

Mixtral 8x7B

Mixtral 8x7B is a sparse mixture of experts model (SMoE) with open weights and the following capabilities:

  • Handles a context of 32k tokens
  • Provides support for English, French, Italian, German and Spanish
  • Supports code generation
  • Fine-tunes into an instruction-following model that achieves a score of 8.3 on MT-Bench

The model uses 12 billion active parameters out of 45 billion total. The model is released under Apache 2.0, and Mistral AI states Mixtral 8x7B is the strongest open-weight model with a permissive license, outperforming Llama 2 70B on most benchmarks and matching or outperforming GPT3.5 on most standard benchmarks.
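The gap between total and active parameters follows from top-k expert routing: a small router scores all experts per token, but only the top-scoring experts actually execute. The toy sketch below shows top-2 routing over 8 experts (matching Mixtral's "8x" naming); the router weights and stand-in expert functions are invented for illustration and bear no relation to Mixtral's learned weights.

```python
import math

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def top2_moe(token_repr, router_weights, experts):
    """Route one token through the 2 highest-scoring experts and combine
    their outputs, weighted by renormalized router scores."""
    scores = [sum(w * x for w, x in zip(row, token_repr)) for row in router_weights]
    top2 = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)[:2]
    gate = softmax([scores[i] for i in top2])
    # Only these 2 of the 8 expert networks run for this token.
    return sum(g * experts[i](token_repr) for g, i in zip(gate, top2))

calls = []  # record which experts actually execute

def make_expert(k):
    def expert(x):
        calls.append(k)
        return (k + 1) * sum(x)  # stand-in for a feed-forward expert
    return expert

experts = [make_expert(k) for k in range(8)]
token = [1.0, 2.0, 0.5]
router_weights = [[0.05 * k, 0.2, -0.02 * k] for k in range(8)]
out = top2_moe(token, router_weights, experts)
print(sorted(calls))  # -> [6, 7]: only two of eight experts were evaluated
```

Since each token touches only 2 of 8 expert feed-forward blocks (plus shared layers), per-token compute tracks the ~12B active parameters rather than the 45B total.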

Comparison of LLM performance released by Mistral AI.

Mistral AI model sizes

Name              | Number of parameters | Number of active parameters | Min. GPU RAM for inference (GB)
Mistral-7B-v0.2   | 7.3B                 | 7.3B                        | 16
Mistral-8X7B-v0.1 | 46.7B                | 12.9B                       | 100
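The RAM figures in the table are consistent with a common rule of thumb: roughly two bytes per parameter for fp16 weights, plus headroom for activations and the KV cache. The 2-bytes-per-parameter assumption below is ours, not Mistral's published sizing method:

```python
def fp16_weight_gb(n_params_billion: float) -> float:
    """Approximate GB needed just to hold the weights in fp16
    (2 bytes per parameter; 1 GB = 2**30 bytes)."""
    return n_params_billion * 1e9 * 2 / 2**30

print(round(fp16_weight_gb(7.3), 1))   # -> 13.6, within the table's 16 GB
print(round(fp16_weight_gb(46.7), 1))  # -> 87.0, within the table's 100 GB
```

Note that for Mixtral the full 46.7B weights must fit in memory even though only 12.9B are active per token; sparsity saves compute, not weight storage.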

La plateforme

La Plateforme is Mistral AI's developer platform for deploying and customizing its models. At launch, it serves three chat endpoints for generating text from textual instructions and one embedding endpoint, each with a different performance/price tradeoff. The first two generative endpoints—mistral-tiny and mistral-small—use Mistral 7B and Mixtral 8X7B, respectively. The third, mistral-medium, uses a prototype model with higher performance. Mistral-embed, the embedding endpoint, serves an embedding model with a 1024-dimensional embedding.


Further Resources

  • 4-Week-Old AI Startup Mistral AI Raises $113 Million Despite No Product And Little Staff (Web, June 22, 2023): https://finance.yahoo.com/news/4-week-old-ai-startup-170118438.html
  • A.I. company raises record $113 million just a month after being founded--despite having no product ready and only just hiring staff (Web): https://fortune.com/2023/06/14/mistral-ai-startup-record-113-million-seed-round-arthur-mensch/
  • Deal focus: French start-up Mistral AI's record $113m seed funding round, by Lara Williams (Web, June 22, 2023): https://www.verdict.co.uk/deal-focus-french-start-up-mistral-ais-record-113-seed-funding-round/
  • France's Mistral AI raises 105 million euros shortly after being set up (Web, July 25, 2023): https://www.reuters.com/technology/french-company-mistral-ai-raises-105-mln-euros-shortly-after-being-set-up-2023-06-13/
