PaLM 2

PaLM 2 is a large language model from Google with improved reasoning, multilingual proficiency, and natural language generation compared to its predecessor, PaLM.

ai.google/discover/palm2/

Is a: Product, Software

Product attributes

Industry: Generative AI, Natural language processing (NLP), Deep learning
Launch Date: May 10, 2023
Product Parent Company: Google
Competitors: GPT-4

Other attributes

Announcement URL: blog.google/technolog...age-model/
Overview

PaLM 2 is a large language model (LLM) from Google introduced on May 10, 2023, at the company's I/O conference. PaLM 2 is the successor to the Pathways Language Model (PaLM), released in April 2022 as part of Google's Pathways vision (building a single, highly efficient model that can generalize across domains and tasks). PaLM 2 is faster and more efficient than its predecessor, with additional capabilities in reasoning tasks, including code and math, classification and question answering, translation and multilingual proficiency, and natural language generation. These improvements are made possible by three key LLM research advancements:

  • Compute-optimal scaling—increasing both the model size and the training dataset size in proportion to each other. This technique makes PaLM 2 smaller than PaLM, but more efficient with overall better performance, including faster inference, fewer parameters to serve, and a lower serving cost.
  • Dataset mixture—PaLM 2 improves its training corpus with a more multilingual and diverse mixture, which includes hundreds of human and programming languages, mathematical equations, scientific papers, and web pages.
  • Updated model architecture and objective—PaLM 2 has an improved architecture and was trained on a variety of different tasks.
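
The compute-optimal scaling idea in the first bullet can be sketched numerically. The sketch below assumes the commonly cited approximation that training compute C scales as roughly 6 × N × D for N parameters and D training tokens, and uses a purely illustrative params-per-token ratio; neither constant is PaLM 2's actual figure:

```python
def compute_optimal_split(compute_budget: float) -> tuple[float, float]:
    """Split a training-compute budget between parameters and tokens.

    Uses the common approximation C ~= 6 * N * D together with the
    compute-optimal rule that N and D should grow in proportion to
    each other (N = ratio * D). The ratio here is illustrative only.
    """
    ratio = 1 / 20  # hypothetical params-per-token ratio, not PaLM 2's
    # C = 6 * N * D with N = ratio * D  =>  D = sqrt(C / (6 * ratio))
    tokens = (compute_budget / (6 * ratio)) ** 0.5
    params = ratio * tokens
    return params, tokens

# Under proportional scaling, doubling compute scales both the
# parameter count and the token count by sqrt(2), not by 2.
p1, t1 = compute_optimal_split(1e21)
p2, t2 = compute_optimal_split(2e21)
```

The point of the rule is visible in the last two lines: extra compute is spent on both a bigger model and more data in equal proportion, rather than on model size alone, which is how PaLM 2 can be smaller than PaLM yet perform better.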

Introduction to PaLM 2

Enhanced capabilities of PaLM 2 include the following:

  • Reasoning—improved capabilities in logic, common-sense reasoning, and mathematics. PaLM 2 decomposes complex tasks into simpler subtasks and has a better understanding of nuances within human languages, including riddles and idioms that require interpreting ambiguous or figurative language rather than literal meaning. PaLM 2 demonstrates high performance on reasoning benchmarks such as WinoGrande and BIG-Bench Hard.
  • Multilinguality—the model was pre-trained on parallel multilingual text from a much larger corpus of natural languages (over one hundred). This improves its ability to understand, generate, and translate nuanced text. PaLM 2 passes advanced language proficiency exams at the “mastery” level; it achieves better results than its predecessor on benchmarks such as XSum, WikiLingua, and XLSum. PaLM 2 also improves translation capability over PaLM and Google Translate in languages like Portuguese and Chinese.
Example of PaLM 2 multilingual reasoning, understanding an idiom in the German language.

  • Coding—PaLM 2 was pre-trained on a large quantity of webpages, source code, and other datasets. It excels at over twenty programming languages, including popular ones like JavaScript and Python, as well as more specialized languages such as Prolog, Verilog, and Fortran.

PaLM 2 was built on top of Google's JAX and TPU v4 infrastructure. The model is available in four sizes—Gecko, Otter, Bison, and Unicorn, from smallest to largest. Gecko is lightweight enough to run on mobile devices. The largest model in the PaLM 2 family is significantly smaller than the largest PaLM model but uses more training compute. Upon its release, Google announced over twenty-five products and features already powered by PaLM 2, including Bard, the company's chatbot, and features for Google Workspace apps like Docs, Slides, and Sheets. Other examples of PaLM 2 Google products include the following:

  • Med-PaLM 2—trained by Google's health research teams with medical knowledge, Med-PaLM 2 can answer questions and summarize insights from a variety of medical texts. It achieves state-of-the-art results in medical competency, becoming the first LLM to perform at an “expert” level on U.S. Medical Licensing Exam-style questions. Google plans to add multimodal capabilities to synthesize information such as x-rays and mammograms, and to open Med-PaLM 2 to a small group of cloud customers in the summer of 2023.
  • Sec-PaLM—a specialized version of PaLM 2 trained on security use cases for cybersecurity analysis. Available through Google Cloud, Sec-PaLM uses AI to analyze and explain the behavior of potentially malicious scripts and to better detect which scripts actually pose a threat.
  • PaLM 2 forms the basis of Codey, Google’s specialized model for coding and debugging. Codey launched on the same day as PaLM 2—May 10, 2023.

PaLM 2 is available through the PaLM API (which Google has been previewing with a small group of developers since March 2023), Vertex AI, and Duet AI for Google Workspace, a generative AI collaborator.

History

On October 28, 2021, Google introduced Pathways, a new AI architecture for handling many tasks across different modalities at once and learning new tasks faster. Google developed Pathways so that a single model could be trained to perform thousands or millions of tasks, rather than training a new model from scratch for each new problem. Pathways is also more efficient than previous architectures, which would typically activate the entire neural network to accomplish each task, regardless of its complexity. On March 23, 2022, Google released a paper describing the architecture, titled "Pathways: Asynchronous Distributed Dataflow for ML."

On April 4, 2022, Google Research revealed Pathways Language Model (PaLM), its first LLM trained with the Pathways system. A 540-billion parameter, decoder-only Transformer model, PaLM was trained across multiple TPU v4 Pods. A day later, Google released a paper describing the new LLM titled "PaLM: Scaling Language Modeling with Pathways."

On May 10, 2023, Google CEO Sundar Pichai introduced PaLM 2 onstage at the company's I/O conference. The release of PaLM 2 was accompanied by a technical report with details on the new LLM.

Announcement of PaLM 2 at Google I/O 2023.

Training

PaLM 2's pre-training corpus is composed of a diverse set of sources: web documents, books, code, mathematics, and conversational data. The corpus is significantly larger and includes a higher percentage of non-English data than the one used to train PaLM. In addition, PaLM 2 was trained on parallel data covering hundreds of languages, in the form of source-and-target text pairs where one side is in English. The inclusion of parallel multilingual data further improves the model's ability to understand and generate multilingual text. It also builds an inherent translation ability into the model, which can be useful for various tasks.
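
One common way to fold such parallel data into a text-only pre-training corpus is to serialize each source/target pair into a single tagged sequence. The bracket-tag scheme below is a made-up illustration, not the format the PaLM 2 report specifies:

```python
def serialize_parallel_pair(src_lang: str, src_text: str,
                            tgt_lang: str, tgt_text: str) -> str:
    """Turn one translation pair into a single training string.

    One side of every pair is English, matching the PaLM 2 training
    setup described above; the [lang] tag format is hypothetical.
    """
    return f"[{src_lang}] {src_text} [{tgt_lang}] {tgt_text}"

pairs = [
    ("en", "How are you?", "de", "Wie geht es dir?"),
    ("pt", "Bom dia.", "en", "Good morning."),
]
examples = [serialize_parallel_pair(*p) for p in pairs]
```

Serialized this way, translation pairs need no special training objective: the ordinary next-token prediction loss teaches the model to continue a source-language segment with its target-language counterpart.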

The language distribution of web documents excluding English.

Google utilized several data cleaning and quality filtering methods, including de-duplication and the removal of sensitive, personally identifiable information.
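
The report does not spell out the exact cleaning pipeline, but a minimal exact-match de-duplication pass over a document stream could look like the following hash-based sketch (the normalization choices here are assumptions):

```python
import hashlib

def deduplicate(documents: list[str]) -> list[str]:
    """Drop exact duplicates from a document stream.

    Normalizes whitespace and case before hashing, so trivially
    reformatted copies of the same text count as duplicates. Real
    corpus pipelines typically add fuzzy methods (e.g. MinHash) on
    top of exact matching; this sketch covers exact matching only.
    """
    seen: set[str] = set()
    unique: list[str] = []
    for doc in documents:
        normalized = " ".join(doc.lower().split())
        digest = hashlib.sha256(normalized.encode("utf-8")).hexdigest()
        if digest not in seen:
            seen.add(digest)
            unique.append(doc)
    return unique

docs = ["PaLM 2 is an LLM.", "palm 2  is an llm.", "A different document."]
kept = deduplicate(docs)
```

Hashing normalized text keeps memory proportional to the number of distinct documents rather than their total size, which matters at web-corpus scale.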

Evaluation

Google evaluated the performance of three PaLM 2 variants (small, medium, and large) for a number of tasks, including the following:

  • Language proficiency
  • Classification and question answering
  • Reasoning
  • Coding
  • Translation
  • Natural language generation
  • Memorization

Evaluations include exams designed for humans and standard academic machine-learning benchmarks. Generally, the model was evaluated in a few-shot, in-context learning setting, with the model given a short prompt and, optionally, a few examples of the task. Google reports strong quality improvements across all areas.
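
A few-shot, in-context prompt of the kind described above can be assembled mechanically from labeled examples. The Q/A template below is a generic illustration, not the exact format used in the PaLM 2 evaluations:

```python
def build_few_shot_prompt(instruction: str,
                          examples: list[tuple[str, str]],
                          query: str) -> str:
    """Assemble a few-shot prompt: an instruction, a few worked
    examples, then the unanswered query. The model is expected to
    continue the text after the final 'A:'."""
    lines = [instruction]
    for question, answer in examples:
        lines.append(f"Q: {question}\nA: {answer}")
    lines.append(f"Q: {query}\nA:")
    return "\n\n".join(lines)

prompt = build_few_shot_prompt(
    "Answer each question.",
    [("What is 2 + 2?", "4"), ("What is 3 * 3?", "9")],
    "What is 5 - 1?",
)
```

The zero-shot case is the same template with an empty examples list, which is why the source describes the examples as optional.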

Further Resources

  • Introducing PaLM 2. Zoubin Ghahramani. Web, May 10, 2023. https://blog.google/technology/ai/google-palm-2-ai-large-language-model/
  • PaLM 2 Technical Report. Google. Report, May 10, 2023. https://ai.google/static/documents/palm2techreport.pdf
  • PaLM: Scaling Language Modeling with Pathways. Aakanksha Chowdhery et al. Paper, April 5, 2022. https://arxiv.org/abs/2204.02311
  • Pathways: Asynchronous Distributed Dataflow for ML. Paul Barham et al. Paper, March 23, 2022. https://arxiv.org/abs/2203.12533
