LangChain is an open-source framework for developing applications powered by language models.
LangChain supports a number of language models, including those from prominent AI platforms such as OpenAI, Hugging Face, and Anthropic. The framework provides APIs to access and interact with LLMs, as well as an array of tools, components, and interfaces to help the development process. LangChain also has extensive documentation to help users become familiar with the framework.
LangChain offers standard, extendable interfaces and external integrations for the following modules:
LangChain enables access to a range of pre-trained models that generate outputs based on the prompt and input provided. These models can be further fine-tuned to match the specific needs of the user. The three types of models LangChain works with are:
The prompt or input is rarely hard-coded, instead being constructed from multiple components. A prompt template is responsible for the construction of the input, with LangChain providing several classes and functions to simplify working with prompts. The template defines the structure of the prompt, including the format and content. The prompt can be divided into four sections:
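The idea behind a prompt template can be sketched in a few lines of plain Python. This is a toy illustration of the concept only, not LangChain's actual `PromptTemplate` API, although the class name echoes it; the example template text is hypothetical:

```python
# Toy sketch of a prompt template: a format string plus declared input
# variables, kept separate from the user input that fills them.
class PromptTemplate:
    def __init__(self, template: str, input_variables: list[str]):
        self.template = template
        self.input_variables = input_variables

    def format(self, **kwargs: str) -> str:
        # Constructing the input: check all declared variables are
        # supplied, then substitute them into the template.
        missing = set(self.input_variables) - kwargs.keys()
        if missing:
            raise KeyError(f"missing prompt variables: {sorted(missing)}")
        return self.template.format(**kwargs)

template = PromptTemplate(
    template="You are a helpful assistant.\nAnswer the question: {question}",
    input_variables=["question"],
)
prompt = template.format(question="What is LangChain?")
```

The point of the abstraction is that the fixed structure (instructions, format) lives in the template, while only the variable parts are supplied at call time.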
LangChain offers functionality to load, transform, store, and query a user's data via the following:
Some generative AI applications require more than a predetermined chain of calls to LLMs and other tools; they may need a chain that is not known in advance and depends on the user's input. These applications require an "agent" that has access to a suite of tools and can decide which to call. The agent acts as a "wrapper" around a model: it takes user input and returns a response specifying an "action" to take and a corresponding "action input." LangChain provides a dedicated module for building agents, which can be used to create chatbots or personal assistants.
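The action / action-input loop described above can be illustrated with a minimal sketch. The tool names and the hard-coded decision rule are hypothetical stand-ins; a real agent would prompt an LLM to choose the action:

```python
# Minimal agent-loop sketch: a stubbed "model" decides which tool to
# call, and the agent dispatches the action with its action input.
def calculator(expression: str) -> str:
    return str(eval(expression))  # toy tool; never eval untrusted input

def search(query: str) -> str:
    return f"(stub search results for {query!r})"

TOOLS = {"calculator": calculator, "search": search}

def fake_model(user_input: str) -> dict:
    # Stand-in for an LLM call: returns an "action" and "action input".
    if any(ch.isdigit() for ch in user_input):
        return {"action": "calculator", "action_input": user_input}
    return {"action": "search", "action_input": user_input}

def run_agent(user_input: str) -> str:
    step = fake_model(user_input)          # model picks the action
    tool = TOOLS[step["action"]]           # agent looks up the tool
    return tool(step["action_input"])      # and invokes it

result = run_agent("2 + 3")
```

The key point is that the sequence of calls is not fixed ahead of time: the model's output determines which tool runs.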
Memory is the concept of storing and retrieving data during a conversation. By default, chains and agents are stateless; they react to each new input query independently of prior inputs. Memory allows them to recall previous interactions with users, providing them with more personalized and contextualized responses. The two main methods are based on input (fetch any relevant pieces of data) or on input and output (update the state accordingly). The two main types of memory are short term and long term. Short term typically refers to how data is passed in the context of a singular conversation. Long term deals with fetching and updating information between conversations.
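A toy sketch of short-term (buffer) memory makes the two methods above concrete: fetch relevant data before the model call, and update the state after it. The class and method names are illustrative, not LangChain's API:

```python
# Toy conversation memory: a chain is stateless by itself, so prior
# turns are stored here and prepended to each new prompt.
class ConversationMemory:
    def __init__(self):
        self.turns: list[tuple[str, str]] = []

    def load(self) -> str:
        # Input-based method: fetch relevant data before the model call.
        return "\n".join(f"Human: {h}\nAI: {a}" for h, a in self.turns)

    def save(self, user_input: str, output: str) -> None:
        # Input-and-output method: update the state after the call.
        self.turns.append((user_input, output))

memory = ConversationMemory()
memory.save("My name is Ada.", "Nice to meet you, Ada!")
# The next prompt carries the history, so the model can recall it:
prompt = memory.load() + "\nHuman: What is my name?\nAI:"
```

Long-term memory works the same way in principle, except the store persists between conversations (e.g. a database) rather than living in a single object.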
LangChain users can construct a chatbot from an LLM or chat model, a prompt template that guides how the chatbot acts, and memory to ensure the model changes its outputs based on previous interactions. LangChain-produced chatbots can be differentiated by combining them with other sources of data, using techniques similar to the question-answering-over-documents use case.
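Those three pieces can be wired together in a minimal sketch. The model here is a stub and all names are illustrative; the point is only how template, model, and memory compose into a chatbot:

```python
# Sketch of a chatbot assembled from a (stubbed) chat model, a prompt
# template, and a memory of past turns.
HISTORY: list[str] = []

def fake_chat_model(prompt: str) -> str:
    # Stand-in for a real chat model call: echoes the last human turn.
    return "Noted: " + prompt.rsplit("Human: ", 1)[-1].removesuffix("\nAI:")

def chat(user_input: str) -> str:
    history = "\n".join(HISTORY)
    # Prompt template: history (if any) followed by the new turn.
    prompt = (f"{history}\nHuman: {user_input}\nAI:"
              if history else f"Human: {user_input}\nAI:")
    reply = fake_chat_model(prompt)
    HISTORY.append(f"Human: {user_input}")   # memory update
    HISTORY.append(f"AI: {reply}")
    return reply

first = chat("Hello")
second = chat("Remember me?")
```

Because each prompt is rebuilt from memory, the second call sees the first exchange even though the model function itself is stateless.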
With the widespread availability of new generative AI models, LangChain has become a popular framework to chain AI functionality and build new applications. LangChain is being used across a large number of projects, including the following:
LangChain provides a series of steps that generate text from input prompts. These steps comprise six main components: models, prompts, indexes, memory, chains, and agents.
LangChain provides a callbacks system, allowing users to hook into the various stages of their LLM application. This can be useful for logging, monitoring, streaming, and other tasks. Users access these events through the callback argument available throughout the API.
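A callbacks system of this shape can be sketched in a few lines. The handler method names below loosely mirror common LLM-callback events but are illustrative, not LangChain's exact interface; the streaming model is stubbed:

```python
# Sketch of a callbacks system: handlers hook into stages of an LLM
# run (start, each streamed token, end) for logging or streaming.
class LoggingCallback:
    def on_llm_start(self, prompt: str) -> None:
        print(f"[start] prompt={prompt!r}")

    def on_llm_new_token(self, token: str) -> None:
        print(f"[token] {token}")

    def on_llm_end(self, output: str) -> None:
        print(f"[end] output={output!r}")

def run_llm(prompt: str, callbacks: list) -> str:
    for cb in callbacks:
        cb.on_llm_start(prompt)
    output = ""
    for token in ["Hello", ", ", "world"]:  # stubbed streaming model
        output += token
        for cb in callbacks:
            cb.on_llm_new_token(token)
    for cb in callbacks:
        cb.on_llm_end(output)
    return output

result = run_llm("Say hello", [LoggingCallback()])
```

Because the hooks are just a list of handlers passed as an argument, logging, monitoring, and streaming concerns stay out of the core run logic.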
LangChain offers various resources for querying information stored in tabular data, including CSVs, Excel sheets, and SQL tables. This includes document loading, indexing, and querying using language models to interact with the data, as well as chains and agents to perform more advanced tasks.
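The SQL-table case can be sketched with the standard library. In LangChain, a model would be prompted with the table schema to write the SQL; here that translation step is stubbed with one hard-coded question, and the table and question are hypothetical:

```python
# Sketch of querying tabular data: a natural-language question is
# turned into SQL (stubbed here; a real chain would ask an LLM) and
# executed against the table.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("north", 120.0), ("south", 80.0)])

def question_to_sql(question: str) -> str:
    # Stand-in for the LLM step: handles one hard-coded question.
    assert question == "total sales by region"
    return "SELECT region, SUM(amount) FROM sales GROUP BY region"

rows = conn.execute(question_to_sql("total sales by region")).fetchall()
```

The design point is the separation: the language model only produces a query; executing it and returning rows is ordinary database plumbing.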
Indexes structure documents such that LLMs can interact best with them. LangChain provides functions for working with documents, different types of indexes, and then examples for using those indexes in chains. The most common method of using indexes in chains is in a retrieval step, taking a user's query and returning the most relevant documents. Typically, this refers to retrieving unstructured documents, such as text documents. The primary index and retrieval types supported by LangChain are centered around vector databases.
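The retrieval step over a vector index can be sketched end to end. The letter-frequency "embedding" below is a deliberately crude stand-in for a real embedding model, and the documents are hypothetical:

```python
# Sketch of index-backed retrieval: documents are embedded as vectors
# and the ones closest to the query embedding are returned.
import math

def embed(text: str) -> list[float]:
    # Toy embedding via letter frequencies; a real system would call
    # an embedding model instead.
    counts = [text.lower().count(c) for c in "abcdefghijklmnopqrstuvwxyz"]
    norm = math.sqrt(sum(c * c for c in counts)) or 1.0
    return [c / norm for c in counts]

def cosine(a: list[float], b: list[float]) -> float:
    return sum(x * y for x, y in zip(a, b))

documents = [
    "LangChain composes calls to language models",
    "The weather today is sunny",
]
index = [(doc, embed(doc)) for doc in documents]  # the "vector database"

def retrieve(query: str, k: int = 1) -> list[str]:
    q = embed(query)
    ranked = sorted(index, key=lambda item: cosine(q, item[1]), reverse=True)
    return [doc for doc, _ in ranked[:k]]

top = retrieve("language models")
```

A chain would then pass the retrieved documents to the model as context, rather than the whole corpus.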
A chain is a generic concept referring to a sequence of modular components (or other chains) combined in a particular way to accomplish a common use case. The most commonly used chain is an LLM chain, which combines a prompt template, a model, and guardrails to take user input, format it accordingly, pass it to the model, get a response, and then validate and fix the model output if necessary.
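The LLM-chain pattern just described reduces to function composition. All names here are illustrative and the model is stubbed; this is a sketch of the idea, not LangChain's chain classes:

```python
# Sketch of an LLM chain: prompt template -> model -> guardrail,
# composed into one callable unit.
def format_prompt(user_input: str) -> str:
    # Prompt-template step (hypothetical template text).
    return f"Summarize in one sentence: {user_input}"

def fake_model(prompt: str) -> str:
    # Stand-in for a real LLM call.
    return "A one-sentence summary"

def validate(output: str) -> str:
    # Guardrail step: fix the output if it breaks a simple rule,
    # e.g. ensure it ends with a period.
    return output if output.endswith(".") else output + "."

def llm_chain(user_input: str) -> str:
    return validate(fake_model(format_prompt(user_input)))

result = llm_chain("LangChain composes modular components")
```

Because each step has the same string-in/string-out shape, chains can be nested: the whole `llm_chain` can itself become one component of a larger chain.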
Personal assistants can be built upon the following LangChain components:
LangChain allows users to provide documents for LLMs to ingest for question answering. Users can create an index over the data the documents contain to save time when retrieving information.
LangChain is an open-source framework for developing applications powered by large language models (LLMs). The framework combines LLMs with other sources of computation and knowledge to help develop new applications that are data-aware (connecting a language model with other sources of information) and agentic (allowing a language model to interact dynamically with its environment). LangChain is available as both a Python and a TypeScript package.
LangChain aims to streamline the development of a wide range of applications, including chatbots, generative question answering, and summarization. It does this by "chaining" components from multiple modules to build more advanced LLM use cases.
LangChain was created by Harrison Chase, with the first version released on October 24, 2022. In a tweet thread upon its release, Chase described LangChain as:

a python package aimed at helping build LLM applications through composability... The real power comes when you are able to combine [LLMs] with other things... LangChain aims to help with that by creating… a comprehensive collection of pieces you would ever want to combine… a flexible interface for combining pieces into a single comprehensive ‘chain’
LangChain was initially developed by Chase while he was working at Robust Intelligence, an MLOps company testing and validating machine learning models, where he led the ML team. Prior to that, he led the entity linking team at Kensho (a fintech startup) and attended Harvard University, studying statistics and computer science.
LangChain began as an open-source side project with no intention of starting a company. Chase saw common patterns in how people were approaching problems with language models, and LangChain is an attempt to create abstractions that address them. With the project growing significantly, Chase co-founded a company to develop the framework with Ankush Gola in January 2023. Gola is a software engineer who previously worked at Unfold, Robust Intelligence, and Facebook.
On February 17, 2023, LangChain released support for TypeScript, allowing users to recreate applications in TypeScript natively. The TypeScript package mirrors the Python package as closely as possible, utilizing the same serializable format so that artifacts can be shared between languages. LangChain initially chose Python, as it is popular among research-oriented machine learning communities. However, as interest in the project grew, it was being used by people across the stack, many of whom prefer JavaScript.
By early April 2023, the project had grown to over 20,000 stars on GitHub, 10,000 active Discord members, over 30,000 followers on Twitter, and over 350 contributors. On April 4, 2023, LangChain publicly announced it had raised $10 million in seed funding. The round was led by Benchmark, which will also provide LangChain with counsel. Benchmark has previously been the first lead investor in major open-source projects such as Docker, Confluent, Elastic, and ClickHouse.
On April 11, 2023, LangChain announced support for running LangChain.js in browsers, Cloudflare Workers, Vercel/Next.js, Deno, and Supabase Edge Functions, alongside existing support for Node.js ESM and CJS. LangChain.js was originally designed to run in Node.js; the team then collected feedback from the LangChain community to determine which other JS runtimes the framework should support.
Shortly after its seed round, on April 13, 2023, Business Insider reported that LangChain had raised between $20 million and $25 million in funding from Sequoia, giving the company a valuation of at least $200 million. The deal was led by Sonya Huang, a growth investor known for her work in generative AI. Sequoia avoided a formal fundraising process by pre-empting the round.
Basic data types and schema used throughout the LangChain codebase include the following:
LangChain can help with a number of end-to-end use cases, including those below:
A comprehensive list of LangChain integrations can be found in their documentation. A short list of key integrations includes the following:
At a high level, it is possible to separate the models LangChain interacts with into:
LangChain integrates with a number of LLMs, systems, and products to help developers build applications using the environment of their choice. These integrations can be grouped by the core LangChain modules they map to:
April 11, 2023: LangChain.js gained support for runtimes beyond Node.js. These were chosen based on feedback from the LangChain community.
February 17, 2023: LangChain released TypeScript support. The TypeScript package mirrors the Python package as closely as possible, utilizing the same serializable format such that artifacts can be shared between languages.
April 4, 2023: LangChain publicly announced it had raised $10 million in seed funding.
January 2023: Chase co-founded the LangChain company with Ankush Gola. Up to this point, LangChain had been an open-source side project started by Chase with a number of other contributors.
October 24, 2022: LangChain was released as an open-source Python package to help people build LLM applications through composability.