OpenAI


OpenAI is an AI research company, discovering and enacting the path to safe artificial general intelligence.

OpenAI is an artificial general intelligence research company founded in 2015 by Elon Musk, Greg Brockman, Ilya Sutskever, Sam Altman, and Wojciech Zaremba in San Francisco, California, United States. It was originally structured as a non-profit; in 2019 it formed a "capped-profit" entity, OpenAI LP, to take on outside investment.

OpenAI's mission is to build safe AGI and to ensure that AGI's benefits are as widely and evenly distributed as possible. The organization was founded on the belief that the first AGIs will have an impact that greatly exceeds that of preceding AI technologies.

OpenAI is a research company with a full-time staff of 60 researchers and engineers dedicated to working towards the organization's mission. Their focus is on long-term research - working on problems that require fundamental advances in AI capabilities. By being at the forefront of the field, OpenAI aims to influence the conditions under which AGI is created. As Alan Kay said, "The best way to predict the future is to invent it." Because the conditions under which AGI is created matter to the organization, it plans to spend its committed funding, which totals $1 billion, slowly.

Much of OpenAI's research is published at top machine learning conferences. The organization also contributes open-source software tools for accelerating AI research and releases blog posts to communicate its research to others in the field. OpenAI does not believe in keeping information private for private benefit, although it expects to create formal processes for keeping technologies private when there are safety concerns. The company is known for its openness to collaboration with other organizations and researchers, and it keeps its patents and findings accessible to the public.

Multi-Agent Interactions

OpenAI has developed AI agents that trained to play hide-and-seek against one another, with one team of hiders and one team of seekers. The agents were trained without explicit incentives beyond a simple objective: hiders are rewarded for staying out of the seekers' line of sight, and seekers are rewarded for keeping the hiders in view. The hiders eventually learned to find objects and use them to block the seekers, and both teams continued to discover new strategies in response to one another. These emergent strategies were the result of an autocurriculum. The company believes that such multi-agent dynamics can lead to complex, human-like behavior.
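
The reward structure described above can be sketched in a few lines. The following is an illustrative example rather than OpenAI's actual training code: the `visible` line-of-sight test and the example positions are hypothetical, and the real agents were trained at scale with multi-agent reinforcement learning and self-play.

```python
# Minimal sketch of the hide-and-seek reward structure described above.
# This is NOT OpenAI's code: `visible` is a hypothetical line-of-sight predicate.

def team_rewards(hiders, seekers, visible):
    """Return (hider_reward, seeker_reward) for one timestep."""
    any_hider_seen = any(visible(seeker, hider)
                         for seeker in seekers for hider in hiders)
    hider_reward = -1.0 if any_hider_seen else 1.0   # hiders want to stay hidden
    seeker_reward = -hider_reward                     # seekers want the opposite (zero-sum)
    return hider_reward, seeker_reward

# Example usage with a trivial stand-in for line of sight:
if __name__ == "__main__":
    positions = {"h1": (0, 0), "s1": (5, 0)}
    in_sight = lambda s, h: abs(positions[s][0] - positions[h][0]) < 3
    print(team_rewards(["h1"], ["s1"], in_sight))     # -> (1.0, -1.0): hider is out of range
```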

MuseNet

MuseNet is a deep neural network capable of composing four-minute musical pieces with up to ten different instruments, in a range of genres. The software was not programmed with an explicit understanding of music; instead, it learned by predicting the next note across a large collection of MIDI files. To train it, the organization used data from ClassicalArchives and BitMidi.
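
As an illustration of what learning "by predicting the next note" involves, the sketch below shows a next-event prediction objective over tokenized MIDI data in PyTorch. It is not MuseNet itself, which is a large transformer; the vocabulary size, small recurrent model, and random tokens here are stand-ins used only to show the training signal.

```python
# Hypothetical sketch of a next-event prediction objective over tokenized MIDI data.
# MuseNet itself is a large transformer; the small model and random tokens here are
# stand-ins used only to show the training signal: predict event t+1 from events 1..t.
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB = 4096  # assumed size of a MIDI event vocabulary (notes, instruments, timing)

class NextEventModel(nn.Module):
    def __init__(self, vocab=VOCAB, d_model=256):
        super().__init__()
        self.embed = nn.Embedding(vocab, d_model)
        self.rnn = nn.GRU(d_model, d_model, batch_first=True)
        self.head = nn.Linear(d_model, vocab)

    def forward(self, tokens):                 # tokens: (batch, seq_len) of event ids
        hidden, _ = self.rnn(self.embed(tokens))
        return self.head(hidden)               # logits for the next event at each position

model = NextEventModel()
events = torch.randint(0, VOCAB, (8, 128))     # stand-in for tokenized MIDI sequences
logits = model(events[:, :-1])                 # predict from all but the last event
loss = F.cross_entropy(logits.reshape(-1, VOCAB), events[:, 1:].reshape(-1))
loss.backward()                                # one gradient step of the next-event objective
```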

GPT-3

GPT-3 is an autoregressive language model with 175 billion parameters, made by OpenAI and introduced in May 2020. The model builds on OpenAI's earlier transformer-based language model, GPT-2, which had 1.5 billion parameters. GPT-3 adopts and scales up architectural features of GPT-2 such as modified initialization, pre-normalization, and reversible tokenization. The GPT-3 work shows that scaling up language models substantially improves task-agnostic, few-shot performance. OpenAI claims that on some tasks GPT-3 approaches the performance of state-of-the-art fine-tuned systems and that it can generate high-quality text samples.
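
"Few-shot" here means that the model is given a handful of worked examples inside its prompt and asked to continue the pattern, with no gradient updates or fine-tuning. A hypothetical prompt for an English-to-French translation task could be assembled like this (the examples and formatting are illustrative):

```python
# Hypothetical illustration of few-shot prompting: the "training" examples live in the
# prompt itself, and the model is asked to continue the pattern with no weight updates.

examples = [
    ("cheese", "fromage"),
    ("house", "maison"),
    ("sea otter", "loutre de mer"),
]
query = "peppermint"

prompt = "Translate English to French.\n\n"
prompt += "\n".join(f"{en} => {fr}" for en, fr in examples)
prompt += f"\n{query} =>"

print(prompt)
# The model is then asked to complete the text after the final "=>".
```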

According to the GPT-3 GitHub repository, the model achieves strong performance on NLP tasks such as translation, question answering, and cloze tasks. It can also perform on-the-fly reasoning and domain-adaptation tasks such as unscrambling words, using novel words in sentences, and 3-digit arithmetic. At the same time, GPT-3's few-shot performance still struggles on some datasets, and the model faces methodological issues related to training on large web corpora. GPT-3 was also found to be capable of producing news articles that human readers have difficulty distinguishing from articles written by humans.
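
A task such as 3-digit arithmetic can be posed in the same few-shot style through OpenAI's API. The snippet below is a sketch based on the openai Python client's Completion endpoint as it existed around the time this article was written; the engine name, parameters, and response format are assumptions and may differ in later versions.

```python
# Sketch of posing few-shot 3-digit arithmetic to GPT-3 through the OpenAI API.
# Assumptions: the `openai` Python package's Completion endpoint (circa 2020-2021),
# an API key in OPENAI_API_KEY, and the "davinci" engine name; all may have changed.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

prompt = (
    "Q: What is 248 plus 351?\nA: 599\n\n"
    "Q: What is 417 plus 132?\nA: 549\n\n"
    "Q: What is 523 plus 164?\nA:"
)

response = openai.Completion.create(
    engine="davinci",      # assumed engine name
    prompt=prompt,
    max_tokens=5,
    temperature=0,         # deterministic completion for an arithmetic answer
)
print(response.choices[0].text.strip())   # expected to print "687"
```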

Timeline

October 1, 2019 - OpenAI attended the Grace Hopper Celebration of Women in Computing 2019 as a sponsor
2019 - OpenAI forms OpenAI LP as a "capped-profit" company, with $1 billion in funding from Microsoft
August 22, 2016 - OpenAI attended Y Combinator Summer 2016 Demo Day as an exhibitor

Funding rounds

OpenAI Corporate funding round, July 2019: $100,000,000 (July 2019)

People

Ankur Handa - Research Scientist
Dario Amodei - Research Scientist
Elon Musk - Co-Founder
Greg Brockman - Co-Founder, CTO, Chairman
Ilya Sutskever - Co-Founder, Research Director
Jack Clark - Strategy & Communications Director
Jonas Schneider - Engineering Lead, Robotics
Sam Altman - Co-Founder, CEO
Wojciech Zaremba - Co-Founder

Further reading

Emergent Tool Use from Multi-Agent Interaction - Bowen Baker - Web - September 17, 2019
MuseNet - Christine Payne - Web - April 25, 2019

Documentaries, videos and podcasts

OpenAI YouTube Channel

Companies

OpenAI - CEO: Sam Altman - San Francisco, California, United States - Researching Artificial General Intelligence

News

Will Knight, Wired (January 26, 2021): DALL-E drew laughs for creating images of a daikon radish in a tutu. But it builds on an important advance in computer vision with serious applications.
Will Knight, Wired (January 15, 2021): A Harvard medical student submitted auto-generated comments to Medicaid; volunteers couldn't distinguish them from those penned by humans.
Adrian Potoroaca, TechSpot (January 6, 2021): OpenAI is known for developing impressive AI models like GPT-2 and GPT-3, which are capable of writing believable fake news but can also become essential tools in...
Jane Wakefield, BBC News (January 6, 2021): Dall-E, the latest system from OpenAI, can create images from simple phrases.
Devin Coldewey, TechCrunch (January 5, 2021): OpenAI's latest strange yet fascinating creation is DALL-E, which by way of hasty summary might be called "GPT-3 for images." It creates illustrations, photos, or renders of anything you can intelligibly describe, from "a cat wearing a bow tie" to "a daikon radish in a tutu walking a dog." [...]
Will Heaven, MIT Technology Review (January 5, 2021): OpenAI has extended GPT-3 with two new models that combine NLP with image recognition to give its AI a better understanding of everyday concepts.
Chris Stokel-Walker, New Scientist (January 5, 2021): An OpenAI neural network creates outlandish images - armchairs shaped like avocados or dinosaurs in tuxedos - from a few words of text.
FinSMEs, FinSMEs (December 1, 2020).
Matt Weinberger and Avery Hartmans, Business Insider (November 24, 2020): As CEO of SpaceX and Tesla and founder of The Boring Company and Neuralink, Elon Musk seems to be everywhere all at once.
Devin Coldewey, TechCrunch (November 12, 2020): OthersideAI is taking the idea of automating email (with a $2.6M seed round) beyond auto-responders and smart replies, using OpenAI's GPT-3 language generation engine to turn bullet points into full, personalized messages. [...]
Emily Luong, MIT Technology Review (November 12, 2020): Tech giants dominate research but the line between real breakthrough and product showcase can be fuzzy. Some scientists have had enough.
Jonathan Chadwick, Mail Online (November 10, 2020): Jukebox, created by California-based company OpenAI, is a neural network that generates eerie approximations of pop songs in the style of multiple artists.
Will Knight, Wired (August 25, 2020): Alphabet's DeepMind pioneered reinforcement learning. A California company used it to create an algorithm that defeated an F-16 pilot in a simulation.
Editorial, The Guardian (August 11, 2020): "Worried? If you aren't then remember that Dominic Cummings wore an OpenAI T-shirt on his first day in Downing Street."
Lou Kerner, AlleyWatch (July 28, 2020).
Tom Simonite, Wired (July 22, 2020): The new program from OpenAI shows how far the field has come - and how far it has to go.
Alex Hern, The Guardian (June 12, 2020): Elon Musk-backed OpenAI to release text tool it called dangerous.
Federica Carugati, Wired (June 12, 2020): To ensure our AI-driven future is just and equitable, we should borrow from Ancient Athens.
Devin Coldewey, TechCrunch (June 11, 2020): The company has released an API that lets developers call its AI tools in on "virtually any English language task" - if you've got a task that requires understanding words in English, OpenAI wants to help automate it. [...]
Adrian Potoroaca, TechSpot (May 20, 2020): At its virtual Build 2020 developer conference, Microsoft revealed several interesting sneak peeks into its latest efforts to unify Windows 10 app development, offer more useful tools for power users, and integrate more Linux goodies via the Windows Subsystem for Linux.
