Searching the internet with large language models
We're building a new search engine for the internet using techniques inspired by large language models. Self-supervised learning is a paradigm that will produce radically new capabilities across many areas of tech. We're applying it to search, and in doing so, we hope to make sorting through the huge amount of information on the internet as easy as calling up a memory.
Will Bryk, CTO
I grew up in NYC, then studied CS and physics at Harvard, where I did some ML research and led the robotics club. I then spent two years as a software engineer at the ML startup Cresta, while also writing a book about the history of civilization in my spare time. While living in a hacker house with Alex, I realized that the future of civilization depends almost entirely on the quality of information we consume, so I left to start Metaphor with Alex.
Alex Gajewski, CEO
I did my undergrad in computer science at Columbia (and, in spirit, parts of it at the University of Chicago). My background is mostly in AI research, systems engineering, and a little pure math (TTIC, Uber AI Labs, Columbia; NeurIPS, GECCO). I also like to read literature and philosophy when I have time. Metaphor is our attempt to create a startup that takes advantage of AI capabilities as they come online.