We’re proud to announce a $40m Series B led by a16z crypto, with Ali Yahya joining the board alongside current board member Marc Andreessen, bringing our total funding to ~$60m. The funding allows Golden to build a decentralized, incentivized system for bringing data into a new protocol using mechanics from Web3.
Participants include crypto luminaries, operators, founders and long-term supporters: Protocol Labs, Raj Gokal (founder of Solana), Dan Romero, OpenSea Ventures, Juan Benet (founder of Protocol Labs), Arash Ferdowsi (co-founder and former CTO of Dropbox), Dylan Field (CEO of Figma), Bastian Lehmann (founder of Postmates and a new crypto company), Maryanna Saenko (of Future Ventures), Michael Seibel (former CEO of Y Combinator and co-founder of Twitch), Avlok Kohli (CEO of AngelList), a16z Cultural Leadership Fund, Harpoon Ventures, Amir Moazami, DCVC, Matt Bellamy (lead singer of Muse and partner at Helium-3 Ventures), Sridhar Ramaswamy (CEO of the new search engine Neeva), MVP Ventures, Khaled Naim (CEO of Onfleet), HNVR, Vela Partners, Socii Capital, George Lambeth, FalconX, Stefano Bernardi, Wei Guo, Xoogler, TRAC, Scale Asia Ventures, Chaac Ventures, Neal Dempsey and others.
Knowledge is fragmented; to find reliable information, one needs to search across centralized repositories, personal webpages, news sites, blogs, and private databases. The world lacks a standardized interface for discovering, contributing and verifying knowledge. Creating this interface in a scalable way requires not just data; it requires the development of incentives for data entry, verification, and governance. Web3 technologies and mechanics are well suited to solve the core problems of incentivizing efficient task execution and organizing resources for protocol operations.
Early protocol concepts have generated significant interest, and we are now in the testnet phase with 35,000 individuals participating in our Discord community, a live decentralized app (the dApp), APIs, and early governance in action. Participants are currently submitting facts, verifying information, and improving and building the protocol itself.
Knowledge is for everyone; the protocol that captures it should be developed by a broad community.
A protocol for knowledge compilation, generation and storage
The Golden protocol financially rewards participants for building a decentralized, permissionless graph of canonical knowledge.
In a nutshell, the protocol is a game-theoretic system designed to incentivize agents (individual humans, AIs, or people harnessing AI) to converge on truth. It provides financial rewards for submitting correct data and disincentives for being wrong. Organizations using and paying for the data provide a direct feedback loop for converging on correctness, as well as dynamic market pricing. We use the blockchain to construct a ledger of the transactions and, eventually, of the data itself. Tokenization allows incentives to be awarded to good actors who add correct data to the system and to verifiers who check its accuracy.
We use slashing (the removal of staked crypto collateral) to penalize incorrect data submission or verification. Data submission currently occurs on Golden.com and through the protocol APIs; verification can occur on the dApp or through the protocol APIs.
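The stake-and-slash mechanic described above can be sketched as a simple settlement rule. This is a minimal illustration, not the protocol's implementation: the reward and slash rates, names, and data structures here are all hypothetical.

```python
# Minimal sketch of a stake-and-slash incentive loop. All names and
# numeric parameters are hypothetical, not the protocol's actual values.
from dataclasses import dataclass

@dataclass
class Submission:
    submitter: str
    stake: float   # crypto collateral locked at submission time
    correct: bool  # outcome once verifiers reach consensus

REWARD_RATE = 0.10  # hypothetical: 10% reward on stake for correct data
SLASH_RATE = 1.00   # hypothetical: the full stake is slashed when wrong

def settle(balances: dict, sub: Submission) -> None:
    """Return the stake plus a reward if correct; slash the stake if not."""
    if sub.correct:
        balances[sub.submitter] += sub.stake * (1 + REWARD_RATE)
    else:
        balances[sub.submitter] += sub.stake * (1 - SLASH_RATE)

balances = {"alice": 0.0, "bob": 0.0}
settle(balances, Submission("alice", 100.0, correct=True))   # stake returned plus reward
settle(balances, Submission("bob", 100.0, correct=False))    # stake fully slashed
```

The key property is asymmetry: being right earns a modest reward, while being wrong costs the entire stake, which makes careless or adversarial submissions unprofitable in expectation.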
Using public data NFTs and fractional ownership to build long-term data value and channel revenue sharing
We are constructing a canonical public data NFT for each ‘real world’ entity (e.g. ‘Ali Yahya’) in the protocol. This allows collections of public canonical data to be constructed and also disincentivizes duplicates (a common problem in knowledge graph construction). These canonical data NFTs carry revenue-sharing rights for usage of the data that has been compiled and verified: the data constructors receive the majority of revenue when commercial users pay to use the data. This fractional revenue sharing incentivizes data producers to add the most commercially useful data to the graph first. NOTE: these NFTs are not general NFTs around brands of entities; they only wrap public data that has been compiled and verified.
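The fractional revenue-sharing idea amounts to a pro-rata split of usage revenue among an NFT's fractional owners. A minimal sketch, assuming a hypothetical 70% contributor share and made-up ownership fractions (neither is a stated protocol parameter):

```python
# Illustrative pro-rata revenue split for a data NFT's fractional owners.
# CONTRIBUTOR_SHARE and the ownership fractions are hypothetical.

CONTRIBUTOR_SHARE = 0.70  # assumed: "majority of revenue" to data constructors

def distribute(revenue: float, fractions: dict) -> dict:
    """Split the contributor portion of revenue pro rata by fractional ownership."""
    pool = revenue * CONTRIBUTOR_SHARE
    total = sum(fractions.values())
    return {owner: pool * frac / total for owner, frac in fractions.items()}

payouts = distribute(1000.0, {"alice": 0.5, "bob": 0.3, "carol": 0.2})
# alice ≈ 350.0, bob ≈ 210.0, carol ≈ 140.0
```

Because payouts scale with usage revenue, contributors are rewarded most for compiling the data that commercial users actually want, which is the incentive the paragraph above describes.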
Reputation scores, protocol services: disambiguation and link prediction
The protocol will also become more efficient over time as the reputation scores of reliable submitters and verifiers grow. We can use these scores to reduce the effort required to verify information. We are also building services to aid data ingestion, ranging from disambiguation to link prediction that helps decide whether to include a given piece of data.
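One simple way reputation can reduce verification effort is to require fewer independent checks for high-reputation submitters. The sketch below assumes a hypothetical linear schedule and an exponential-moving-average reputation update; the actual protocol's functions and thresholds are not specified in this post.

```python
# Sketch: fewer independent verifications required as reputation grows.
# BASE_VERIFICATIONS, MIN_VERIFICATIONS, and the update rule are hypothetical.
import math

BASE_VERIFICATIONS = 5  # checks required for an unknown (reputation 0) submitter
MIN_VERIFICATIONS = 1   # even trusted submitters get at least one check

def verifications_required(reputation: float) -> int:
    """reputation in [0, 1]; higher reputation means fewer independent checks."""
    return max(MIN_VERIFICATIONS, math.ceil(BASE_VERIFICATIONS * (1 - reputation)))

def update_reputation(reputation: float, correct: bool, lr: float = 0.1) -> float:
    """Exponential moving average pulled toward 1 when correct, 0 when not."""
    target = 1.0 if correct else 0.0
    return (1 - lr) * reputation + lr * target
```

For example, a brand-new submitter would need 5 verifications per fact, while a submitter with a long track record of correct data would need only 1, cutting total verification work substantially.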
Bounties, data credits, burn-and-mint model for revenue from organizations
Canonical data is crucial for enterprise use cases ranging from enrichment to model building. To create a stable and growing token economy, companies will be able to rent data from the protocol and pay for usage through a burn-and-mint model: commercial users buy protocol tokens, then burn them to create fiat-stable data credits that are spent to access data. This lets organizations purchase data at predictable prices even as token valuations fluctuate. We are committed to building healthy mechanisms for injecting real fiat into the protocol at scale.
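The burn-and-mint conversion can be sketched in a few lines. The credit peg and prices below are hypothetical, but they show why data pricing stays stable: the number of credits minted depends on the fiat value of the tokens burned, not on the token count.

```python
# Sketch of a burn-and-mint data-credit flow: tokens are bought at a
# floating market price, burned, and replaced with credits pegged to a
# fixed fiat value. The peg and prices below are hypothetical.

CREDIT_PRICE_USD = 1.00  # assumed peg: 1 data credit buys $1 of data usage

def burn_for_credits(tokens: float, token_price_usd: float) -> float:
    """Burn `tokens` at the current market price; mint fiat-stable credits."""
    return tokens * token_price_usd / CREDIT_PRICE_USD

# The fiat cost of a fixed amount of data stays predictable as the token
# price moves: $500 of data always requires $500 worth of tokens.
credits_when_cheap = burn_for_credits(1000.0, token_price_usd=0.50)  # 500.0 credits
credits_when_dear = burn_for_credits(250.0, token_price_usd=2.00)    # 500.0 credits
```

A side effect of this design is that rising data demand burns tokens faster, coupling the token supply to real usage of the protocol.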
This is not simply a ‘web3 Wikipedia’. We have a chance to build something much greater: an enhanced schema around the data, richer predicates, far greater scale in entity count, verified data, transparent market-based governance, a system that can evolve over time, an immutable ledger of our collective knowledge, and permissionless access to it. Having accurate data in a deeply linked knowledge graph allows for the creation of new applications and insights that are not currently possible. We are eager to see how this resource will be applied, from classic industries to AI research.
The Golden protocol is currently live on the Goerli testnet. We aim to launch on mainnet in Q2 2023 once we have ironed out all core issues. We are currently working on, for example: anti-gaming measures, queue efficiency, more robust disambiguation services, and other friction-reducing improvements.
Help us build this thing!
There are many ways you can get involved and directly help build the protocol; we encourage anyone interested to join us on Discord or our forum. By helping us build, you may be eligible to earn testnet points, which will convert to tokens at mainnet launch. Read more about incentivized activities here.
If you want to add data:
If you want to check and verify data:
Building the protocol
- Creating bulk submission tools
- Legal and IP ownership using NFTs
- Reward function development
- Roadmap and open problems to solve
Community and ideas
- Meta Schema
- Protocol mechanisms, Governance