Our CEO and Co-founder Lukas Biewald recently sat down with Jonathan Siddharth, CEO and Co-founder of Turing, for a wide-ranging discussion on scaling human intelligence as the path to AGI.
Here are some key takeaways from their conversation. You can watch the full interview on YouTube here.
Scaling human intelligence, rather than compute, is the bottleneck for AGI
The conversation between Lukas and Jonathan started with Jonathan’s observation that AGI used to be blocked on both compute and data, but compute is no longer the constraint.
Compute is in a good place thanks to the work done by NVIDIA, Apple, Google, Microsoft, and a host of exciting startups focused on custom chips and hardware. Data, on the other hand, hit a wall a couple of years ago because all the foundation models were essentially pretrained on the same web data.
Jonathan called these “basic tokens” and distinguished them from “intelligent tokens” such as human-generated coding examples or mathematical reasoning. Turing has made a business of delivering these intelligent tokens to all the leading foundation models.
“When these models get better at coding, they also get better at a wide variety of tasks such as symbolic reasoning, logical reasoning, math, so clearly these coding tokens are really important” – Jonathan Siddharth
AI-powered developer cloud
Lukas asked Jonathan whether this insight about intelligent tokens being the bottleneck to AGI was what led him to found Turing. Jonathan shared that Turing’s original business model was based on labor arbitrage: providing Silicon Valley-level developer talent at significantly lower cost. To achieve this, they had to build their AI-powered developer cloud.
Turing applied AI to the three core processes of a tech services company: sourcing developers, vetting developers, and matching developers to specific projects. Turing now has 3.7 million developers who have been sourced and vetted. They built this using supervised learning to vet developers along three dimensions: role (front end, back end, data science, DevOps), tech stack (React, JavaScript), and seniority (individual contributor, tech lead, manager).
“We saw it looks like an information retrieval problem from the web search days. In web search you’re matching query-document pairs. Here the document is the developer. So we thought of the problem as vetting a developer, building what we call a deep developer profile, a detailed, comprehensive, continuously updating vector representation of a developer, and learning the weights to match the right developers to the right projects from this really large pool” – Jonathan Siddharth
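The matching approach Jonathan describes can be pictured as scoring a developer vector against a project-requirements vector with learned per-dimension weights. The sketch below is purely illustrative, with toy one-hot features and uniform weights standing in for Turing’s far richer, continuously updated profiles; none of the names or fields are theirs.

```python
# Illustrative sketch only: a toy "deep developer profile" encoded as a
# feature vector and scored against a project's requirements. A real system
# would use learned weights and much richer, continuously updated features.

ROLES = ["frontend", "backend", "data_science", "devops"]
STACKS = ["react", "javascript", "python", "go"]
SENIORITY = ["ic", "tech_lead", "manager"]
DIMS = ROLES + STACKS + SENIORITY

def profile_vector(roles, stacks, seniority):
    """One-hot encode a developer (or a project's requirements)."""
    active = set(roles) | set(stacks) | {seniority}
    return [1.0 if d in active else 0.0 for d in DIMS]

def match_score(dev_vec, project_vec, weights):
    """Weighted dot product, echoing query-document scoring in web search."""
    return sum(w * d * p for w, d, p in zip(weights, dev_vec, project_vec))

dev = profile_vector(["backend"], ["python", "go"], "tech_lead")
project = profile_vector(["backend"], ["go"], "tech_lead")
weights = [1.0] * len(DIMS)  # in practice, learned from past matches
print(match_score(dev, project, weights))  # 3.0 with uniform weights
```

Here the “learning the weights” step Jonathan mentions would replace the uniform `weights` with values fit on historical developer-project outcomes.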
Building AI applications for the Fortune 500
Jonathan shared that their AI-powered developer cloud allowed them to seize the opportunity of working with the foundation model builders in two ways. The first line of business is providing proprietary human data, such as coding examples and mathematical reasoning, to train the LLMs. They are also expanding into business domains beyond STEM, such as marketing, finance, retail, CPG, and healthcare.

The second line of business, which builds on the first, is building LLM-powered AI applications for the Fortune 500. Jonathan shared several example use cases: code completion for a software company, an underwriting and claims copilot for an insurance company, an audit and compliance copilot for a healthcare company, and an intelligent document processing pipeline for a wealth management company. He noted two commonalities across these use cases. First, each application needed to handle a large volume of unstructured documents. Second, they currently call for domain knowledge and human-in-the-loop copilot approaches rather than fully autonomous agents.
“So one was a coding copilot for a large software company where they needed a RAG system built on top of their own code base. We partnered with Google Gemini on this so they got custom code completions specific to their code base. And this was the company that saw the 33% lift in developer productivity. I actually think that 33% lift was an underestimate because it was still primarily in a code completion context and not a code generation context. I think with code generation the impact would be even more” – Jonathan Siddharth
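The RAG pattern Jonathan describes, grounding completions in a company’s own code base, can be sketched at a toy scale. This is a minimal illustration only: it uses naive keyword overlap in place of a real embedding index, and stubbed code chunks in place of an actual repository; the production system he describes used Google Gemini as the model.

```python
# Minimal RAG-for-code sketch (illustrative; a real system would use
# embeddings, a vector index, and an LLM to generate the completion).

def retrieve(query, code_chunks, k=2):
    """Rank code chunks by naive token overlap with the query."""
    q_tokens = set(query.lower().split())
    return sorted(
        code_chunks,
        key=lambda c: len(q_tokens & set(c.lower().split())),
        reverse=True,
    )[:k]

def build_prompt(query, retrieved):
    """Assemble a completion prompt grounded in the team's own code base."""
    context = "\n---\n".join(retrieved)
    return f"Relevant internal code:\n{context}\n\nComplete:\n{query}"

codebase = [
    "def parse_invoice(path): ...  # internal invoice parser",
    "def send_alert(msg): ...  # pager integration",
    "def parse_claim(doc): ...  # claims intake helper",
]
prompt = build_prompt("def parse_invoice", retrieve("parse invoice", codebase))
print(prompt)
```

The same retrieve-then-prompt shape underlies the other copilots mentioned above; only the corpus (claims documents, audit records, client files) and the downstream task change.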
To watch the full session on YouTube, click here.
In this episode of Gradient Dissent, Jonathan Siddharth, CEO & Co-Founder of Turing, joins host Lukas Biewald to discuss the path to AGI.
They explore how Turing built a “developer cloud” of 3.7 million engineers to power AGI training, providing high-quality code and reasoning data to leading AI labs. Jonathan shares insights on Turing’s journey, from building coding datasets to solving enterprise AI challenges and enabling human-in-the-loop solutions. This episode offers a unique perspective on the intersection of human intelligence and AGI, with an eye on the expansion of new domains beyond coding.