Our latest event, AI’s Potential: From Policy to Posterity, featured an experienced panel of AI and machine learning specialists. Our panelists included:
• Douglas Bemis, CTO and Co-Founder at Uber AI Labs
• Gus Katsiapis, Senior Staff Software Engineer at Google
• Prateek Joshi, Founder at Pluto AI
• Christian Reilly, Co-Founder of MedNition
The event featured wide-ranging discussion of the current state of artificial intelligence infrastructure and where it needs to go for AI to truly realize its potential.
Our team of machine learning consultants at Digamma believes we are on the brink of a new AI era, one in which emerging AI technologies will quickly evolve, mature, and demand a new, sophisticated technological infrastructure. The AI we have now is emergent and not fully production-ready—it is primarily “custom craft”, not the “AI factory line” we need to truly scale AI technologies. In light of this, we asked our panelists for their perspective on the role of hardware and processing power in enabling the growth and evolution of AI technologies.
Our panelists provided fascinating insights and emphasized the role of data and education over hardware. Google’s Gus Katsiapis had this to say:
“It’s true that a lot of improvements have been made in hardware that have enabled the machine learning revolution. But there’s a lot more to machine learning than deep networks; beyond GPUs, there is even more custom hardware that is specific to ML itself.”
Gus continues by explaining the critical importance of machine learning education:
“Beyond just raw processing power, I think managing data and figuring out ML best practices is an area where knowledge has not been dispersed widely enough yet. So we have two options. We can either educate, which is great, but it usually doesn’t scale. Or we can try to integrate a lot of these best practices into the platforms that we build. So, if I can build a data ingestion system, debugging system, and model understanding system that leverages the knowledge of Ph.D.s or software engineers, and it’s packaged nicely, I can give that to a startup that can apply it. Then it turns into something that is really multiplicative. So this is an area I really think we should be focusing on. AI is new and is moving quickly, but I don’t know if we have the background or education to digest it.”
Prateek Joshi explained that computing power is only one of many factors machine learning engineers must weigh. As he explains below, much of machine learning in practice is concerned with pre-processing and preparing data:
“The major problem with [data cleaning in ML] is that there is no fixed method. There are so many variables involved. Let’s consider just image data: is it coming from a car? A drone flying over agricultural land? The way you have to pre-process the data is so different for each use case that a lot of data scientists spend 80% of their time considering how to convert what they have into something usable. The last 20% is building nice models. People don’t talk about it, but that’s a hidden part of machine learning.”
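The "no fixed method" problem Joshi describes can be made concrete with a small sketch. The example below is purely illustrative—the source names, crop ratio, and normalization choices are our assumptions, not anything the panelists prescribed—but it shows how pre-processing logic for the same kind of data (image pixels) can diverge completely depending on where the data comes from:

```python
# Hypothetical sketch: the same raw pixel grid needs different
# pre-processing depending on its source. All choices here
# (crop ratio, normalization scheme) are illustrative assumptions.

def preprocess_car_frame(pixels):
    """Dashcam frames: crop out the hood region, then scale to [0, 1]."""
    cropped = pixels[: int(len(pixels) * 0.8)]  # drop bottom 20% of rows
    return [[p / 255.0 for p in row] for row in cropped]

def preprocess_drone_tile(pixels):
    """Aerial tiles: keep the full frame, but center per-tile brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [[(p - mean) / 255.0 for p in row] for row in pixels]

# Dispatch table: each new data source needs its own pipeline,
# which is why so much engineering time goes into this step.
PREPROCESSORS = {
    "car": preprocess_car_frame,
    "drone": preprocess_drone_tile,
}

def preprocess(source, pixels):
    return PREPROCESSORS[source](pixels)
```

Even in this toy form, the two pipelines disagree on basic questions—what to crop, what "normalized" means—which is exactly the hidden, per-use-case work that consumes so much of a data scientist's time before any model is built.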
Finally, our panelists touched on the role of quantum computing and how this major innovation will play into the future evolution of AI. Gus Katsiapis was bearish on its short- to medium-term importance, saying:
“I don’t think quantum computing will be the thing that democratizes machine learning and AI. I think education and best practices are what will actually popularize it and make it more useful in the world. I’m sure at some point in time we will hit some barrier with backpropagation, and that’s when quantum computing will allow us to perform the next leap in the rate of acceleration of how quickly we will be making progress. But I don’t think this is what you should bet on.”
These are only a few of the key insights our panelists shared. Want to hear more? You can watch the full discussion on AI Infrastructure on YouTube.
With over a decade of AI experience, Digamma.ai’s team are your trusted machine learning consultants, partners, and engineers.