Machine learning has been around for a while. What has made it less attractive — until now — has been the absence of large and robust datasets. Probably one of the most famous quotes defending the power of data is that of Google’s Research Director Peter Norvig claiming that “we don’t have better algorithms. We just have more data.”
In essence, the recent explosion of machine learning has been largely due to the availability of big data. But why?
Machine learning involves finding patterns in big data and learning from them. The core idea is that once past patterns in the data are identified, it becomes possible to predict future patterns, which often contain valuable insight. Machine learning can be applied to smaller datasets too; however, the insights will be less accurate, because the opportunity to learn is intimately tied to the size of the dataset.
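The "learn from past patterns, predict future ones" idea can be sketched in a few lines. Below is a minimal example: fitting a straight line to historical data by ordinary least squares and extrapolating one step ahead. The monthly sales figures are made up purely for illustration.

```python
def fit_line(xs, ys):
    """Return slope and intercept of the least-squares line through (xs, ys)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Past pattern: hypothetical monthly sales that grow roughly linearly.
months = [1, 2, 3, 4, 5, 6]
sales  = [100, 119, 142, 158, 183, 201]

slope, intercept = fit_line(months, sales)

# Predict month 7 from the learned pattern.
prediction = slope * 7 + intercept
```

This is the simplest possible "model", but the shape is the same as in real machine learning: parameters are estimated from past data, then reused on data the system has not seen. With only six data points the estimate is shaky, which is exactly the point the paragraph above makes about small datasets.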
So why exactly is big data critical, apart from the fact that it can enable machine learning tools to learn better and extract more valuable insights?
As Martin Hack, writing for Wired, explains:
Only with advanced analytics, and specifically machine learning, can companies truly tap into their rich vein of experience and mine it to automatically discover insights and generate predictive models to take advantage of all the data they are capturing. This advanced analytics technology means that instead of looking into the past for generating reports, businesses can predict what will happen in the future based on analysis of their existing data. The value of machine learning is rooted in its ability to create accurate models to guide future actions and to discover patterns that we’ve never seen before.
But what is the difference between big data and machine learning exactly?
Very simply, big data is about how one handles data, whereas machine learning is about how one uses it. Put another way, big data is simply lots of data: terabytes and even petabytes of it. When you have data at the scale of what LinkedIn or Amazon generates, you have big data. But it’s not just about size, although size is the most important criterion for distinguishing big data from regular datasets. Typically, three “Vs” define big data: volume, velocity, and variety.
Machine learning and big data, although strictly speaking very different things, ultimately work hand-in-hand. Andreessen and Horowitz summarize the relationship elegantly when they say, “machine learning is to big data as human learning is to life experience: We interpolate and extrapolate from past experiences to deal with unfamiliar situations. Machine learning with big data will duplicate this behavior, at massive scales.”
Even more broadly, big data has contributed to the growth of artificial intelligence technologies, in the sense that machine learning is how machines learn from big data to solve problems. Families of algorithms, such as supervised and semi-supervised learning, are used to solve specific business problems.
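To make "supervised learning" concrete, here is a minimal sketch of one of its simplest forms: a 1-nearest-neighbour classifier. The labelled examples play the role of past experience, and a new, unlabelled point is classified by finding the closest known example. The customer data (annual spend, visits per month, segment label) is entirely made up for illustration.

```python
import math

def nearest_neighbor(train, point):
    """Return the label of the training example closest to `point`."""
    def distance(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    features, label = min(train, key=lambda ex: distance(ex[0], point))
    return label

# Labelled "past experience": [annual_spend, visits_per_month] -> segment.
training_data = [
    ([100, 1], "occasional"),
    ([120, 2], "occasional"),
    ([900, 8], "loyal"),
    ([950, 10], "loyal"),
]

# Classify a new, unlabelled customer by their nearest labelled neighbour.
segment = nearest_neighbor(training_data, [880, 9])  # -> "loyal"
```

Real systems use far more data, features, and more sophisticated models, but the supervised pattern is the same: learn from labelled examples, then predict labels for new ones.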
So, what’s next for machine learning as it relates to enterprises and fast-growing startups? Andreessen and Horowitz predict that:
Where business intelligence before was about past aggregates (“How many red shoes have we sold in Kentucky?”), it will now demand predictive insights (“How many red shoes will we sell in Kentucky?”). An important implication of this is that machine learning will not be an activity in and of itself … it will be a property of every application. There won’t be a standalone function, “Hey, let’s use that tool to predict.”
Take Salesforce for example. Right now it just presents data, and the human user has to draw predictive insights in his or her head. Yet most of us have been trained by Google, which uses information from millions of variables based on our and others’ usage to tailor our user experience … why shouldn’t we expect the same here? Enterprise applications — in every use case imaginable — should and will become inherently more intelligent as the machine implicitly learns patterns in the data and derives insights. It will be like having an intelligent, experienced human assistant in everything we do.
With machine learning becoming a property of every application, we inch closer to a world where AI further enables us to make smarter decisions faster.
The next step after machine learning? Deep learning, a process that focuses even more narrowly on a segment of machine learning tools and techniques and applies them to solving just about any problem that requires human or artificial “thought”.
Similar to machine learning, deep learning requires feeding a computer system a lot of data, which it can use to make decisions about other data. Unlike machine learning, however, deep learning work focuses on deep neural networks – logic networks of the complexity needed to classify datasets as overwhelmingly extensive as, for example, Google’s image library. With data this comprehensive and networks sophisticated enough to handle this level of classification, it becomes relatively easy for a computer to ‘look’ at an image and state with a high degree of precision what it represents to humans.
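The "deep" in deep learning refers to data flowing through stacked layers of weighted sums followed by a nonlinearity. Below is a minimal forward-pass sketch of that layered structure: a toy 4-"pixel" image passed through a hidden layer and an output neuron. The weights here are arbitrary hand-picked numbers; a real network learns millions of weights from data rather than hard-coding them.

```python
import math

def sigmoid(x):
    """Squash any real number into the range (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def layer(inputs, weights, biases):
    """One fully connected layer: a weighted sum per neuron, then sigmoid."""
    return [sigmoid(sum(w * i for w, i in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

# A toy 4-pixel "image" (brightness values between 0 and 1).
image = [0.9, 0.1, 0.8, 0.2]

# Hidden layer: 3 neurons, each connected to all 4 pixels (arbitrary weights).
hidden = layer(image,
               weights=[[0.5, -0.2, 0.3, 0.1],
                        [-0.4, 0.6, 0.2, -0.1],
                        [0.1, 0.1, -0.5, 0.7]],
               biases=[0.0, 0.1, -0.2])

# Output layer: one neuron producing a score between 0 and 1,
# interpretable as "how confident the network is in its label".
score = layer(hidden, weights=[[0.8, -0.6, 0.4]], biases=[0.0])[0]
```

Stacking many such layers, with millions of neurons and weights trained on a library of labelled images, is what lets a network map raw pixels to human-meaningful labels.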
The applications of deep learning? They are already in the market and include predicting the outcome of legal proceedings and enabling the navigation process behind self-driving cars. Already, using sensors and onboard analytics, cars can recognize obstacles and respond to them correctly using deep learning processes.
And at the current speed of these developments, our work, personal, and social lives are without a doubt going to be deeply impacted by deep learning and machine learning technologies.