
Machine Learning: The Evolution From An Artificial Intelligence Subset To Its Own Domain

Machine Learning (ML) has reached an inflection point – at least in terms of messaging. For my first article as part of the TIRIAS Research contributor group, I'll use this space to establish a working description of ML for what follows.

I studied artificial intelligence (AI) back in the 1980s. Then, machine learning was clearly a subset of AI, focused on how to train machines to learn about their environments and synthesize information based on that gained knowledge. While research on artificial neural networks (ANNs), today's hot topic, was underway, the main focus was on expert systems. Those two ML techniques still exist, but massive advances in machine power have flipped the balance. Faster chips, networks and software that manage parallel computing and clustering, and a maturity in model development mean that ANNs are now the main reference point when most people discuss ML.

However, AI is no longer the entire ML discussion. I've spent my career in the arena of business software. The same advances in computing that have driven ANNs have also driven advances in business intelligence (BI). Analytics platforms can now discover data, run far more sophisticated mathematical analysis than was possible in previous decades, and provide visualizations that are both informative and stunning.

It took me a while to wrap my head around it, given my earlier AI biases, but I've concluded that machine learning is now its own discipline, intersecting with both AI and BI in a heavily overlapping Venn diagram.

(Venn diagram of machine learning intersecting AI and BI. Credit: David Teich)

ML blends techniques from those two arenas into a new discipline. The difference comes down to the fuzzy definition of the word "learning".

The pure AI sense of the word meant that the machines learned on their own and could even update their own algorithms based on that learning. Yet it’s just as important to learn about unseen relationships hiding within the large datasets involved in big data and the Internet of Things (IoT). The fact that detailed, step-by-step statistical analysis can also uncover information doesn’t make that any less of a learning exercise than the much more fluid heuristics used in AI models.
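
To make that concrete, here is a minimal sketch in Python (the data is synthetic, invented purely for illustration): an ordinary least-squares fit, about as step-by-step statistical as analysis gets, still recovers a relationship hidden in the data.

    import numpy as np

    # Hypothetical IoT-style readings, invented for this sketch:
    # energy use rises with temperature, plus measurement noise.
    rng = np.random.default_rng(seed=0)
    temperature = rng.uniform(10, 35, size=200)
    energy_use = 2.5 * temperature + 40 + rng.normal(0, 5, size=200)

    # A plain least-squares fit: detailed, step-by-step statistics,
    # yet it still "learns" the hidden relationship (slope ~2.5, intercept ~40).
    slope, intercept = np.polyfit(temperature, energy_use, deg=1)
    print(f"learned: energy ~= {slope:.2f} * temperature + {intercept:.2f}")

The point is not the math itself; it's that the relationship came out of the data rather than being written into the code, and that is the sense of "learning" that matters here.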

While that might seem strange to AI purists, they should look back at the discipline's history. AI endured decades of the Dreyfus brothers, two professors at UC Berkeley, telling us all the things AI would never do because it hadn't yet done them. Over time, those targets kept moving. I like to define AI, only partially facetiously, as "those things we still don't know about making computers think and act like humans". In the early days, robotics and vision were part of AI. When people learned enough about them and their complexities, they grew into their own clear disciplines. Natural Language Processing (NLP) went the same way. Today, we're watching machine learning, AI's child, grow up. It is stepping out on its own, with two parents this time: AI and BI. Expect to see it continue to grow.

In this column, and in my future work at TIRIAS Research, you'll see coverage at all levels of this new arena of machine learning. While the machine matters, it only matters insofar as it can be applied to business; that means I'll talk both about what's happening to enhance the machines and about how those machines are being applied to solve real business problems.

-- The author and members of the TIRIAS Research staff do not hold equity positions in any of the companies mentioned. TIRIAS Research tracks and consults for companies throughout the electronics ecosystem from semiconductors to systems and sensors to the cloud.
