Machine Learning as a Metaphor

Last week, I attended the QCon international software development conference in San Francisco, where I had the honor to chair a track on Applied Machine Learning and Data Science. The quality of the presentations was impressive, not just in my track but in other areas that span the practice of software engineering.

Machine learning was a major theme of the conference, finding its way into many of the talks and hallway conversations. Jeff Hammerbacher, who along with DJ Patil coined the term "data scientist", delivered a keynote on "The Evolution of Machine Learning from Science to Software", in which he focused on the practical engineering challenges of shipping production systems that use machine learning.

But what impressed me most was a talk that wasn't about machine learning as such. Rather, it was Facebook engineer Keith Adams's keynote on "The Unreasonable Effectiveness of Tuning".

The title follows a tradition of riffing on Nobel laureate Eugene Wigner's 1960 paper "The Unreasonable Effectiveness of Mathematics in the Natural Sciences" (e.g., Alon Halevy, Peter Norvig, and Fernando Pereira's essay on "The Unreasonable Effectiveness of Data").

Slow and Steady Wins the Race

Keith's take-away was simple: systems that are easy to tune in response to real workloads ultimately outperform and replace incumbents.

His most accessible example was how the Linux operating system evolved past Solaris thanks to a more agile release cycle that let it improve rapidly in response to user feedback. He also drew on his own work on virtual machines and compilers at Facebook to show how slow and steady wins the race. In contrast to Clayton Christensen's theory of disruptive innovation, Keith focused on the power of evolutionary innovation.

His broader point was that we should look at machine learning as a metaphor. The first release of any software system involves a cold start problem, since there's no past data to draw on. That's why it's important to optimize for the speed of learning by collecting training data as early as possible. Release early, release often. Think of the decisions that define a system as a parameter space, and of the release process as a series of experiments to search for the optimal point in that space. Beware of overfitting to unrepresentative training data, especially from your initial experiments.
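To make the metaphor concrete, here is a minimal sketch, purely illustrative and not from Keith's talk: it treats a couple of invented configuration parameters as the search space, and each "release" as a noisy experiment against a sampled workload. The parameter names, the simulated workload, and the hill-climbing strategy are all assumptions made for the example; smaller samples give noisier feedback, which is exactly where overfitting creeps in.

```python
# Purely illustrative: treat a system's configuration as a parameter space
# and each release as one noisy experiment against real workloads.
# The parameters, workload model, and search strategy here are invented
# for the sketch; they are not from Keith Adams's talk.
import random

def observed_performance(cache_size_mb, worker_threads, sample_size=100):
    """Simulate measuring a release against a sampled workload.

    The 'true' optimum is hidden at (512 MB, 16 threads); each measurement
    is noisy, and small sample sizes make it easy to overfit to noise.
    """
    true_score = -((cache_size_mb - 512) ** 2) / 1e4 - ((worker_threads - 16) ** 2) / 4
    noise = random.gauss(0, 50.0 / sample_size)  # less data -> noisier signal
    return true_score + noise

def release_cycle(n_releases=20):
    """Hill-climb through parameter space, one release (experiment) at a time."""
    params = {"cache_size_mb": 128, "worker_threads": 4}  # cold start: a guess
    best_score = observed_performance(**params)
    for release in range(1, n_releases + 1):
        # Propose a small change, ship it, and measure real feedback.
        candidate = {
            "cache_size_mb": max(64, params["cache_size_mb"] + random.choice([-64, 64])),
            "worker_threads": max(1, params["worker_threads"] + random.choice([-2, 2])),
        }
        score = observed_performance(**candidate)
        if score > best_score:  # keep the change only if the feedback improves
            params, best_score = candidate, score
        print(f"release {release:2d}: {params} score={best_score:.1f}")
    return params

if __name__ == "__main__":
    release_cycle()
```

Running it shows the configuration drifting toward the hidden optimum over successive releases, while the early, small-sample measurements are precisely the ones most likely to mislead the search.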

People as Human Computers

It was a great talk, but I couldn't help wondering if Keith took the metaphor too far by conflating machine learning with human learning. So I asked him. He replied without skipping a beat: the same insights about learning apply, whether we're talking about silicon-based machines or human computers.

I love his answer, even if it glosses over some of the known bugs we face as human computers, such as our susceptibility to cognitive biases. We humans and computers are all in this together. What holds for all of us is that we should optimize for the effectiveness and efficiency of learning by collecting training data early and often.

Machine learning is a powerful computational tool, but it may have even greater value as a metaphor for engineering in general. Thanks to Keith and everyone at QCon for the insights. Looking forward to using them to tune my own work!

Great post Daniel, thanks for all the links. My two cents: unlike software systems, where there is often no past data to draw on, the human system has almost too much biased experiential data that it subconsciously draws on. This means several hypotheses may never get framed by the human operator, even with the most sophisticated visualization tool. HR leaders and many middle managers fall into this trap - they just don't ask enough non-obvious questions in their organizations.

Pramod Kumar Srivastava

Learning fast, unlearning faster, and moving ahead, albeit slowly but surely.

Machine Learning As A Metaphor. #GreatRead #GoLean Demian Brener Santiago Bilinkis

A complementary dimension to this is thinking of development as a cooperative game, which comes from Alistair Cockburn. Think cooperating cognitive agents --- the real challenge is communication. If you haven't seen it, take a look at alistair.cockburn.us --- in particular, "walking skeleton" (cold start), and "flow of decisions". I would imagine the metaphors play off one another well.

paolo cerrito

Are you talking about "sensemaking" or something similar to it?
