Data Science Is Now Bigger Than 'Big Data'

In a world in which “big data” and “data science” seem to adorn every technology-related news article and social media post, have the terms finally reached public interest saturation? As the use of large amounts of data has become mainstream, is the role of “data science” replacing the hype of “big data”?

Looking back over the past decade and a half, English-language web searches reported by Google Trends for both “social media” and “cloud computing” begin in the latter half of the last decade, with cloud computing rising in late 2007 and social media taking flight in early 2009. (To compare the two terms on the same scale, they are reported as standard deviations from the mean, known as Z-scores.)
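As a rough sketch of that normalization, the snippet below uses pandas to convert each term’s monthly search-interest series to Z-scores so that terms with very different absolute volumes can share one axis. The values and column names here are hypothetical placeholders, not the actual Google Trends figures.

```python
import pandas as pd

# Hypothetical monthly Google Trends values (0-100 scale) for two terms.
trends = pd.DataFrame({
    "social media":    [2, 5, 12, 30, 55, 70, 85, 90],
    "cloud computing": [1, 8, 40, 75, 100, 80, 60, 45],
})

# Z-score each column: subtract its mean and divide by its standard deviation,
# so every series is expressed in standard deviations from its own mean.
z_scores = (trends - trends.mean()) / trends.std()

print(z_scores.round(2))
```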

(Chart: Google Trends search interest for “social media” and “cloud computing,” Z-scores; Kalev Leetaru)

Yet, while the phrase “social media” has increased linearly in the decade since, “cloud computing” has followed a very different trajectory, peaking in March 2011, decreasing through the end of 2016 and leveling off in the three years since.

It seems the idea of renting computing power in the “cloud” has become so mainstream we no longer even talk about it, even as social media, despite its ubiquity, still captures our search attention. The most popular search phrase over the last 12 months has been “social media marketing,” reflecting the outsized power of the digital behemoths in controlling the flow of attention coveted by businesses. Interestingly, “what is social media” and “about social media” are the next two most popular searches over the past year, suggesting that despite its apparent ubiquity, social media remains a new concept for much of the world’s population.

The now-ubiquitous term “big data” begins its meteoric rise in lockstep with cloud computing’s fall, suggesting that the public’s focus on hardware rental was rapidly replaced with how all of that computing power was being used: to analyze massive datasets.

In contrast, “data science” and “deep learning” both take off in 2013 and accelerate over 2014. Interestingly, despite deep learning’s Cambrian Explosion over the past few years, search interest appears to have leveled off as of last January, perhaps suggesting that we are now searching more for the individual applications of deep learning rather than the phrase itself.

(Chart: Google Trends search interest for “big data,” “data science” and “deep learning,” Z-scores; Kalev Leetaru)

Most significantly, as of January of this year, “data science” has surpassed “big data” in total search volume. Just as cloud computing’s hardware focus gave way to big data’s emphasis on what we do with all that hardware, so too has the focus shifted now from assembling huge piles of data to the people and processes making sense of all of that data.

While it may be entirely coincidental, it is interesting to note that data science and deep learning burst into popularity in the immediate aftermath of Edward Snowden’s June 2013 disclosures, raising questions of whether vastly increased public awareness of data mining led to increased interest in those fields.

Finally, when all of these terms are combined on the same timeline and “artificial intelligence” is added to the mix, several key trends emerge.

(Chart: combined Google Trends timeline for all terms, Z-scores; Kalev Leetaru)
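One way such a combined timeline might be assembled is sketched below, assuming the unofficial pytrends library (its interface may change, and the timeframe string and the particular five-term selection are illustrative choices, not the article’s exact query). The terms are converted to Z-scores as above so they share one scale.

```python
import matplotlib.pyplot as plt
from pytrends.request import TrendReq  # unofficial Google Trends client (assumption)

# Google Trends compares at most five terms per request, so "social media"
# is left out of this particular payload.
TERMS = ["cloud computing", "big data", "data science",
         "deep learning", "artificial intelligence"]

pytrends = TrendReq(hl="en-US")
pytrends.build_payload(TERMS, timeframe="2004-01-01 2019-08-01")
trends = pytrends.interest_over_time().drop(columns="isPartial")

# Put every term on the same scale as Z-scores, then plot them together.
z_scores = (trends - trends.mean()) / trends.std()
z_scores.plot(figsize=(10, 5), title="Relative search interest (Z-scores)")
plt.ylabel("Standard deviations from the mean")
plt.show()
```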

The most obvious is that, at its peak, search interest in cloud computing surpassed that of every other term over the past decade and a half.

The second is that search interest in the phrase “artificial intelligence” plunged from the data’s start in January 2004 through mid-2008 and began climbing again in 2014 as the current AI renaissance got underway. Searches for AI began to accelerate in earnest in 2017, just as searches for “deep learning” leveled off.

This is worrisome in that it suggests that, in the public mind, these neural advances are increasingly detaching from their mathematical underpinnings in “deep learning” and drifting back toward the science-fiction catch-all of AI. As this transition strengthens, it raises concerns that the public sees these creations not as mere statistical equations codified in software but once again as silicon incarnations of a new form of artificial life. That raises the danger of another AI winter as the public’s soaring imagination collides with the primitive reality of current advances.

Putting this all together, it is instructive to see how the public has internalized the data revolution of the past decade and a half, from renting hardware to sifting through data to the people and processes that drive our data-built understanding. It seems we have yet to capture the public’s imagination the way cloud computing did, or perhaps the vocabulary today has simply become too fragmented. Most worryingly, the public increasingly seems to see our modern neural revolution as “AI” rather than “deep learning.” In the end, understanding how the public is internalizing the technological revolutions affecting them is critical to managing expectations and preventing another technological winter.