
The Evolution of Computer Science in One Infographic


We take computing power for granted today.

That’s because computers are literally everywhere around us. And thanks to advances in technology and manufacturing, the cost of producing semiconductors is so low that we’ve even started turning things like toys and streetlights into computers.

But how and where did this familiar new era start?

The History of Computer Science

Today’s infographic comes to us from Computer Science Zone, and it describes the journey of how we got to today’s tech-oriented consumer society.

It may surprise you to learn that the humble and abstract groundwork of what we now call computer science goes all the way back to the beginning of the 18th century.

[Infographic: The Evolution of Computer Science]

Incredibly, the history of computing traces all the way back to a famous mathematician named Gottfried Wilhelm Leibniz.

Leibniz, a polymath living in the Holy Roman Empire in an area that is now modern-day Germany, was quite the talent. He independently developed the field of differential and integral calculus, developed his own mechanical calculators, and was a primary advocate of Rationalism.

It is arguable, however, that the modern impact of his work mostly stems from his formalization of the binary numerical system in 1703. He even envisioned a machine of the future that could use such a system of logic.
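To illustrate the idea, here is a minimal sketch in Python of the base-2 positional logic Leibniz formalized; the function and its name are illustrative additions, not something from the original infographic.

```python
def to_binary(n: int) -> str:
    """Convert a non-negative integer to binary using repeated
    division by 2, the positional base-2 idea Leibniz formalized."""
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))  # remainder gives the next least significant bit
        n //= 2
    return "".join(reversed(bits))

# Every number reduces to the two symbols Leibniz reasoned about: 0 and 1.
print(to_binary(1703))  # -> 11010100111
```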

From Vacuum Tubes to Moore’s Law

The first computers, such as the IBM 650, used vacuum tube circuit modules for logic circuitry. In use until the early 1960s, these machines consumed vast amounts of electricity, failed often, and required constant inspection for defective tubes. They were also the size of entire rooms.

Luckily, transistors were invented and later integrated into circuits, and 1958 saw the production of the very first functioning integrated circuit by Jack Kilby of Texas Instruments. Shortly after, Gordon Moore (then at Fairchild Semiconductor, and later a co-founder of Intel) predicted that the number of transistors per integrated circuit would double every year, a prediction now known as “Moore’s Law”. Moore later revised the doubling period to every two years.

Moore’s Law, which describes exponential growth, held for roughly 50 years before it began to brush up against its physical limits.
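To see why that growth cannot go on indefinitely, consider a rough back-of-the-envelope sketch. It assumes the widely cited Intel 4004 baseline of roughly 2,300 transistors in 1971 and Moore’s revised two-year doubling period; the outputs are illustrative projections, not measured chip counts.

```python
# A rough illustration of Moore's Law as compound doubling.
# Assumed baseline: Intel's 4004 (1971), roughly 2,300 transistors.
BASELINE_YEAR = 1971
BASELINE_TRANSISTORS = 2_300
DOUBLING_PERIOD_YEARS = 2  # Moore's revised (1975) cadence

def projected_transistors(year: int) -> float:
    """Project transistors per chip, assuming one doubling
    every DOUBLING_PERIOD_YEARS since the baseline year."""
    doublings = (year - BASELINE_YEAR) / DOUBLING_PERIOD_YEARS
    return BASELINE_TRANSISTORS * 2 ** doublings

for year in (1971, 1991, 2011, 2021):
    print(f"{year}: ~{projected_transistors(year):,.0f} transistors per chip")

# Fifty years of doubling every two years is 2**25, a factor of about
# 33 million: exactly the kind of curve that must eventually collide
# with physical limits such as atom-scale transistors and heat.
```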

It can’t continue forever. The nature of exponentials is that you push them out and eventually disaster happens.

– Gordon Moore in 2005

It’s now been argued by everyone from The Economist to the CEO of Nvidia that Moore’s Law is over for all practical purposes – but that doesn’t mean it’s the end of the road for computer science. In fact, it’s just the opposite.

The Next Computing Era

Computers no longer take up rooms – even very powerful ones now fit in the palm of your hand.

They are cheap enough to put in refrigerators, irrigation systems, thermostats, smoke detectors, cars, streetlights, and clothing. They can even be embedded in your skin.

The coming computing era will be dominated by artificial intelligence, the IoT, robotics, and unprecedented connectivity. And even if things are advancing at a sub-exponential rate, it will still be an incredible next step in the evolution of computer science.


Visualizing AI Patents by Country

See which countries have been granted the most AI patents each year, from 2010 to 2022.


[Infographic: Visualizing AI Patents by Country]

This was originally posted on our Voronoi app.

This infographic shows the number of AI-related patents granted each year from 2010 to 2022 (latest data available). These figures come from the Center for Security and Emerging Technology (CSET), accessed via Stanford University’s 2024 AI Index Report.

From this data, we can see that China first overtook the U.S. in 2013. Since then, the country has seen enormous growth in the number of AI patents granted each year.

Year   China    EU and UK   U.S.     RoW      Global Total
2010      307         137      984      571        1,999
2011      516         129      980      581        2,206
2012      926         112      950      660        2,648
2013    1,035          91      970      627        2,723
2014    1,278          97    1,078      667        3,120
2015    1,721         110    1,135      539        3,505
2016    1,621         128    1,298      714        3,761
2017    2,428         144    1,489    1,075        5,136
2018    4,741         155    1,674    1,574        8,144
2019    9,530         322    3,211    2,720       15,783
2020   13,071         406    5,441    4,455       23,373
2021   21,907         623    8,219    7,519       38,268
2022   35,315       1,173   12,077   13,699       62,264

In 2022, China was granted more patents than every other country combined.
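That claim checks out against the table above. The short sketch below hard-codes the 2022 row from the CSET data purely for illustration.

```python
# 2022 AI patents granted, taken from the CSET table above.
patents_2022 = {
    "China": 35_315,
    "EU and UK": 1_173,
    "U.S.": 12_077,
    "RoW": 13_699,
}

rest_combined = sum(v for k, v in patents_2022.items() if k != "China")
print(f"China: {patents_2022['China']:,}")        # China: 35,315
print(f"All others combined: {rest_combined:,}")  # All others combined: 26,949
print(patents_2022["China"] > rest_combined)      # True
```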

While this suggests that the country is very active in researching the field of artificial intelligence, it doesn’t necessarily mean that China is the most advanced in terms of capability.

Key Facts About AI Patents

According to CSET, AI patents relate to mathematical relationships and algorithms, which are considered abstract ideas under patent law. They can also be defined differently depending on where they are filed.

In the U.S., AI patenting is concentrated amongst large companies including IBM, Microsoft, and Google. On the other hand, AI patenting in China is more distributed across government organizations, universities, and tech firms (e.g. Tencent).

In terms of focus area, China’s patents are typically related to computer vision, a field of AI that enables computers and systems to interpret visual data and inputs. Meanwhile, America’s efforts are more evenly distributed across research fields.

Learn More About AI From Visual Capitalist

If you want to see more data visualizations on artificial intelligence, check out this graphic that shows which job departments will be impacted by AI the most.
