By Mohamed Salem

History of AI: Let's Get Physical!

Did you know that computers don't actually understand ones and zeros, only different levels of voltage? Or that today's advancements in AI are built on advances in electronics? Stick around and find out!


In an earlier piece, we claimed that the fundamental ideas driving artificial intelligence (AI) today have not really changed since the 1960s. That sounds hard to believe, especially given the boom in AI we're witnessing today. AI's prevalence is growing exponentially, and it's not only experts who are talking about it: people from all walks of life have been wondering how the advent of the learning machine will affect their livelihoods and, more so, how it can be leveraged to improve their various lines of work.


So, if everything we needed has been around since the 1960s, why is the boom happening only now? The answer lies in the distinction between the theoretical and the achievable.


As you may know, the basic building blocks of AI rest on mathematical ideas developed by Isaac Newton and Gottfried Leibniz in the 17th century. Some of you may remember differentiation from high school calculus; that really is the fundamental pillar that today's AI relies on.


But math is concepts and logic. AI is real. AI is self-driving cars. It is virtual assistants that respond to your voice. It is facial recognition software. It is your favorite movie recommendation service. It is a Roomba. If all we ever truly needed was math, why didn't we get AI earlier?


To answer that question, we’ll have to examine the history of electronics in general.


As movies are fond of reminding us, computers only understand zeroes and ones. That's not really an inherent property of computers, though; it has to do with the way we started designing them. Our modern way of life depends entirely on our ability to produce and manipulate power (i.e., electricity), and the simplest way to manipulate electricity is to turn it on and off. You can probably guess what these ons and offs correspond to: they're the zeroes and ones in our machines.

Over the past century, our ability to produce more complex machinery has depended almost exclusively on a single invention: the transistor. A transistor is a wondrous switch with no moving parts: when a certain voltage is applied to it, the switch is on; otherwise it's off. Combining many of these switches lets our machines represent anything from letters and numbers to colors as unique sequences of zeroes and ones.
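To make that concrete, here's a minimal Python sketch (my own illustration, not from the article) of how fixed patterns of ons and offs can stand for a letter or a color:

```python
# Each "switch" is one bit: 1 for on, 0 for off.

# A letter: the ASCII standard assigns 'A' the number 65, which a
# machine stores as the eight-switch pattern 01000001.
letter = "A"
print(f"{letter} -> {ord(letter):08b}")      # A -> 01000001

# A color: 24-bit RGB packs three 8-bit brightness levels together.
red, green, blue = 255, 165, 0               # orange
color = (red << 16) | (green << 8) | blue    # pack into one number
print(f"orange -> {color:024b}")             # 24 ons and offs
```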


In fact, I'm going to let you in on a little secret: computers don't actually perceive zeroes and ones, they perceive different voltage levels, classically 0 V for "off" and 5 V for "on". When we write a program, our code represents these voltage levels as 0 and 1 when, in fact, all our machines understand is that there are two levels of voltage. So, in a sense, everything is physical!
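Here's a toy Python sketch of that idea (my own illustration; the thresholds are only illustrative, though they match the guard bands of classic 5 V TTL logic, which treats anything at or below 0.8 V as low and anything at or above 2.0 V as high):

```python
# Toy model: classify a measured voltage as a logical 0, a logical 1,
# or "undefined" when it falls in the forbidden region in between.
def read_bit(voltage):
    if voltage <= 0.8:
        return 0       # low voltage: the switch is off
    if voltage >= 2.0:
        return 1       # high voltage: the switch is on
    return None        # in between: the hardware can't decide

samples = [0.1, 4.9, 0.3, 5.0, 1.4]
print([read_bit(v) for v in samples])  # [0, 1, 0, 1, None]
```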


All our machine learning algorithms and all our AI methods are just different flows of electricity moving around, turning switches on and off. The more switches we can place on a small chunk of silicon, the more instructions we can give our machines and the faster those instructions are executed.
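To see how mere switches add up to computation, here's a small sketch in the spirit of Petzold's Code (my own example, not the article's): we model a NAND gate, which real hardware builds from a handful of transistors, and compose copies of it into a circuit that adds two bits.

```python
# A NAND gate is a universal building block: every other gate
# (and ultimately every program) can be wired up from NANDs alone.
def NAND(a, b):
    return 0 if (a and b) else 1

def NOT(a):
    return NAND(a, a)

def AND(a, b):
    return NOT(NAND(a, b))

def XOR(a, b):
    c = NAND(a, b)
    return NAND(NAND(a, c), NAND(b, c))

# A half adder: adds two bits, producing a sum bit and a carry bit.
def half_adder(a, b):
    return XOR(a, b), AND(a, b)

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> carry {c}, sum {s}")
# 1 + 1 -> carry 1, sum 0, i.e. binary 10, which is two.
```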


Here’s where we finally answer the question of why AI is happening now. Our ability to process large amounts of data and carry out complex algorithms (or many, many simple algorithms) in a short period of time depends critically on the number of transistors we can fit in a single machine.



[Figure: transistor counts per chip over time, the trend commonly associated with Moore's Law]

The graph above depicts the evolution in the number of transistors we've managed to fit on a single computer chip. In 1971, the Intel 4004 microprocessor had around 2,300 transistors on board. Today (actually, one year ago), the Nvidia A100 chip boasts around 54 billion transistors, each less than 50 nanometers across (for comparison, your typical household ant is about 5 million nanometers long). More transistors in a tighter space means signals have shorter distances to travel, and more computations can happen simultaneously.
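As a quick sanity check on those numbers (my own back-of-the-envelope arithmetic, using the figures quoted above):

```python
import math

# Intel 4004 (1971) vs. Nvidia A100 (2020), transistor counts as
# quoted in the text above.
t0, n0 = 1971, 2_300
t1, n1 = 2020, 54_000_000_000

doublings = math.log2(n1 / n0)            # times the count doubled
print(f"{doublings:.1f} doublings in {t1 - t0} years")
print(f"one doubling every {(t1 - t0) / doublings:.1f} years")
# ~24.5 doublings in 49 years: one doubling roughly every two years,
# which is the classic statement of Moore's Law.
```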


This also brings about some hot challenges. All these transistors working together consume a respectable amount of power and tend to heat things up pretty quickly, so proper cooling becomes essential.
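For the curious, a standard first-order approximation from circuit design (a textbook formula, not something derived in this article) shows why all that switching turns into heat:

```latex
% Dynamic (switching) power dissipated by a chip, to first order:
%   \alpha : activity factor (fraction of transistors switching per cycle)
%   C      : total switched capacitance
%   V_{dd} : supply voltage
%   f      : clock frequency
\[
  P_{\text{dyn}} \approx \alpha \, C \, V_{dd}^{2} \, f
\]
```

Because the supply voltage enters squared, designers have pushed it down from the classic 5 V to roughly 1 V in modern chips, which is a big part of why billions of transistors can switch billions of times per second without melting the silicon.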


Where I'm going with this is that the advent of AI is more a hardware revolution than a revolution of thought, driven by advances in microchip manufacturing and cooling technologies that have increased our computational capacity a trillion-fold since the 1960s!


So, in a sense, we already live in a world where machines are creating more sophisticated machines; not in the grand sense of machines making themselves smarter (the Singularity), but in the sense that, as humanity has done since the dawn of time, we're using tools to make better tools.


Sadly, the future holds limitations: there's only so much room left for making things smaller. But packing more computational capacity onto a chip is only one way of making more sophisticated AI. Although the future may not be very promising when it comes to upholding Moore's Law, it offers many avenues for hope. Some of these are technical, like quantum computing; others are more philosophical and involve reshaping our AI methods into something that more closely mimics the way the human mind works. One thing is certain: this is an exciting time for AI.




 

References:


Amodei, D., & Hernandez, D. (2018). AI and Compute. OpenAI.



Petzold, C. (1999). Code: The Hidden Language of Computer Hardware and Software. Redmond, WA: Microsoft Press.



Are you more of a visual person? Check out these videos for more info!


The Remarkable Art & Science of Modern Graphics Card Design: https://www.youtube.com/watch?v=ZEzI02SKALY

Uploaded by Nvidia


From Sand to Silicon: The Making of a Microchip | Intel: https://www.youtube.com/watch?v=_VMYPLXnd7E

Uploaded by Intel


Intel: The Making of a Chip with 22nm/3D Transistors | Intel: https://www.youtube.com/watch?v=d9SWNLZvA8g

Uploaded by Intel


