The Problem with Smaller Transistors – Quantum Computing pt. 1

Hey Guys! In today’s blog post I will be discussing a new contender in the field of computing, the famous Quantum Computer. This is part one of a two-part series: this post explains the limits of traditional computing and the transistor, while the next post explains the concepts behind a Quantum Computer. This is a really interesting topic, so sit back and enjoy this wonderful journey into the Quantum world!

Prelude

The Personal Computer revolution started in 1971 with Intel’s creation of the first commercial 4-bit microprocessor, the Intel 4004. Until then, computers were enormous and took up vast amounts of space, often the size of multiple rooms. For example, one of the first computers ever used, the Electronic Numerical Integrator and Computer (ENIAC for short), was built on Vacuum Tube technology. This technology worked, but it had many drawbacks: mainly, the tubes failed a lot! Technicians had to replace them every so often, and they were also bulky, characteristics that are far from ideal for our idea of a PC nowadays.

Over the years, microprocessors became increasingly advanced. The key shift away from Vacuum Tubes was the transistor, which acts like a basic switch controlling the flow of current in an electrical circuit. Transistors have many advantages over Vacuum Tubes: mainly, they are incredibly small and, up to a point, resistant to failure. So over the years, more and more transistors were packed into a single processor die, and processors continued to grow in computational power. This trend was so consistent that Intel co-founder Gordon Moore formulated Moore’s Law, the observation that the number of transistors on a CPU die doubles roughly every two years. This held true until recently, because we are reaching the physical limit of how small we can make a single silicon transistor! At the time of writing, desktops use the 14 nm fabrication node and mobile devices use the 10 nm node.
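To get a feel for what “doubling every two years” really means, here is a small Python sketch. The 4004’s starting count of roughly 2,300 transistors is real; the clean doubling model is of course an idealization, not how any fab actually plans its roadmap:

```python
# Moore's Law as a simple model: transistor count doubles every 2 years.
# Starting point: the Intel 4004 (1971) had roughly 2,300 transistors.

def projected_transistors(year, base_year=1971, base_count=2300):
    """Idealized Moore's Law projection of transistors per chip."""
    doublings = (year - base_year) / 2
    return base_count * 2 ** doublings

for year in (1971, 1981, 1991, 2001, 2011, 2021):
    print(f"{year}: ~{projected_transistors(year):,.0f} transistors")
```

The projection lands in the tens of billions by the 2020s, which is the right order of magnitude for today’s largest chips, showing just how well the observation held for five decades.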

The Problem with Smaller Transistors

(Image: a transistor array)

As transistors get smaller, we slowly enter the realm of the Quantum world, where conventional physics no longer applies. Once the insulating barriers inside a transistor become only a few nanometres thick, the electrons travelling through the circuit start to exhibit a weird quantum property called Quantum Tunneling: they simply pass straight through the silicon barrier, disrupting the circuit and rendering it useless. This is where traditional scaling comes to a full stop; we can’t go much further than this.
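To see why shrinking makes tunneling so much worse, we can use the standard textbook approximation for an electron tunneling through a rectangular barrier, T ≈ e^(−2κd) with κ = √(2mΔE)/ħ. The sketch below is my own illustration of the trend; the 1 eV barrier height is an assumed round number, not a parameter of any real transistor:

```python
import math

# Rough rectangular-barrier tunneling estimate: T ~ exp(-2 * kappa * d),
# where kappa = sqrt(2 * m * dE) / hbar. The 1 eV barrier height is an
# assumption chosen to illustrate the trend, not a real device parameter.
M_ELECTRON = 9.109e-31   # electron mass, kg
HBAR = 1.055e-34         # reduced Planck constant, J*s
EV = 1.602e-19           # 1 electronvolt in joules

def tunneling_probability(barrier_nm, barrier_ev=1.0):
    """Approximate probability that an electron tunnels through the barrier."""
    kappa = math.sqrt(2 * M_ELECTRON * barrier_ev * EV) / HBAR
    return math.exp(-2 * kappa * barrier_nm * 1e-9)

for d in (5.0, 2.0, 1.0, 0.5):
    print(f"{d} nm barrier: T ~ {tunneling_probability(d):.1e}")
```

Notice that halving the barrier width does not merely double the leakage; it multiplies it by many orders of magnitude. That exponential sensitivity is why tunneling turns from a curiosity into a showstopper at the smallest fabrication nodes.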

Any other solutions?

Currently, some new technologies have emerged, such as the Carbon Nanotube fabrication process demonstrated by IBM. Experts say this kind of architecture could take us to sizes closer to 1 nm.

Conclusion

All things said, we are nearing the limit of how small we can fabricate transistors, and with it our computing power could come to a standstill. We have evolved from the days of simple 4-bit architectures with a few large transistors to 64-bit multi-core processors consisting of billions of transistors, each smaller than an HIV virus! Moore’s Law finally seems to be coming to an end, and the most promising technology to succeed the traditional computer may well be the Quantum Computer.

In the next part of this blog we shall discuss the benefits of using a Quantum Computer, as well as the fundamental principles behind it.

Until then, Have a great day!

THANK YOU 🙂

Manas Hejmadi

I am a 9th-grade student from Bangalore! I have a good knowledge of computer programming, AI and UI Design, and I aspire to create a tech startup of my own!
