Will the dusk of Moore’s Law be the dawn of the singularity?

This is where technology began: with mechanical switches. The Analytical Engine (1834), sometimes called the Babbage engine, was the first mechanical computer, and for this great invention Charles Babbage is now known as the father of the computer.

The world’s first electronic computer, ENIAC, used 18,000 vacuum tubes, weighed almost 50 tons, and occupied about 1,800 square feet, consuming enormous amounts of power while delivering very little computing power.

ENIAC

Then came the transistor. A transistor is a small semiconductor device, invented in 1947 by three scientists, J. Bardeen, W. H. Brattain, and W. Shockley, that is used to amplify or switch electronic signals and electrical power. It is made of semiconductor materials such as germanium and silicon, usually with at least three terminals for connection to an external circuit.

Note: All images in this article are for representation purposes only.

A transistor computer is a computer that uses discrete transistors instead of vacuum tubes. The transistor is the primary building block of all microchips, including your CPU, and is what creates the binary 0s and 1s (bits) your computer uses to communicate and to perform Boolean logic.

Binary Digits
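As a side note, the link between transistors, bits, and Boolean logic can be sketched in a few lines of Python (a toy illustration, not from the article): the NAND gate, which real CPUs build from a handful of transistors, is a “universal” gate, so NOT, AND, and OR can all be derived from it.

```python
# Toy illustration: all Boolean logic can be built from one transistor-level
# gate. NAND is universal, so we derive NOT, AND, and OR from it alone.

def nand(a: int, b: int) -> int:
    """A single NAND gate acting on bits (0 or 1)."""
    return 0 if (a and b) else 1

def not_(a: int) -> int:
    # NOT(a) is just NAND of a bit with itself.
    return nand(a, a)

def and_(a: int, b: int) -> int:
    # AND is NAND followed by NOT.
    return not_(nand(a, b))

def or_(a: int, b: int) -> int:
    # OR follows from De Morgan's law: a OR b = NAND(NOT a, NOT b).
    return nand(not_(a), not_(b))
```

Every arithmetic and logic operation in a processor is ultimately composed of gates like these, replicated billions of times.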

In 1954, IBM introduced its first transistor-based computer, which contained 2,000 transistors. Transistors replaced vacuum tubes, and today they are found in all electronic devices. As of 2016, the most powerful computer processors contained over 7 billion transistors.

Circuit Board

Later, the integrated circuit replaced discrete transistors. An integrated circuit (IC), sometimes called a chip or microchip, is a semiconductor wafer on which thousands or millions of tiny resistors, capacitors, and transistors are fabricated. An IC can function as an amplifier, oscillator, timer, counter, computer memory, or microprocessor. It was invented by Jack Kilby and Robert Noyce.

Then came the microprocessor, which replaced collections of ICs. A microprocessor (logic chip) is a central processing unit on a single integrated circuit, containing millions of very small components, including transistors, resistors, and diodes, that work together.

Microprocessor

The Intel 4004, released by Intel Corporation in 1971, was the first commercially available microprocessor, with a 4-bit central processing unit (CPU).

What is Moore’s Law and why is it important?

In 1965, Intel co-founder Gordon Moore observed that

The number of transistors per square inch on integrated circuits had doubled every year since their invention.

This observation later became popularly known as Moore’s law, which predicts that the trend will continue into the foreseeable future. Although the pace has slowed, the number of transistors per square inch has since doubled approximately every 18 months. Most experts, including Moore himself, expect Moore’s law to hold true until 2020–2025.
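The “doubling every 18 months” claim is simple exponential arithmetic, and it can be sketched in a few lines of Python (an illustration with assumed figures, not from the article):

```python
# Rough illustration of Moore's-law growth: with a fixed doubling period,
# the transistor count grows as initial * 2**(elapsed_years / period).

def transistor_count(initial: int, years: float, doubling_years: float = 1.5) -> int:
    """Project a transistor count forward under a fixed doubling period."""
    return round(initial * 2 ** (years / doubling_years))

# Example: starting from roughly 2,300 transistors (about the Intel 4004's
# count in 1971), three years at one doubling per 18 months gives two
# doublings, i.e. about 9,200 transistors.
```

Projecting this formula across several decades quickly overshoots real chips, which is one simple way to see that the original pace could not be sustained indefinitely.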

Some experts now expect Moore’s law to hold for another two decades, while some studies have suggested that physical limitations could be reached between 2017 and 2020.

Progress in doubling the number of circuits has slowed, and integrated circuits cannot get much smaller as transistors approach the size of an atom.

The definition has changed and been challenged over the last few decades, and now it seems that Moore’s law might finally be coming to an end.

Some time in the future, software or hardware breakthroughs may keep the dream of Moore’s law alive. That may have just happened:

This New Discovery Breaks the limits of Moore’s Law

In new research reported this week in the journal Nature Nanotechnology, a team of researchers at the Massachusetts Institute of Technology (MIT) demonstrated unusual magnetic behavior that could greatly improve data storage methods.

Currently, data is read and written one bit at a time, a feat accomplished by altering the placement of magnetic particles.

Instead, this new method uses electric fields to manipulate “skyrmions”: virtual particles made of small disturbances in the orientation of this magnetism. These “particles” can store data for much longer than traditional methods.

Geoffrey Beach, an associate professor of materials science and engineering at MIT, led the original study which first documented the existence of skyrmions in 2016.

Skyrmions

In this new study, he has demonstrated for the first time that it is possible to create the virtual particles in specific locations (in the earlier work, the particles’ locations were entirely random). This most recent development is what will be key to creating improved data storage systems.

Current storage capacities that physically exist in magnetic systems adhere to Moore’s law, but they are reaching their limits.

If they were to be replaced by the new method, which uses skyrmions, the law could not only be outgrown — it could be eliminated entirely.

What’s preventing a shift to the new method from happening is that we still need a way to read the stored data. It’s possible to use x-ray spectroscopy to accomplish this, but the equipment required is expensive and impractical for use in computers.

However this improved method for data storage ends up being applied, one thing is for sure: it challenges Moore’s law to the extent that the law may become a thing of the past.

Intel’s new chip could take quantum computing to the next level

Recently, however, Intel announced its latest chip, which may keep Moore’s law alive for another decade.

Intel’s Qubit Processor

Intel has successfully fabricated a 17-qubit superconducting chip, which has been delivered to the company’s quantum research partner QuTech for further testing. The hardware could be a stepping stone toward a fully fledged quantum computer.

Quantum computing has the potential to be a truly revolutionary technology, providing an unprecedented amount of computational power. It paves the way for next-generation computers and could keep Moore’s law alive for another decade.
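The source of that power can be shown with simple arithmetic (an illustration, not a figure from Intel): an n-qubit register exists in a superposition described by 2^n complex amplitudes, so its state space doubles with every qubit added.

```python
# Illustration only: the state of an n-qubit register is described by 2**n
# complex amplitudes, which is why classical simulation of quantum hardware
# becomes intractable as qubit counts grow.

def state_space(n_qubits: int) -> int:
    """Number of basis states (complex amplitudes) for an n-qubit register."""
    return 2 ** n_qubits

# A 17-qubit chip, like the one Intel fabricated, spans 2**17 = 131,072
# basis states; each added qubit doubles that number.
```

This doubling-per-qubit is the quantum analogue of Moore’s-law growth: capacity scales exponentially in the number of devices rather than in transistor density.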

Nevermind Moore’s Law is not going to end. Recently Researchers have successfully created a transistor 50,000 times smaller than a strand of hair. Transistors just got a whole lot smaller.

That’s what a team from the Department of Energy’s Lawrence Berkeley National Laboratory managed to do, according to a study published in the journal Science. (Read more…)

The Future of computing technology

Looking further ahead, there are promising technologies, such as carbon nanotubes (being developed by IBM), graphene sheets, and black phosphorus, that may one day replace the transistors we use today.

Some transistors are already smaller than a virus, thanks to nanotechnology. These microscopic structures contain carbon and silicon molecules aligned precisely, helping electricity move along the circuit faster.

Cloud computing, wireless communication, the Internet of Things and quantum physics may all play a role in innovating computer technology.

Experts suggest that computers should reach the physical limits of Moore’s law sometime in the 2020s. When that happens, computer scientists can explore entirely new ways of building computers, which may lead to the development of the singularity (artificial intelligence).

Whether Moore’s law stays alive or dies, we are moving toward the development of artificial intelligence. But surely,

The end of Moore’s law will be the beginning or birth of Singularity (Artificial Intelligence).

Know about: What is Singularity?

Ray Kurzweil is considered a formidable figure in futuristic thinking; he is estimated to have an 86 percent accuracy rate for his predictions about the future. The future he envisions is one marked by decentralization of both the physical and the mental.

Recently, speaking to Futurism, Kurzweil predicted:

“2029 is the consistent date I have predicted for when an AI (Artificial Intelligence) will pass a valid Turing test and therefore achieve human levels of intelligence.”

Well, don’t worry about AI taking over the human race; we have established ethics for artificial intelligence.

And finally, the choice is in our hands: we already know technology can be used for good or for ill. So let’s hope for a better future, and figure out for ourselves what that future will be, or whether there will be one at all.

Read about: Ethics for Artificial Intelligence
