Computers have come a long way since the invention of Colossus, the machine British codebreakers built during World War II to crack the Lorenz cipher (a different machine, the Bombe, tackled Enigma). Back then, computers filled entire rooms and required a team of engineers to run. Now, less than a century later, they’re compact enough to fit inside a pocket. Let’s look at what the future may have in store for computing (ScienceNews).

Artificial intelligence is a phrase that gets thrown around a lot in science fiction movies, but what does it actually mean? Technically, artificial intelligence is already here. It’s what filters search results to our liking and recommends videos, songs, and TV shows on streaming services, all through a process of building algorithms automatically from data known as “machine learning.” We even have artificially intelligent cars that can drive us to our destinations. But these forms of artificial intelligence are made to fulfill specific purposes. What about AI that can think like a human? This is a concept known as artificial general intelligence, or AGI. Alan Turing famously devised a test to determine whether a computer could be considered to be thinking. If a human judge held two conversations, one with a computer and one with another human, and could not reliably tell which was which, the computer passed the Turing test. Until we can achieve that, AI will largely rely on machine learning to stay as advanced as it is (TowardsDataScience).
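To make the recommendation idea concrete, here is a toy sketch of how a streaming service might pick a show “to our liking.” This is only an illustration of the similarity-based approach, not any real service’s system; the show titles, feature scores, and threshold-free cosine-similarity method are all invented for the example.

```python
import math

# Each show is described by made-up feature scores,
# e.g. [comedy, drama, sci-fi]. We recommend the show whose
# feature vector is most similar to what the user already likes.
shows = {
    "Space Saga":  [0.1, 0.2, 0.9],
    "Courtroom":   [0.2, 0.9, 0.1],
    "Laugh Track": [0.9, 0.1, 0.0],
}

def cosine(a, b):
    # Cosine similarity: 1.0 means the vectors point the same way.
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def recommend(liked, catalog):
    # Pick the catalog entry most similar to the user's taste vector.
    return max(catalog, key=lambda title: cosine(catalog[title], liked))

user_taste = [0.0, 0.3, 1.0]  # this user leans heavily toward sci-fi
print(recommend(user_taste, shows))  # -> Space Saga
```

Real systems learn those feature vectors automatically from millions of viewing histories instead of hand-writing them, which is where the “learning” in machine learning comes in.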
Transistors are among the main building blocks of computers, as well as of other electronic devices. They serve two basic functions. The first is amplifying currents: one end of a transistor may take in a small electric current while the other end outputs a much larger one. The other function is acting as a sort of switch, where one small current determines whether a larger current flows, much like the valve on a faucet (PhysLink). Transistors have become smaller and smaller over time, and a single computer chip can now hold hundreds of millions or even billions of them. But as transistors continue to shrink, we are expected to hit a limit on how small they can physically get, meaning that improvements will have to be found elsewhere. The current transistor design, known as “FinFET,” will likely be replaced by a new “gate-all-around” design, which is expected to use less energy, take up even less space, and switch faster than current transistors (ScienceNews).
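The faucet-valve analogy for the switching role can be sketched in a few lines of code. This is a deliberately simplified model, not a real device simulation: the threshold voltage and current values below are illustrative round numbers, not taken from any datasheet.

```python
# Toy model of a transistor's switching function: a small control
# signal at the "gate" decides whether a much larger current flows
# through, like a valve on a faucet.
THRESHOLD_V = 0.7  # illustrative; roughly the order for silicon

def transistor_switch(gate_voltage, supply_current):
    """Return the output current: the full supply current if the
    gate voltage is above the threshold ("on"), otherwise none."""
    return supply_current if gate_voltage >= THRESHOLD_V else 0.0

# A tiny gate voltage steers a much larger current on and off:
print(transistor_switch(0.9, 5.0))  # gate on  -> 5.0
print(transistor_switch(0.1, 5.0))  # gate off -> 0.0
```

Because the output of one such switch can drive the gate of another, billions of them chained together are enough to build the logic gates, and ultimately the processors, that computers run on.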

Another major development we may see in the coming years is the use of quantum computing. Quantum computers are built upon the science of the supersmall, and in the future they could completely transcend the capabilities of current computers. For now, though, quantum computers are impractical; an IBM quantum computer in New York is currently capable of only simple calculations. What separates quantum computers from current computers is the use of “qubits.” Whereas a normal computer stores information in bits (ones and zeroes), a quantum computer uses qubits, which can represent one, zero, or both at the same time, a state known as a quantum superposition. This allows for parallel computations that speed up the process of working out solutions. Once quantum computers become more powerful, they will revolutionize the way computing is done (ScienceNews).
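The idea of a qubit being “one, zero, or both” can be illustrated with a tiny simulation. This sketch only models a single qubit on an ordinary computer, which misses the whole point of quantum hardware, but it shows what superposition means mathematically: the state is a pair of amplitudes, and measuring collapses it to 0 or 1 with probabilities given by the squared amplitudes.

```python
import random

# A qubit's state is a pair of amplitudes (a, b) for the values
# 0 and 1. The chance of measuring 0 is |a|^2, and of measuring
# 1 is |b|^2. The equal superposition uses a = b = 1/sqrt(2).
a, b = 2 ** -0.5, 2 ** -0.5

def measure(a, b):
    """Collapse the superposition: 0 with probability |a|^2,
    otherwise 1."""
    return 0 if random.random() < abs(a) ** 2 else 1

# Measuring many identically prepared qubits yields 0 about half
# the time and 1 about half the time.
samples = [measure(a, b) for _ in range(10_000)]
print(sum(samples) / len(samples))  # close to 0.5
```

A real quantum computer exploits many qubits in superposition at once, plus interference between their amplitudes, which is what classical simulations like this one cannot do efficiently.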
With the rapid pace at which computers have developed in recent years, what was once thought to be science fiction is now a reality, and further development will keep pushing the boundaries of what computing can do. So be on the lookout for the amazing changes that may happen in the future!
Written by: Matthew Jenkins
Date: March 14, 2022
Sources:
https://www.sciencenews.org/century/computer-ai-algorithm-moore-law-ethics#chasing-intelligence
https://www.physlink.com/education/askexperts/ae430.cfm
https://towardsdatascience.com/what-is-artificial-general-intelligence-4b2a4ab31180
https://www.sciencenews.org/article/quantum-computers-are-about-get-real