The future of tech is here, and it is very exciting.
On the heels of Microsoft’s sweeping set of announcements and updates, I thought it would be a good time to look at what the future of tech has in store, and to discuss one trend the entire industry is witnessing: the shift from digital computing to analog.
What is the difference?
Normally, when we hear the terms ‘analog’ and ‘digital’, the first thing that comes to mind is a clock.
In that comparison, we tend to favor the digital clock. After all, it appears to be the more technically advanced of the two. But when it comes to things like sound or, in this case, computing, analog is in some ways the richer medium: an analog signal is continuous and uninterrupted, rather than broken into the discrete steps that digital relies on.
Once we move away from the traditional image of the clock, perhaps the easiest way to understand how these two concepts differ is to compare them side by side:
| | Analog | Digital |
|---|---|---|
| Signal | An analog signal is a continuous signal that represents physical measurements. | Digital signals are discrete-time signals generated by digital modulation. |
| Waves | Denoted by sine waves | Denoted by square waves |
| Representation | Uses a continuous range of values to represent information | Uses discrete (discontinuous) values to represent information |
| Example | Human voice in air, analog electronic devices | Computers, CDs, DVDs, and other digital electronic devices |
| Technology | Records waveforms as they are | Samples analog waveforms into a limited set of numbers and records them |
| Data transmission | Subject to deterioration by noise during transmission and write/read cycles | Can be noise-immune, without deterioration during transmission and write/read cycles |
| Response to noise | More likely to be affected by noise, reducing accuracy | Less affected by noise |
| Flexibility | Analog hardware is not flexible | Digital hardware is flexible in implementation |
| Uses | Can be used in analog devices only; best suited for audio and video transmission | Best suited for computing and digital electronics |
| Bandwidth | Processing can be done in real time and consumes less bandwidth | Real-time processing is not guaranteed, and carrying the same information consumes more bandwidth |
| Memory | Stored in the form of a wave signal | Stored in the form of binary bits |
| Power | Analog instruments draw large power | Digital instruments draw only negligible power |
| Cost | Low cost and portable | Higher cost and less easily portable |
| Impedance | Low | High, on the order of 100 megaohms |
| Errors | Analog instruments usually have a scale that is cramped at the lower end, giving considerable observational errors | Digital instruments are free from observational errors such as parallax and approximation errors |
Table credit: Diffen
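The “Technology” row above — digital systems sample analog waveforms into a limited set of numbers — can be illustrated with a minimal Python sketch. The function name, parameters, and the choice of a sine wave are illustrative, not from any particular library:

```python
import math

def sample_and_quantize(signal, duration_s, sample_rate_hz, bits):
    """Sample a continuous-time signal and quantize each sample
    to a limited set of levels, as a digital system does."""
    levels = 2 ** bits
    step = 2.0 / (levels - 1)           # signal assumed to lie in [-1, 1]
    samples = []
    n = int(duration_s * sample_rate_hz)
    for i in range(n):
        t = i / sample_rate_hz
        x = signal(t)                   # continuous (analog) value
        q = round((x + 1.0) / step)     # nearest quantization level
        samples.append(q * step - 1.0)  # back to the [-1, 1] range
    return samples

# A 5 Hz sine wave, sampled at 100 Hz with 3-bit resolution:
wave = lambda t: math.sin(2 * math.pi * 5 * t)
digital = sample_and_quantize(wave, duration_s=0.2, sample_rate_hz=100, bits=3)

# Every stored value is one of only 2**3 = 8 discrete levels,
# whereas the original wave takes infinitely many values.
print(sorted(set(round(v, 3) for v in digital)))
```

The continuous wave is reduced to at most eight distinct stored values — the information lost in that step is exactly what the “Representation” row of the table is describing.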
What does this have to do with tech?
In the world of tech, the safest method of computing has long been digital, because errors in analog computing carry real repercussions: noise corrupts an analog signal irreversibly and disrupts communication, whereas digital computing allows for virtually error-free communication. But that comes with limitations, such as the need for both the sending and receiving devices to speak the same language.
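This “virtually error-free” property of digital communication comes from thresholding: as long as noise stays below the decision threshold, the original bits are recovered exactly. A small sketch, with an assumed 0/1-volt encoding and made-up noise level:

```python
import random

def transmit(bits, noise_amp):
    """Send bits as voltage levels (0 -> 0.0, 1 -> 1.0) over a
    noisy channel, then recover them with a simple threshold."""
    received = [b + random.uniform(-noise_amp, noise_amp) for b in bits]
    recovered = [1 if v > 0.5 else 0 for v in received]
    return received, recovered

random.seed(42)
message = [1, 0, 1, 1, 0, 0, 1, 0]
analog_view, digital_view = transmit(message, noise_amp=0.4)

# The raw (analog) voltages are all corrupted by noise...
print(analog_view)
# ...but thresholding recovers the original bits exactly,
# as long as the noise stays under the 0.5 V threshold.
print(digital_view == message)
```

An analog receiver would have to treat the corrupted voltages as the signal itself; the digital receiver only has to decide which side of the threshold each one falls on.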
Analog, if working optimally, allows us to communicate seamlessly with devices in natural language. This opens new doors, like holographic imaging, shown recently by Microsoft at their event.
In a recent interview with Alex Kipman, Microsoft’s chief inventor, he was quoted as saying, “The next era of computing […] won’t be about that original digital universe. It’s about the analog universe.”
Wired senior writer Jessi Hempel goes on to explain, “You used to compute on a screen, entering commands on a keyboard. Cyberspace was somewhere else. Computers responded to programs that detailed explicit commands. In the very near future, you’ll compute in the physical world, using voice and gesture to summon data and layer it atop physical objects. Computer programs will be able to digest so much data that they’ll be able to handle far more complex and nuanced situations. Cyberspace will be all around you.”
Now, if that isn’t a concept to get very, very excited about, I don’t know what is. Welcome to the future!
By Corey Padveen