The Future of Computing

25 May 2016
ENIAC

Quick question: What weighed 30 tons, was 100 feet long, consumed 150 kW of power and cost $500,000?

Answer: The first general-purpose electronic digital computer.

This was ENIAC. Completed in 1946, it remained in use for about ten years. At its peak it was capable of about 400 FLOPS (floating point operations per second). It used 18,000 vacuum tubes and had five million hand-soldered joints. It was not user-friendly, had no memory to store results, and required a crew of 30 just to operate and maintain it.


To us today, it seems inconceivable that such a behemoth could serve any practical purpose, yet the very first calculations run on this device were for one of history's most consequential weapons: the hydrogen bomb. So it was (arguably) useful.

Another quick question: What weighed two ounces and traveled billions of miles into deep space?

Answer: The Intel 4004 microprocessor.

Built in 1971 and reportedly flown aboard the Pioneer 10 spacecraft launched in 1972, the venerable little “computer” was capable of about 92 KIPS (thousands of instructions per second). It was still running in 2003 when Pioneer 10’s transmitter finally went quiet, at a distance of over 7 billion miles from Earth.


Twenty-five years separate these two devices.

Looking back, we are astonished at how utterly crude those early machines were and are amazed at what those early inventors could do with them.

Today, Abaco Systems builds computers the size of a business card that operate on 10 watts of power while providing 300 MFLOPS (millions of FLOPS) of computing power. We also build small computer “systems” about the size of a six-pack that can do 4 GFLOPS (that’s billions of FLOPS). The applications for these devices (and their many siblings) boggle the mind; they are found in everything we touch and they affect all that we see and experience. Seventy years ago, no one could have imagined what ENIAC would bring.
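To put those figures in perspective, here is a back-of-the-envelope comparison using only the numbers quoted in this post (ENIAC’s ~400 FLOPS at ~150 kW versus a business-card computer’s 300 MFLOPS at 10 W); the arithmetic, not the exact hardware specs, is the point:

```python
# Figures quoted in the text above (approximate).
eniac_flops = 400            # ENIAC: ~400 FLOPS at its peak
eniac_watts = 150_000        # ENIAC: ~150 kW of power
card_flops = 300e6           # business-card computer: 300 MFLOPS
card_watts = 10              # running on 10 W

# Raw throughput ratio, and efficiency ratio in FLOPS per watt.
speedup = card_flops / eniac_flops
efficiency_gain = (card_flops / card_watts) / (eniac_flops / eniac_watts)

print(f"Raw speedup: {speedup:,.0f}x")                 # 750,000x
print(f"FLOPS-per-watt gain: {efficiency_gain:,.0f}x") # ~11 billion x
```

A 750,000-fold raw speedup understates the change; measured in useful work per watt, the gap is roughly ten orders of magnitude.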

Unimaginable

It stands to reason that the future of computing is just as unimaginable to us, but unlike for those computing pioneers of the 1940s, much of it will unfold within our lifetimes.

As we move into the future, improvements in technology will continue to drive down SWaP (size, weight and power) and will enable computers to enter ever smaller spaces. There are already “printable” circuits and sensors, and computers that can be swallowed for a closer look inside the body. There are even “molecular” computers composed of organic compounds, foreshadowing SWaP improvements measured in orders of magnitude. Computers will be available in every form and size and will no longer dictate their use space; the use space will define them.

We are leaving the domain of “build-it-and-they-will-come” computing. 

In 1984, Apple Computer ran the tagline “a computer for the rest of us,” meaning they had produced something new, something not available before: an “appliance,” a device designed for a particular use. In short, a device defined by what it did, not by what it was. Thirty years later, we see the same trend emerging in every application of computing: the “magic” of emerging technology must do something useful right out of the box, something not available before.

At Abaco, we are on it, developing off-the-shelf hardware with software tools that ease integration and shorten program schedules while reducing risk: appliances that deliver more utility with less SWaP and remain supportable over many years, with continual performance improvements.

Connectivity will also play a major role in future computing. The vast computing powerhouses of the 1980s were severely limited in application simply because using them was a “local” affair. Remote connections were slow and made their utility questionable. (I myself ran a climate simulation on an IBM mainframe in 1982 that took 52 hours; today I could do it at home in minutes.) In many ways, the lack of connectivity drove the development of the personal computer, and the inertia of the established software infrastructure has pushed PCs to higher and higher performance. But the trend is shifting.

In America, where nearly 95% of the population has a broadband connection, we can tap the power of huge computer servers as easily as we can turn on a TV. Interconnected devices, smartphones and IoT (Internet of Things) devices are an everyday part of our lives; we connect and never even think about it. The pendulum has swung back from distributed toward centralized computing, but with a difference: the distributed devices we use today are hugely more powerful than the centralized machines of the past. In the future, both sides will continue to grow in power and usability. Connectivity will increase until ultimately there is no distinction between the two. Some have called this a “hive mind.” While that sounds spooky in a “resistance-is-futile” kind of way, remember that every technological change has produced its own Chicken Littles, yet humanity has managed to cope with, and benefit from, radical change.

Unfolding story

There is more to the unfolding story of future computing. We are already seeing the birth and growth of “learning machines.” The power of today’s hardware allows us to “teach” computers rather than “program” them, mimicking the way our brains develop. In these machines we can no longer look inside and “debug the code,” yet the code is there, just as the code of our DNA runs the machinery of all life. Learning machines drive cars, translate languages in real time, parse our newsfeeds, and are making inroads into nearly everything else. The future will be just as inconceivable to us as a smartphone would have been to the builders of ENIAC.

Abaco will be there. We built this company knowing that change is coming and that innovation and agility are the keys to future success. We intend to be the future of computing.

Larry Schaffer

Larry Schaffer has been with us in a business development role since 2001, and works to create and maintain long-term, strategic relationships with key companies engaged in embedded computing for ground systems applications with a strong emphasis on image processing and distribution. He was born in Pennsylvania and educated as an Electrical Engineer in New Jersey and California (where he now lives). Just don’t ask him to tell you about being a war baby…