The return of the platform

Some elements of the history of technology read like “Believe It Or Not” stories. Believe it or not, it was indeed a Transylvanian travel agent called Tivadar Puskás who devised the telephone exchange, back in 1876. He took his ideas first to Thomas Edison, at his research facility in Menlo Park, New Jersey (one doesn’t need to wonder who booked the passage) and subsequently deployed the first exchange for Edison in Paris. And then, believe it or not, it was a schoolteacher-turned-undertaker from Kansas City, Almon B. Strowger, who came up with the idea of the automated telephone switch some fifteen years later — which of course led to the leaps of mathematical brilliance made by Konrad Zuse and Alan Turing. You really can’t make this stuff up.

The relationship between computer smartness and human smartness, as illustrated by these blips in a decades-long sequence of minor epiphanies and major breakthroughs, is too important to ignore. At each juncture, individual endeavour has given way to corporate objectives, as the opportunity to monetise each breakthrough has become all too apparent. Frequently (and right to the present day) this has involved some level of disruption to whatever went before, as old intermediaries have been replaced by new ones, or discarded altogether — a recurring theme from the telegraph displacing the penny post, right up to e-commerce sites displacing travel agents, and indeed beyond. But even as computers become more powerful and one set of business models is replaced by another, the nature of the coastal paradox means complexity continues to win. Even today, with the best will in the world, the ‘smartest’ companies we know — Google, Amazon and the like — are harvesting data rather than exploiting its full potential. And even capabilities such as Facebook and Twitter, clever as they are, are based on relatively straightforward ideas about getting messages from one place to another.

This is not necessarily a bad thing. While technology continued to advance through the 1980s and 1990s, a significant proportion of the effort went into making it more of a tool for the masses. It is not coincidental that the hypertext ideas behind Web links were conceived by the self-styled “original visionary of the World Wide Web”, Theodor Nelson, in 1967 (“Arguably, the World Wide Web is derivative of my work, as is the entire computer hypertext field, which I believe I founded,” he said), nor that the computer mouse was demonstrated by Douglas Engelbart in 1968, nor that the Internet’s default operating system, Unix, was written in 1969. Simply put, they were good enough for the job, even if they had to wait before achieving their full potential. This corollary of the Law of Diminishing Thresholds, which we might call the Law of Acceptable Technological Performance — in that some ideas are simple enough to be correct — could equally be applied to the Simple Mail Transfer Protocol, or indeed the Ethernet protocol, both of which were seen as less reliable than the alternatives (X.400 and token ring respectively). They, too, were good enough. And as things become good enough, they commoditise and become generally accepted; as they do, they become more affordable, because their costs are shared. A Netflix subscription costs so little, for example, because so many people are paying to watch the same films.

Technological progress has been equal parts mathematical and scientific innovation, corporate competition, and community-spirited free thinking and downright rebellion, each keeping the flywheel spinning. This ‘magic triangle’ has created the quite incredible foundation of software upon which we are now building. Today there is a great deal to build upon: we have an infrastructure, a growing set of algorithms, an API economy and an innovation-led approach — in other words, we have our foundation for smart. The foundation itself has been getting smarter — orchestration algorithms provide the basis for the cloud and, as we have seen, massively scalable open source data management tools such as Redis and Hadoop have offered a more powerful replacement for traditional databases. And it will continue to get smarter. Despite diminishing thresholds, and whether or not one believes that we are heading towards a climate change disaster, the fact is that the future will be highly resource-constrained even as it involves far more powerful compute platforms than today’s, along with a continuing appetite for network bandwidth, CPU cycles and data storage.

The impetus is there, we are told by economists who look at changing global demographics and the pressure this puts on global resources, pointing at diminishing reserves of fossil fuels and rare metals. Note that the former power our data centres and devices, while the latter play an essential part in electronics of all kinds. Indeed, the computing platform itself could be heading towards what we might call the orchestration singularity — the moment at which computers, storage, networking and other resources can manage themselves with minimal human intervention. If this happened, it would change the nature of computing as we know it for ever. While we are a long way from this, the near future is undeniably one of a globally accessible substrate of resources, controlled by a dynamically reconfiguring stack of clever software. This is the platform upon which the future shall be built.

But what can we do with such a platform of technological resources? Therein lies the rub. The answer is evolving in front of our eyes as, outside of technology infrastructure, the algorithm is growing up. Software already exists to enable us to communicate and collaborate, to buy and sell, to manage ourselves and the things we build and use. The platform itself is increasing in sophistication, going beyond the simple creation, manipulation and management of data. Historically, such capabilities were built or bought by companies looking to perform specific tasks — payroll, for example, or managing suppliers and customers. But even these packages are commoditising, and becoming more widely accessible as a result.

Are we destined for another century of automation, or will computers become something more? All eyes are on the ultimate prize — to replicate, or at least run in a similar way to, the most powerful computer that we know, a.k.a. the human brain. Back in 1945 John von Neumann stated, with remarkable prescience, “It is easily seen that these simplified neurone functions can be imitated by telegraph relays or by vacuum tubes,” kicking off another theme which has repeated frequently ever since. Science fiction authors have been all over it, of course, moving their attention from intelligent aliens to thinking robots or mega-brains, such as Douglas Adams’ Deep Thought or Iain M. Banks’ creations. In the world of mathematics and science, Artificial Intelligence was the thing: experts like M. Huret Senior led the vanguard, with luminaries like Patrick Henry Winston setting out how rules-based systems would support, then replace, human expertise. Largely due to the inadequate computing power of the age, however, the late 1970s saw reality simply unable to deliver on the inflated expectations of the time. Artificial intelligence was the original and best “idea before its time”, and the hype was not to last. Computers simply weren’t powerful enough, nor cost-effective enough, to deliver the level of complexity associated with what we are now calling machine learning.

Repeated attempts have been made to deliver on the dream of thinking computers. In the 1990s, neural networks — which could learn about their environments and draw increasingly complex inferences from them — were the thing. Today, computer algorithms are a core part of the tools used for financial trading on the stock markets, and as we have seen, big data analytics software can identify a fair number of needles in the gargantuan haystacks of data we continue to create. Will the platform itself become recognisably smart? Mike Lynch thinks so, and new concepts such as Precognitive Analytics show a great deal of potential. But this does not mean that the old, ‘analogue’ ways are done with; far from it. For the foreseeable future, the world will remain more complex than the processor power available to model it. We may only be scratching the surface of technology’s potential, but we are also only scratching the surface of the complexity we are dealing with. Equally, it doesn’t have to be finished to be started. While we may not be at the brink of intelligence, we currently see the results augment, rather than supplant, our own abilities.

We shall look at the longer-term potential of technology soon, but right now we are back with people. Here’s a thought experiment: what if everyone suddenly had access to every computer in the world? First of all, they probably wouldn’t know what to do with it all, so people would create ways of using it, put themselves in the middle, and charge for the privilege. Which is exactly what is going on. We can see a small number of major corporations building walled gardens, in which they attempt to lock their users into certain ways of working. We can see governments panicking and trying to control everything, even as they embrace the technology themselves. No doubt we will face standards wars and vendor isolationism, new global entities emerging from nowhere (my money’s on a subset of the cloud orchestration companies), calamitous security events and threats of government intrusion.

And we shall also see the common man and woman struggling to make sense of it all, even as their very behaviours are transforming. We’re all as confused as each other: like the scene in The Matrix, when one agent says to another, “He is only…” and the other replies, “…Human.” In the next section we consider just how profound the impact of technology has been, and the ramifications — positive and negative, though rarely indifferent — for ourselves, our daily and working lives, our communities and the wider world.

To do so we first, once again, return to ancient Greece. Enter: Demosthenes.