It is very likely that this moment in history will be recorded as a major turning point in human development, commensurate with the invention of the sail, the building of the first canals, and the abandonment of powdered wigs as a masculine status symbol. We are, indeed, passing so significant a milestone that one might question whether we are still in the era that succeeded the Atomic Age, known popularly as the Information Age, or whether the latter is itself drawing to a close and a new paradigm has commenced, one that we are trying desperately to come to grips with.

New technologies arise every day, and they dramatically transform the landscape in which we live. Rather than peak and then taper off like most of its predecessors, however, this new revolution seems primed, impossibly, to continue accelerating indefinitely. One of the co-founders of Intel, Gordon Moore, noted in a 1965 paper (revised a decade later) that the number of components one could fit on an integrated circuit had doubled every two years since the invention of the microchip in 1958, and he predicted that this trend would continue for at least another decade. Fifty-five years in, Moore's Law is still rocking on:

[Chart: transistor counts over time. Image by Wikipedia user Wgsimon]

Note that the chart's vertical axis is logarithmic rather than linear: the physical transistor count multiplies by ten at each interval, expanding faster than the American waistline. Add to this the fact that the chip components themselves are constantly being refined and improved, and what you have, according to David House, another Intel exec from that era [whom I just discovered now happens to be Chairman of the Freaking Board at freaking Forbes], is a doubling of chip performance every 18 months. This is nothing short of open-ended, literally exponential growth, a frictionless acceleration of technological advancement the likes of which has never before occurred.
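The difference between a two-year doubling and an 18-month doubling compounds dramatically. A minimal sketch of the arithmetic (the starting value of 1 and the 20-year span are illustrative assumptions, not historical data):

```python
def doublings(years: float, period_years: float) -> float:
    """How many doubling periods fit into a span of years."""
    return years / period_years

def projected_growth(start: float, years: float, period_years: float) -> float:
    """Project a quantity forward under 'doubles every period_years'."""
    return start * 2 ** doublings(years, period_years)

# Moore's Law: transistor count doubling every 2 years.
# Over 20 years, that is 10 doublings -- a 1,024x increase.
print(projected_growth(1.0, 20, 2.0))  # 1024.0

# House's corollary: chip *performance* doubling every 18 months.
# Over the same 20 years, ~13.3 doublings -- roughly a 10,000x increase.
print(projected_growth(1.0, 20, 1.5))
```

The same span of time yields an order of magnitude more growth simply by shortening the doubling period by six months, which is why the performance curve outpaces the raw component count.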

Ray Kurzweil, a polymath inventor and one of the most respected futurists in the world [labelled "Edison's rightful heir" by Inc. magazine and "the ultimate thinking machine" by the aforementioned freaking Forbes], believes that by 2040 computers will exceed the human brain in processing power. This event is referred to as the "Singularity," a concept originally conceived of by mathematics professor and Hugo Award-winning novelist Vernor Vinge. The claim has generated a great deal of controversy by sheer force of implication, but a great many prominent thinkers are hard-pressed to deny it. Others have pointed out, however, that hardware is only as good as the software one can devise to run on it, and it is possible that software will never be created that can operate on the same level as the human mind, regardless of the power of the processors behind it.

Kurzweil, for his part, is an optimist who believes that the combined resources of genetics, nanotechnology and robotics (or artificial intelligence) will become so powerful that we are ensured a bright and happy future, one in which we master our physical bodies and minds with science and merge them with technology to produce a new kind of being that cannot age or die. Kurzweil also believes that godlike robots will resurrect his dead father for him, so we must take his speculations with a grain of salt, but he has made a great many accurate predictions in the past (ever heard of the Internet?) and the future he describes seems plausible. The word for such a human/machine hybrid is "cyborg," a term coined by Manfred Clynes and Nathan Kline in a 1960 paper entitled "Cyborgs and Space." Clynes and Kline were likewise optimists, believing that technological modification of the body could enable a human being to operate in the vacuum of space without the encumbrance of a suit.

When most people think of cyborgs, however, their mental image is not a perfected version of themselves but rather something akin to the Borg, either the mindless males or one of the super-hot-yet-scary-as-hell females. These creatures are more of a crude robot/zombie hybrid, a monster, than a true synthesis of human and machine, and they represent our deepest fears as to what may happen to us when our technology advances to the point that we can no longer control it. The Borg are therefore not cyborgs but anticyborgs, machines which incorporate human components. A true cyborg would still retain her humanity; the only purpose of technology, after all, is to enhance the human experience, not to supplant it. Whether the definition of "human" would be the same to a cyborg as it would to, say, a person living in the 20th century is still an open question.

While the majority of us may not often indulge in optional technological modification of our bodies or brains (no, vodka does not in this instance count as either technology or optional), this is frequently due not to a lack of ability so much as to having easier ways to do what we want. Most of the abilities one would associate with a cyborg are in fact already available to a large segment of the population. Hundreds of millions of human beings already have continuous network access wherever they happen to find themselves, make frequent use of artificial intelligence programs with natural voice interfaces, access massive databases of information through a variety of readily available means, communicate with large numbers of peers simultaneously through electronic social networks, and share data of all kinds through a wide array of other electronic channels, often automatically.

The rapid advancement of information systems has accelerated the rate of development in every other field as well, from medical imaging systems to sex toys, pushing the marriage of the human body and mind with technology ever further. We have for many years made constant use of thousands of continually upgraded technologies to directly modify and enhance our appearance and abilities. We devise new technical solutions daily to cater to our every need, from vibrating razors with half a dozen perfectly aligned nano-engineered blades to specialized compounds capable of adjusting the texture or appearance of our skin or hair, to vanishingly thin films of porous plastic that cling to the very surface of our eyes to enhance our vision. Add to this the fact that numerous elective surgeries for both cosmetic and sensory enhancement have now become commonplace, and there appear to be very few inches left on the cyborg yardstick. The case may easily be made that we are already cyborgs today.

Perhaps "Cybernetic Age" would be the appropriate term to describe the new human paradigm. It can hardly be denied that this vast and ever-evolving array of technology has changed the human experience forever, spreading our personal perceptions and interactions out to encompass a dramatically larger area than before, empowering our species in ways never imagined in the past. It remains to be seen how this liquefaction of our personal lives will impact human history, but it seems clear enough that nothing will ever be the same again.