Endless, Senseless and Anti-Human: The Future of Technology?
Preface
I often wonder what the future of computer technology has in store for us. Computers have developed at such an incredible rate over the past 50 years that it's hard to envisage what they will be capable of next. With that in mind, I recently began my own research into both the history and the future of technology. I was mortified by my initial findings.
Many of today's technologies have embedded themselves, almost accidentally, within our lives. Web 2.0 threatens to diminish our personhood and shrink our understanding of what it is to be a person. We are destined to be simultaneously locked within our design flaws and outclassed by our own tools. Within this essay, I plan to explore the worlds of both software and Web 2.0.
Locked-in
"Lock-in". While the term may seem alien, we are all almost irreversibly destined to become more familiar with it as technology develops further. Jaron Lanier describes the theory of lock-in at the beginning of his book "You Are Not A Gadget", using the example of MIDI. MIDI may (somewhat ironically) sound unfamiliar to most of us, but I can guarantee that all of us have heard it: it sits at the heart of almost all music software. MIDI was created in the early 1980's by one man for his synthesiser, to give himself a broader spectrum of sound to experiment with. MIDI could represent simplistic musical ideas, e.g. a key being pressed; it worked well at what it was designed to do, and so it spread rapidly. There was MIDI this and MIDI that; everything was vying to support MIDI. Soon, however, MIDI had expanded far beyond its original scope of design: it was being used to represent things much more complex than a keyboard, things much more complex than it was capable of expressing. Better, tidier and more advanced alternatives were created, but it was too late. So much software and hardware had been built around MIDI by that point that the cost of replacing it was unfathomable. Thus, to this day, musicians are bound by the limitations of software almost 30 years old. MIDI is locked-in. It's a frightful thought indeed that one man's small project could have such a drastic impact.
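To make the limitation concrete, here is a minimal sketch of what MIDI actually says when "a key is pressed". The constants follow the MIDI 1.0 channel-message format; everything else about the sound (timbre, expression, nuance) falls outside this three-byte vocabulary:

```python
# A MIDI "Note On" event is just three bytes: a status byte (0x90 =
# Note On, channel 1), a note number (60 = middle C) and a velocity
# (how hard the key was struck, 0-127). That is essentially the whole
# vocabulary for "a key being pressed" -- and roughly the ceiling of
# what MIDI can express.
note_on = bytes([0x90, 60, 100])   # press middle C at velocity 100
note_off = bytes([0x80, 60, 0])    # release middle C

print(note_on.hex())   # -> 903c64
print(note_off.hex())  # -> 803c00
```

A keyboard maps perfectly onto this scheme; a sliding violin note or a singer's vibrato does not, which is exactly the mismatch the paragraph above describes.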
Many of the technologies that we use on a daily basis are actually locked-in solutions, alternatives to which are almost unimaginable. Websites, for example, were not always destined to be designed as "pages". Computer "files" were never a certainty, yet the concepts of a "file" and a "webpage" are so fundamental to our technology that it's tricky to seriously envisage any other system. All of these concepts (and many, many more) had equally credible, if not more credible, alternatives.
"Lock-in" does not exist solely within the world of technology; it can be seen in any aspect of life with either a history or a future. Take railways, for instance, and specifically the dimensions of tracks. The London Tube was designed with narrow tracks and matching tunnels. Many of these tunnels cannot accommodate air-conditioning because there is no room to ventilate the hot air from the trains. Thus, tens of thousands of modern-day residents in one of the world's richest cities must suffer a stifling commute because of an inflexible design decision made over one hundred years ago. Unfortunately, lock-in within software is far more painful than in railways. Software must always adhere perfectly to a boundlessly particular and messy system of rules. So while lock-in may be an annoyance in the world of railways, it's death by a thousand cuts in the digital world.
There are some situations, although depressingly few, in which we escaped lock-in. For instance, there was an active campaign in the 1980's and 1990's to promote the importance of visual aesthetics within software. Thankfully, the movement succeeded through the efforts of influential engineers at companies such as Apple and Microsoft; thus we have been saved from ugly fonts and software, for now at least.
Whilst lock-in lurks menacingly within the world of software, our online world faces dangers of its own. Anonymity, fragmentation and laziness are diminishing what it means to be a person in the online world. Lanier claims that web 2.0 is asking us to shrink ourselves and join the anonymous online masked mass.
Web 2.0
Tom Young used his last column on computing.co.uk to highlight some of the "creeping dangers" embedding themselves within the internet. He claims that the foremost and most prominent danger is anonymity: "glance at the comments below any newspaper opinion article and you will be given a whirlwind tour of the most unpleasant aspects of the public psyche". Whilst I disagree with his assertion of "any", I think all of us are very much aware of what Young is referring to. The freedom and anonymity offered by the internet encourage many people to express themselves in ways they may be unable to in the "real world". Young suggests that this freedom has negative consequences, as people with controversial opinions are given free rein to broadcast themselves. However, if we begin to judge and censor the internet, then we risk net neutrality and the nature of the internet itself. Whilst I agree that anonymity is an ailment, I disagree with Young's diagnosis.
I think the real danger of anonymity is the impact it has upon our individuality. The individuality of internet users is rapidly losing ground to a collective, unknown, omnipresent crowd. As Lanier states: "Real people must have left all those anonymous comments on blogs and video clips, but who knows where they are now, or if they are dead? The digital hive is growing at the expense of individuality".
WriteSomething epitomises the anonymous digital hive described by Lanier. WriteSomething asks us to write anything, something, as long as we do it quickly and without thinking (and, of course, anonymously). The website was established as an "endless, senseless, collaborative book". However, the project has somewhat predictably degenerated into picayune chatter. Interestingly, the rare, well-written pieces on the website usually stem from the few registered and named members. This somewhat crude pattern of identifiable = good and anonymous = bad seems to apply to other online communities as well.
Lanier expands upon the idea of a "digital hive mind" in an online essay, "Digital Maoism: The Hazards of the New Online Collectivism". He states: "If we start to believe that the Internet itself is an entity that has something to say, we're devaluing those people [creating the content] and making ourselves into idiots."
Wikipedia is a frequent target of Lanier's, for a multitude of reasons. The sterile style of writing on Wikipedia removes any flavour or trace of humanity. In doing so, it filters out the subtlety of authors' opinions, and essential information is lost. Furthermore, Wikipedia's collective-authorship approach tends towards the ideas and opinions of the crowd, potentially devaluing its own content.
Web services such as Twitter ask us "what are you doing now?", yet the question is fundamentally flawed. Twitter does not ask us to share ourselves, it asks us to share fragments of ourselves. Twitter is asking us to adapt our own behavioural habits for ease of exploitation; it's much easier to collect fragments than it is to collect people. It's worth remembering that "you have to be somebody before you can share yourself".
But even more worrying than the prospect of computers changing us is the idea of computers "replacing" us. There is a peculiar but increasingly popular trend of thought within digital communities: "anti-humanism". Kevin Kelly states that we no longer require authors or identifiable writers, as all information can be compiled into one single, global book. Chris Anderson (editor of Wired) proposes that science should no longer seek theories that scientists can understand, because the digital cloud will understand them better anyway. This anti-human rhetoric is sheer madness; people seem to be willing their purpose away. Computers were created as tools for humans. To say that humans are useless because of computers is akin to saying gardeners are useless because of lawnmowers. "You can believe that your mind makes up the world, but a bullet will still kill you. A virtual bullet, however, doesn't even exist unless there is a person to recognise it as a representation of a bullet. Guns are real in a way that computers are not."
Ideas of computers replacing or outpacing humans are evident within our popular culture as well. Successful film franchises such as Terminator and The Matrix have made millions of dollars from the concept, and literature such as "I Have No Mouth, and I Must Scream" explores similar themes.
People seem to be becoming blasé about their own intelligence. It's true that computers have achieved many remarkable things: they've beaten the world chess champion, they can recognise different types of dog bark, and heck, some of them can almost climb stairs. But let's not forget that a computer is just a tool.
We are quick to blame ourselves when software or hardware doesn't work as it's supposed to. Instead of demanding that technology be created that suits our needs, we continually adapt ourselves to the whims of our own tools.
Bumblebees can perform calculations faster than supercomputers. We can generate grammar and understand language at a rate unimaginable within the world of computers. We can build computers. We create life and some of us create art. We feel; we're alive in a way that computers will never be. As Pablo Picasso once said "Computers are useless. They can only give you answers". We should stop aiming to be mechanical or robotic with our decision making and remember how very good it is to be human.
We have allowed computers to become much more than the sum of their parts. The first step to avoiding Lanier's future is to wake up and remember that these things are our tools. We think of them as integral and invincible, but they're mostly just plastic. Stupid plastic.
LiteNite
<a href="http://www.youtube.com/watch?v=UPfvt6xa-NE">http://www.youtube.com/watch?v=UPfvt6xa-NE</a>
!!!