WaterShake

Age 30, Male

Graphic Designer

Edinburgh

Joined on 6/7/09

Level:
13
Exp Points:
1,820 / 1,880
Exp Rank:
33,901
Vote Power:
5.54 votes
Rank:
Police Officer
Global Rank:
10,288
Blams:
178
Saves:
736
B/P Bonus:
10%
Whistle:
Normal
Medals:
44

Endless, Senseless and Anti-Human: The Future of Technology?

Posted by WaterShake - January 10th, 2011



Preface

I often wonder what the future of computer technology holds in store for us. Computers have developed at such an incredible rate over the past 50 years that it's hard to envisage what they will be capable of in the future. With that in mind, I recently began my own research into both the history and the future of technology. I was mortified by my initial findings.

Many of today's technologies have embedded themselves, almost accidentally, within our lives. Web 2.0 threatens to diminish our personhood and shrink our understanding of what it is to be a person. We are destined to be simultaneously locked within our design flaws and outclassed by our own tools. Within this essay, I plan to explore the worlds of both software and Web 2.0.

Locked-in

"Lock-in". While this term may seem alien, we are all almost irreversibly destined to become more familiar with it as technology develops further. Jaron Lanier describes the theory of lock-in at the beginning of his book "You Are Not A Gadget", using the example of MIDI. MIDI may (somewhat ironically) sound unfamiliar to most of us, but I can guarantee that all of us have heard it. MIDI is a standard embedded at the heart of almost all music software. MIDI was created in the early 1980s, for one man and his synthesiser. He created the software to give himself a broader spectrum of sound to experiment with. MIDI could represent simplistic musical ideas, e.g. a key being played; MIDI worked well at what it was designed to do, and thus it rapidly expanded. There was MIDI this and MIDI that; everything was vying to support MIDI. However, MIDI soon expanded far beyond its original scope of design: it was being used to represent things much more complex than a keyboard; things much more complex than it was capable of. Better, tidier and more advanced alternatives to MIDI were created, but it was too late. So many pieces of software and hardware had been built to support MIDI by this point that the fiscal investment required to replace it was unfathomable. Thus, still to this day, musicians are bound by the limitations of software almost 30 years old. MIDI is locked-in. It's quite a frightful thought indeed; that a small project of work by one man could have such a drastic impact.
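To make the narrowness of that design concrete: under the MIDI 1.0 specification, "a key being played" is just three bytes. The sketch below encodes a Note On event in Python; `note_on` is an illustrative helper of my own, not part of any particular library.

```python
def note_on(channel: int, note: int, velocity: int) -> bytes:
    """Encode a MIDI 1.0 Note On message as three bytes.

    channel:  0-15   (MIDI allows only 16 channels)
    note:     0-127  (pitch quantised to 128 discrete keys; middle C = 60)
    velocity: 0-127  (how hard the key was struck)
    """
    if not (0 <= channel <= 15 and 0 <= note <= 127 and 0 <= velocity <= 127):
        raise ValueError("value out of MIDI range")
    status = 0x90 | channel  # 0x9n = Note On, on channel n
    return bytes([status, note, velocity])

# Middle C, struck moderately hard, on channel 0:
msg = note_on(0, 60, 100)
print(msg.hex())  # 903c64
```

Every pitch must fit one of 128 integer note numbers and every nuance of touch into a 0-127 velocity; that is the whole vocabulary, and it is exactly the lock-in Lanier describes.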

Many of the technologies that we use on a daily basis are actually locked-in solutions, alternatives to which are almost unimaginable. Websites, for example, were not always destined to be designed as "pages". Computer "files" were not always a certainty, yet the concepts of a "file" or a "webpage" are so fundamental to our technology that it's tricky to seriously envisage any other systems. All of these (and many, many more) concepts had equally (if not more) credible alternatives.

"Lock-in" does not exist solely within the world of technology; it can be seen in any aspect of life with either a history or a future. Lock-in can be seen within the world of railways, for instance, e.g. the dimensions of tracks. The London Tube was designed with narrow tracks and matching tunnels. Many of these tunnels cannot accommodate air-conditioning because there is no room to ventilate the hot air from the trains. Thus, tens of thousands of modern-day residents in one of the world's richest cities must suffer a stifling commute because of an inflexible design decision that was made over one hundred years ago. But unfortunately, lock-in within software is much more painful than railroads. Software must always adhere perfectly to a boundlessly particular and messy system of rules. So while lock-in may be an annoyance in the world of railroads, it's death by one thousand cuts in the digital world.

There are some, although depressingly few, situations in which we escaped "lock-in". For instance, there was an active campaign in the 1980s and 1990s to promote the importance of visual aesthetics within software. Thankfully, the movement was successful due to the efforts of influential engineers within companies such as Apple and Microsoft; thus we have been saved from ugly fonts and software, for now at least.

Whilst lock-in lurks menacingly within the world of software, our online world faces dangers of its own. Anonymity, fragmentation and laziness are diminishing what it means to be a person online. Lanier claims that Web 2.0 is asking us to shrink ourselves and join the anonymous online mass.

Web 2.0

Tom Young used his last column on computing.co.uk to highlight some of the "creeping dangers" embedding themselves within the internet. He claims that the foremost and most prominent danger is anonymity. He asks us to "glance at the comments below any newspaper opinion article and you will be given a whirlwind tour of the most unpleasant aspects of the public psyche". Whilst I disagree with his assertion of "any", I think all of us are very much aware of what Young is referring to. The freedom and anonymity offered by the internet encourage many people to express themselves in ways they may be unable to within the "real world". Young suggests that this freedom has negative consequences, as people with controversial opinions are given free rein to broadcast themselves. However, if we begin to judge and censor the internet, then we risk undermining net neutrality and the very nature of the internet itself. Whilst I agree that anonymity is an ailment, I disagree with Young's diagnosis.

I think the real danger of anonymity is the impact it has upon our individuality. I think that the individuality of internet users is rapidly losing ground to a collective, unknown, omnipresent crowd. As Lanier states, "Real people must have left all those anonymous comments on blogs and video clips, but who knows where they are now, or if they are dead? The digital hive is growing at the expense of individuality".

WriteSomething epitomises the anonymous digital hive described by Lanier. WriteSomething asks us to write anything, something, as long as we do it quickly and without thinking (and, of course, anonymously). The website was established as an "endless, senseless, collaborative book". However, the project has somewhat predictably degenerated into picayune chatter. Interestingly, however, the rare, well-written pieces on the website usually stem from the few registered and named members. This somewhat crude pattern of identifiable=good and anonymous=bad seems to apply to other online communities as well.

Lanier expands upon the idea of a "digital hive mind" in his online essay "Digital Maoism: The Hazards of the New Online Collectivism". He states: "If we start to believe that the Internet itself is an entity that has something to say, we're devaluing those people [creating the content] and making ourselves into idiots."

Wikipedia is often targeted by Lanier, for a multitude of reasons. The sterile style of writing on Wikipedia removes any flavor or trace of humanity. In doing so, it filters the subtlety of authors' opinions and essential information is lost. Furthermore, the collective authorship approach of Wikipedia tends towards the ideas and opinions of the crowd, potentially devaluing its own content.

Web services such as Twitter ask us "what are you doing now?", yet the question is fundamentally flawed. Twitter does not ask us to share ourselves, it asks us to share fragments of ourselves. Twitter is asking us to adapt our own behavioural habits for ease of exploitation; it's much easier to collect fragments than it is to collect people. It's worth remembering that "you have to be somebody before you can share yourself".

But even more worrying than the prospect of computers changing us is the idea of computers "replacing" us. There is a peculiar but increasingly popular trend of thought within digital communities: "anti-human". Kevin Kelly states that we no longer require authors or identifiable writers, as all information can be compiled into one single, global book. Chris Anderson (editor of Wired) proposes that science should no longer seek theories that scientists can understand, because the digital cloud will understand them better anyway. This anti-human rhetoric is sheer madness; people seem to be willing their purpose away. Computers were created as tools for humans. To say that humans are useless because of computers is akin to saying gardeners are useless because of lawnmowers. "You can believe that your mind makes up the world, but a bullet will still kill you. A virtual bullet, however, doesn't even exist unless there is a person to recognise it as a representation of a bullet. Guns are real in a way that computers are not."

Ideas of computers replacing or outpacing humans are evident within our popular culture as well. Successful film franchises such as The Terminator and The Matrix have made millions of dollars from the concept. Literature such as I Have No Mouth, and I Must Scream explores similar ideas.

People seem to be becoming blasé about their own intelligence. It's true that computers have achieved many remarkable things: they've beaten the world chess champion, they can recognise different types of dog bark, heck, some of them can almost climb stairs. But let's not forget that a computer is just a tool.

We are quick to blame ourselves when software or hardware doesn't work as it's supposed to. Instead of demanding that technology be created that suits our needs, we continually adapt ourselves to the whims of our own tools.

Bumblebees can perform certain calculations faster than supercomputers. We can generate grammar and understand language at a rate unimaginable within the world of computers. We can build computers. We create life, and some of us create art. We feel; we're alive in a way that computers will never be. As Pablo Picasso once said, "Computers are useless. They can only give you answers". We should stop aiming to be mechanical or robotic in our decision-making and remember how very good it is to be human.

We have allowed computers to become much more than the sum of their parts. The first step to avoiding Lanier's future is to wake up and remember that these things are our tools. We think of them as integral and invincible, but they're mostly just plastic. Stupid plastic.


Comments

What the fuck are you smoking?

Your article is interesting but the majority of the reasons you listed are far from being mortifying.

"He created the software to give himself a broader spectrum of sound to experiment with."

This statement isn't really accurate. MIDI is not sound (the synthesizer the MIDI is connected to produces the sound); it's simply a system of digital data that triggers events. A broader spectrum of sound would come from innovations in sound synthesis, acoustic and acoustic-electric instrument development, and new ways of playing these instruments to get different types of timbres.

"Better, tidier and more advanced alternatives to MIDI were created, but it was too late."

Such as?

"Thus, still to this day, musicians are bound by the limitations of software almost 30 years old. MIDI is locked-in. It's quite a frightful thought indeed; that a small project of work by one man could have such a drastic impact."

What you see as limitations are most likely the limitations of the software and hardware, not the actual design of MIDI itself. And MIDI has improved more or less continuously over the past few decades. And it's not solely credited to the work of one man: like most if not all ideas that stand the test of time, they start off very small with one or a few people and develop into behemoth projects built through the ideas of others.

In fact, the type of data that MIDI stores has been in use since the late nineteenth century, i.e. "piano rolls".

"All of these (and many, many more) concepts had equally (if not more) credible alternatives."

Such as?

Other concepts have not come to fruition because they're still (or were) in their draft stages, or because we lack the technical requirements to create legitimate alternative solutions.

(WTF Tom and staff? There's significantly less space in the comments now. Continued->)

"But unfortunately, lock-in within software is much more painful than railroads. Software must always adhere perfectly to a boundlessly particular and messy system of rules. So while lock-in may be an annoyance in the world of railroads, it's death by one thousand cuts in the digital world."

I'm struggling to work out what you mean by "death by one thousand cuts in the digital world". In order to change the railroads, it would require hours upon hours of labor restructuring the rails, walls, possibly the trains, and the overall system to accommodate newer, more efficient train transportation. With software, to make an update all the developer needs to do is open up the source code file, change what's there, and it's done. Press CTRL+C and CTRL+V to make copies and upload to the web. Hard drive space is practically infinite and digital resources are easily accessible. Maybe you mean something entirely different, I don't know.

"There are some, although depressingly few, situations in which we escaped "lock-in". For instance, there was an active campaign in the 1980s and 1990s to promote the importance of visual aesthetics within software. Thankfully, the movement was successful due to the efforts of influential engineers within companies such as Apple and Microsoft; thus we have been saved from ugly fonts and software, for now at least."

If you're trying to associate an early computer model with your negative perception of "lock-in" ideas, then there are probably far more than just a few instances where the technology "escapes". Once a product or idea reaches its zenith and outlasts its initial purpose, the luxuries and aesthetics come into play. A telephone now has a visual screen, and even a DVD player and web browser. A portable CD player has transformed into a pocket-size iPod that can hold the equivalent of millions of CDs and even play games and pick up radio frequencies. Cameras and video recorders are provided with photographic manipulation programs that allow people to edit such media with the craziest shit.

(Continued)

"The freedom and anonymity offered by the internet encourage many people to express themselves in ways they may be unable to within the "real world". Young suggests that this freedom has negative consequences, as people with controversial opinions are given free rein to broadcast themselves."

An honest opinion is better than no opinion. How is expressing a controversial opinion considered negative? Aside from cyberbullying and spamming, which can be moderated regardless.

Censorship breeds falsifications and inaccuracies. If one wants censorship in their news, turn on the television or read the newspaper. If one wants to hear the real ideas of other people, look on the internet.

"I think the real danger of anonymity is the impact it has upon our individuality."

Lacking a visceral identity can bring out someone's true identity, thus bringing out their individuality, even if it is perceived as part of a collective anonymous unit.

"Real people must have left all those anonymous comments on blogs and video clips, but who knows where they are now, or if they are dead?"

Does it really matter? An opinion is an opinion. Whether you're male or female, gay or straight, white or black, whether you're a criminal or a priest, it shouldn't have any weight or special entailment on what a comment says.

"The sterile style of writing on Wikipedia removes any flavor or trace of humanity. In doing so, it filters the subtlety of authors' opinions and essential information is lost."

Define "humanity". Wikipedia is an encyclopedia, it's not a journal for the author's personal thoughts.

(Continued)

"Furthermore, the collective authorship approach of Wikipedia tends towards the ideas and opinions of the crowd, potentially devaluing its own content."

It's not really based on the actual IDEAS, and even more so the OPINIONS, of the crowd, but rather the information that the crowd has to contribute to the website. You cannot post your opinion on whether Barack Obama should get a second term on his page without it getting deleted. You cannot post your own idea of a 9/11 conspiracy theory unless it's fairly well recognized and has merit. And there's a rather large group of neurotic pedants who filter and moderate the website so the encyclopedia retains a satisfactory level of legitimacy. It's up to the reader to check the article's cited sources and from there evaluate its credibility. Being able to think for yourself will always trump censored content that's *just* fed to you by people with college degrees.

"Web services such as Twitter ask us "what are you doing now?", yet the question is fundamentally flawed. Twitter does not ask us to share ourselves, it asks us to share fragments of ourselves."

Unnecessary use of semantics. Twitter gives you something like 140 characters per message; it's obviously just intended as a communication medium parallel to a text message that can be shared with the world.

As for the rest of your essay: sure, people may rely on computers and technology too much (it's decreasing the amount of organic communication, and replacing leisure outlets that would normally be allotted to activities involving exercise with computer games and online gossip), but I really think you're making an argument that no one is arguing with to begin with.

Computers are indeed very stupid; if anyone disagrees, they don't understand the logic and processes by which a computer works. Computers are simply nothing more than an (extremely useful) tool, but no one is deifying them in the first place.

(Continued)

Gardeners will become useless once there are lawnmowers that can mow the grass by automation (which I'm sure have been invented). And there's no real harm in "robots" taking over human tasks. Thinking otherwise is delaying the progression of humanity (ironic, eh?). Would you rather have a calculator do your calculations, or have a team of ancient mathematicians move the pegs on an abacus?

Once the technology to cover a human job has been developed, there are ALWAYS new job opportunities that arise for humans anyway.

the drugs aren't right