

Technology not as Sophisticated as the Book


November 1, 2012


The United States, which never quite tires of calling itself the most powerful democracy in the world, has for decades now had presidents whose core strength has been their ability to pip their opponents at television debates. It has been so from Ronald Reagan down to Barack Obama. The election of Barack Obama as president, though historic for making him the first black person to take up residence in the White House, has another less touted first to its credit – the use of social media as a critical campaign tool. His campaign team’s clever use of social media networks is unprecedented. The Obama team of young tech enthusiasts has changed election dynamics forever, at least in the United States. This, of course, is not to write off the power of the television screen just yet.

It is never a bad idea for the largest democracy in the world to keep a close watch on what is moving and shaking the most powerful democracy in the world.

Ever since Tim Berners-Lee in the early 1990s thrust a face onto the cold, unfriendly, geeky Internet in the form of the World Wide Web, this thing called the Web has been creeping up on us – on all aspects of our lives. It is not bad, not all bad at least.

In essence, the Web in particular and the Internet in general have been about access and interaction. Access, in an ever more user-friendly manner, to the consumption of information and also to its production. Interaction, in the easiest manner possible, with potentially anybody who owns a computer, mobile phone or any other Internet-enabled device. It was the unfriendly interface for accessing the Internet that drove Berners-Lee to design the Web. The Web made it possible for ordinary non-geeky folk to create content, link to content created by others and access content seamlessly from potentially anywhere in the world.

The first phase of the Internet – that is, when it was opened to the public – had some important constituents: the World Wide Web (Web, for short), Usenet, Internet Relay Chat (IRC), e-mail and mailing lists. Even some later-day geeks might hardly have used Usenet and IRC; that is because the Web in the meanwhile subsumed all these Internet technologies, each time giving them a cleaner, friendlier interface. So the first dot-com boom in the late 1990s had people accessing their e-mail from the comfort of a browser. The browser is the application that has been the window through which people have traversed the Web. E-mail becoming accessible via the Web was indeed a very significant technological leap, one with a local flavour in the form of Sabeer Bhatia. Bhatia, a young man from Bangalore who was studying in the US, made that happen when he launched the erstwhile hotmail.com service. Hotmail.com gave out free e-mail accounts that could be accessed from anywhere via a Web browser. The company was quickly bought over by the software giant Microsoft, a deal that left Bhatia a millionaire many times over. Microsoft recently announced that it was moving Hotmail to Outlook mail, in a sense bringing a rather unusually slow end to a chapter in Internet history. As was to be expected, there were many clones of the service. Even big portals like Yahoo and AOL took to doling out free e-mail services to draw more “eyeballs” to their sites. While the big fish have survived, many more have not been so lucky, especially after the ascendance of Gmail, Google’s free e-mail service. One probably does not need to elaborate on the search engine giant, Google.

Google, in essence, was a need. Its success and continuing relevance as a critical destination on the Web is proof not only of the fact that people had a lot to say despite the filters that “old media” had imposed on them, but also that they want to get to the best content and link to that content. The crux of Google’s search algorithm is its attempt to ferret out, from millions of web pages, the best content that “you” are looking for.
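
The article does not go into the mechanics, but the intuition behind such link-based ranking can be sketched in a few lines of code. The tiny “web” of pages, the damping factor and the print-out below are purely illustrative assumptions for a toy, PageRank-style calculation – a sketch of the idea, not a description of Google’s actual system:

```python
# A toy sketch of link-based ranking in the spirit of PageRank:
# pages that are linked to by other well-linked pages score higher.
# The tiny "web" below and the damping factor are illustrative
# assumptions, not anything from Google's real code.

links = {
    "poodle-care.example": ["vet-advice.example"],
    "vet-advice.example":  ["poodle-care.example", "astro-blog.example"],
    "astro-blog.example":  ["vet-advice.example"],
}

damping = 0.85
rank = {page: 1.0 / len(links) for page in links}

for _ in range(50):  # iterate until the scores settle
    new_rank = {}
    for page in links:
        # share of rank flowing in from every page that links here
        incoming = sum(
            rank[other] / len(outgoing)
            for other, outgoing in links.items()
            if page in outgoing
        )
        new_rank[page] = (1 - damping) / len(links) + damping * incoming
    rank = new_rank

for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```

In this toy graph the page that more pages point to ends up with the highest score – which is the essence of letting the Web’s own links decide what the “best” content is.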

Usenet is a discussion system that allows people to subscribe and participate in discussions on any topic you can think of – from how to take care of your pet poodle to your thoughts on astrophysics. So discussion, the sharing of information, or even just eavesdropping on “smart” people talking among themselves has been part of the “net culture”. The possibilities it offers are immense. Way back in 2000, a Russian nuclear submarine sank to the bottom of the Barents Sea in the icy cold Arctic Ocean. It took down with it the lives of 118 sailors. It is a tragic tale that became personal for me because of the possibility the Internet offered me like no other medium could. The big question troubling the world was whether the submarine went down because of a nuclear explosion inside it. I was scanning through newsgroups to see if there was any “additional” news I could find on the disaster. I was sure academics would be talking about it. Lo and behold, I found a little post by a then post-doctoral research associate from the University of Arizona saying that the submarine could not have sunk because of a nuclear accident on board. He had proof of it, based on the seismic activity around the time of the disaster. The researcher provided a link to his homepage on the university website. That was great news for the world and for me. While the world could heave a sigh of relief that the explosion wasn’t nuclear, I could get a story out on this, sitting in Bangalore, about an accident in the Barents Sea, with proof from a scientist in the US. This, nevertheless, is still Web 1.0.

Another high in my life was when I got to know that Columbia University’s new media department had a class newsgroup. The group was meant to keep the professor and the students connected even after class. So the professor used to post his thoughts and articles on new media, while students would discuss things till the class met again. I thought it wouldn’t be a bad idea to ask them if I could be part of that class. This again is from way back in the year 2000. I wrote to the good Prof. John V. Pavlik, who was then at Columbia University and had started the newsgroup. He promptly wrote back to this unknown journalist from Bangalore, India. Pavlik wanted me to contact the student who was in charge of the group and said he didn’t have a problem with me joining in. The students of the class agreed. And here I was, sitting in Bangalore and devouring things that were happening in the new media course’s NewsLab at Columbia University. This too, nevertheless, is still Web 1.0.

So newsgroups, or Usenet, were where a lot of information was being shared and heated debates conducted. With the rise and rise of the Web, Usenet’s momentum slowed down. It was left to the corporate capacity of Google to take on the responsibility of saving these decades-old conversations and making them accessible to the world. With chat, too, coming to the Web as an add-on to your free e-mail address, the use of IRC has declined into oblivion. So the Web subsumed them, and in the process transformed itself to become Web 2.0 – the second phase.

My stories above might sound passé now, given the march of the modes of communication that are all bundled under Web 2.0. It is no longer even a question to be asked: on every big incident – take the tsunamis or other natural disasters, or take the so-called Arab Spring or other political upheavals – the stamp of new media is writ large.

One can, without much difficulty, download courseware from some of the top universities in the world. One can, without much difficulty, be part of online courses and seminars that go on around the world. It has become important for news organisations to solicit and display reactions from readers and viewers via Twitter or Facebook. The Web, especially in its latest avatar as social media, has made it imperative for exclusive citadels like universities and the media to open up, or at least be seen to be doing so. So do you think the Massachusetts Institute of Technology is a great place to be? All you need is access to the iTunes U app, through which you can subscribe to the course of your choice and judge for yourself what you think of MIT’s courses.

So you think Syria’s Assad is not pounding his foes with bombs, or you think that the anti-government forces are unarmed “fighters”? All you have to do is follow a few Twitter hashtags. Hashtags are words that users create and spread around to string together tweets on a topic. As I write this, a hashtag for Girisha Hosanagara Nagarajegowda is “trending” in India, which means it is popular among Twitter users. Obviously, Nagarajegowda’s Paralympic feat has caught the imagination of people. You can rest assured that Nagarajegowda will get a good share of broadcast and print media attention, and I’d dare say it is because his name is trending on Twitter.
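
For what it is worth, the mechanics of “trending” can be imagined as little more than counting. The tweets and the pattern below are made-up illustrations – a minimal sketch, not how Twitter actually computes its trends, which are time-windowed and far more elaborate:

```python
from collections import Counter
import re

# Hypothetical sample tweets, used purely for illustration.
tweets = [
    "So proud of #GirishaNagarajegowda winning a Paralympic medal!",
    "Watching the #Paralympics highlights tonight",
    "#GirishaNagarajegowda what a high jump!",
    "News channels finally covering #GirishaNagarajegowda",
]

# A hashtag is just a word prefixed with '#'; pull every one out and count.
counts = Counter(
    tag.lower()
    for tweet in tweets
    for tag in re.findall(r"#\w+", tweet)
)

# The most frequently used hashtags are, in essence, the ones "trending".
for tag, n in counts.most_common(3):
    print(tag, n)
```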

That is the decisive change that has happened in the world of social media. It is no longer an echo chamber. It has a tangible impact on the real world. One has only to follow the Twitter accounts of journalists. One, you can get a sense of what is going to be uppermost in the nightly news; and two, you can actually have conversations with editors and reporters. Social media gives you a sense of the news trends, and it has brought ivory-tower editors to your mobile and computer screens.

Today a journalist is judged by how responsive she or he is to the issues raised by users. That should not come as too much of a surprise. In the West, especially in the US, politicians too have had the uneasy responsibility of being accessible via social media. They are being ranked on their social media prowess, and what is also being judged is how chatty corporate honchos are.

That should sound like music. With more technological footfall, more and more fortresses will become accountable: politicians, the media, corporate bigwigs – and you can add anybody else to this list.

Yes, there always is a flip side. Look at how the Obama campaign’s new media strategy has allegedly been worked out. In a sense, it exposes how vulnerable one has become to “social media” profiling.

It is assumed, from your friends list, your public postings and your profile, that if you are a supporter of the Democratic Party, your friends too are likely to be. So, with something called data mining, one is able to peruse the tonnes of data that people have willingly put out on social media sites to figure out one’s political affiliation, one’s interest in particular kinds of commercial products, and so on.
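
A crude sketch of that sort of guesswork is below. The names, the declared affiliations and the simple majority vote over a friends list are illustrative assumptions, not any campaign’s actual method, which would draw on far richer data:

```python
from collections import Counter

# Hypothetical, publicly visible data: some users declare a party
# affiliation on their profile, others do not.
declared = {
    "asha": "Democratic",
    "ravi": "Democratic",
    "john": "Republican",
}

# A friends list willingly made public by the user.
friends = {
    "maya": ["asha", "ravi", "john"],  # maya has declared nothing herself
}

def guess_affiliation(user):
    """Guess an undeclared user's leaning from what most friends declare.

    This is the crudest form of the profiling described above: whatever
    the majority of your friends have put out about themselves is
    assumed to apply to you as well.
    """
    votes = Counter(declared[f] for f in friends.get(user, []) if f in declared)
    if not votes:
        return None
    label, _count = votes.most_common(1)[0]
    return label

print(guess_affiliation("maya"))  # -> Democratic (two of her three friends)
```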

So be it political campaigns or corporate ones, the average citizen is now an average consumer. The consumer can then be fed the right kind of message. It is efficient for the campaigner and almost hypnotically acceptable to the “profile”.

There was a promise of the Internet. A promise that was built on a lot of voluntarism. The first spam, I believe, was sent out on May 3, 1978. It advertised the release of a new set of DECSYSTEM-20 computers from the now-defunct company DEC. It reached a paltry 600 people, but it was generally frowned upon by the geeky recipients too. The perpetrator was easily identified, his boss got a dressing down and the system was purged of the offending message. With increasing access, spam is now a billion-dollar problem. The perpetrators cannot be identified, and we have a thriving industry that provides anti-spam software. It would, therefore, be naive to want to go back to those days when the Internet was an effort of volunteers, colleges and government agencies. Today it has very many “stakeholders”. While we should recognise the ability of organisations to get hold of information that is personal, we should understand that the promise of this medium is still very much in our best interest.

There are of course less complicated technologies that have found greater acceptance amongst the general population. SMS, for instance, is one such form of communication. But even that is prone to gaffes. Recently a family friend ended up in an embarrassing situation. He is ruing the moment he decided to forward an SMS he received from an ex-colleague of his. The ex-colleague’s message read something like “My wife passed away. Funeral today.” The saddened friend thought it best to spread the word around by forwarding the SMS to people in his contact list, little realising that the forwarded message now erroneously announced the death of his own wife, as the message went out in his name and not his ex-colleague’s. Such embarrassments are of course redeemable. There are other recent instances of the use of “new media” that have resulted in a lot of hardship. For every episode of “illegitimate” use of communication media there are an equal, if not greater, number of examples to cite of the “boon” that new forms of communication have ushered in. The press is full of such stories, especially ones highlighting the seamier side of it.

Allow me, nevertheless, to step back a little and dwell on something that borders on the “boundaries” of new global media communication. Not long ago there was a furore in the social media world when a group of people belonging to the North East of India were not able to create accounts on a social networking site, ostensibly because their surname, Chutiya, was an objectionable word, as decided by the inputs of Indians themselves. So in its efficient wisdom, the social networking site deemed it fit to remove a group created by people belonging to the Chutiya brethren. Calling any other Indian a chutiya might result in a stern rebuke, to put it mildly, but this was a different kind of “denial of service”. It is a misunderstanding, of course, but also a logistical nightmare. Designers of websites are instructed that colour has different cultural connotations and that being sensitive to them is important. That is a difficult yet comparatively easy task. A website is typically designed for a particular milieu, but in the case of social networking sites that boast a total subscriber base far exceeding the populations of many countries put together, the nuances of what is culturally correct – where to draw the line, how much to bend – are concerns that can fall foul of “liberal” thought. There are no easy answers to this, but all the answers might not be found in the technology of the day.

Take, for instance, the exodus that followed the SMSs that many people from the North East residing in cities in South India received. Social media too was used to fuel the exodus, seen most dramatically at the railway station in Bangalore. The government countered the rumours being spread by using the same technologies and media, to little effect. People were more willing to listen to unknown rumour-mongers than to people in authority.

The biases, prejudices and divisions in society often get reflected in social media, more obviously than in traditional media. Those who choose to critique media would be better off understanding that the new digital mirror being held up to society might be looking deeper into its crevices than past media have managed to do. The solution therefore must lie in society, not in new media.

But let us be objective: these incidents expose that, like all earlier media, new media too is discriminatory. They invariably leave large swathes of the population disenfranchised. The exclusion comes in the form of economic barriers, of course, but what is often less obvious are the technological limitations that masquerade as inevitabilities.

A book, on the other hand, if it is missing from the shelves of homes, is missing quintessentially for socio-economic reasons. In an interesting report, The Magic of Reading, written for Microsoft Corporation’s research on the psychology of reading, the author Bill Hill writes: “The book is a complex and sophisticated technology for holding and capturing human attention. It is hard to convince people of its sophistication; there are no flashing lights, no knobs or levers, no lines of programming code (there really is programming going on, but not in any sense we’d recognise today…)” He reminds us that “The book as we know it today did not happen by chance. It evolved over thousands (arguably millions) of years, as a result of human physiology and the way in which we perceive the world.” Such luxuries of time and space, over which the book as a technology has evolved, are inconceivable for digital technologies. And it shows.

So what you do end up with is a device and a means by which communication, interaction, publishing and so on become easier. It is possible for a mother to click a few buttons and talk to her daughter across oceans. It is quite in the realm of the possible that even those clicks of buttons will not be required. In the meanwhile, it will take a little getting used to. Given what we gain out of it, there seems to be an eagerness to put up with such technology-related irritations. Let us hope that new communication technologies become as sophisticated as the unobtrusive book, sooner rather than later.

 

The Kannada version of this article appears in the Deepavali annual issue of Prajavani.
