A few days ago I enjoyed the first episode of a wonderful new documentary series on BBC2 called ‘Rise of the Continents’. It’s hosted by geologist Professor Iain Stewart – a highly engaging and skilful presenter. The programme sets out to explain the geological formation of Earth’s great continents from the ancient mega-continent of Pangaea, and the immense subterranean forces in the Earth’s core that drive geological change over aeons.
Geology, like other sciences, has the potential to be televisually rather dry. However, ‘Rise of the Continents’ uses clever animated graphics to add context and immediacy to the landscapes that Stewart visits. The climactic revelation of the opening programme was the presence of a massive ‘mantle plume’ beneath the Great Rift Valley in North East Africa – a phenomenon that would eventually rip the ancient landmass apart.
Mantle plumes are colossal protrusions of magma, extending from the Earth’s core towards the upper mantle and the crust. These plumes, explained Stewart, exert huge volcanic pressure at surface “hot spots”, which eventually thins and breaks the crust – tearing the continent into new land formations. The programme ended with Stewart drawing lines on a map with black marker to illustrate potential rupture points in the African landmass.
Epic stuff, and great factual TV. I have no knowledge of geology, so I watched as an enthusiastic amateur – thrilled by the grandeur, and the mind-boggling facts imparted with calm assurance by the professor.
Did I say facts? Aah…
History, when viewed closely, can look quite mundane.
Twenty years ago this week, a short document was released by CERN, the European research centre – builders of the Large Hadron Collider. It was a simple, dull document. But on page 2 there was a short paragraph that changed everything.
“CERN relinquishes all intellectual property to this code, both source and binary form and permission is granted for anyone to use, duplicate, modify and redistribute it.”
They were talking about the rights to software code that drove a new way for computers to talk to each other. They were talking about the World Wide Web.
This week I have been disturbed by two vivid memories, 23 years apart. Both memories are from sporting events that I attended. Both were memories of the crowd that attended with me. And both, in their way, concern technology.
The first memory, only weeks before, is of an athlete powering towards the finishing line to claim Olympic gold. Many in the crowd scream for joy. Many more stand mute – arms raised to faces or above heads – viewing the spectacle vicariously through the pale LCD displays of camera phones, or through the huge screens on the stadium roof. Others type furiously onto keypads. Remote cameras on high wires or on high-speed track fly after the athlete, relaying the action to countless unseen places beyond the stadium. As the athlete crosses the line, the truth of it crosses continents – the images simultaneously encoded onto the storage cards of countless individual mobile telephones. This crowd is not co-located, but attached by technological threads to a network beyond the stadium walls. The devices in the hands of these people enable testimony to leak through terraced walls like gamma radiation.
The second memory, twenty-three years before, is of a football match in Sheffield, England. I stand watching a tight row of policemen, tense and confused, who have been ordered to guard the area in which I sit, in expectation of some non-existent violence. Behind their backs, on the football pitch, a scene of genuine tragedy and horror takes place. Crushed and suffocated corpses of civilians are being assembled on the turf. A father, shocked dumb, carries his young child, limbs hanging, face bloated with asphyxiation. Denim-clad teenagers weave between the line of policemen to tear the advertising hoardings from the terrace walls to use as makeshift stretchers. The policemen, under orders, stay completely still – the desperate screams of the public they are charged to protect audible behind them. The teenagers dash back through their ranks to attempt a rescue of the still-living. The ambulances, like the policemen, do not come.
It is 1989. In the crowd, there are no digital cameras. Nobody is texting. The mobile telephone is the absent, oversized preserve of the wealthy. The crowd is locked into a purely co-located horror. The only testimony that leaks beyond these walls is through highly-regulated broadcast television, or the radio signals of the police. The CCTV camera footage in the stadium would soon, mysteriously, disappear.
These two memories diverge in one crucial respect. The first event is celebrated as a joyous, global truth. The second, until last week, officially never existed in the way that I’d witnessed it.
* * *
Technology has had a long, and sometimes stormy, relationship with truth. It embraces both the civil enlightenment of Caxton’s printing press, and the doctored photographs of Stalin. Modern mobile communications technology is both blessed and cursed for its communicative power. To some it is a democratic guardian, as demonstrated in Tahrir Square, while to others it is a brain-melting teenage curse. Yet what was striking about the mobile telephony in the hands of the Olympic crowd was the fracturing of a single event into countless channels of digital information – transmitted to a thousand places, or recorded in multiple media. The truth is not simply witnessed any more: it is encoded as deferred or mediated experience.
So why does this observation, and the two contrasting memories, cause me to lose sleep?
Because I can’t escape the thought that if the Hillsborough tragedy had occurred years later, then mobile telephony might have borne witness to a truth that the authorities would have been unable to suppress. A thousand images, texts, calls and Facebook posts from supporters inside the stadium would have scattered the message to the world before a cover-up could be concocted. Technology might have dignified the innocent and the young dead in a way that the authorities were unwilling to do. The powerful might have been persuaded of the futility of their deceit, and reminded of their duty of care. Technology might have summoned medical help. And some of the innocent might still be alive.
Or would they? Am I naïve to think that digital technology would rescue such a truth? At many large public events, it is impossible to get a decent phone signal because the cell networks are overloaded. And would information be allowed to flow so freely if the circumstances did not suit those who hold power? After all, under ‘emergency protocols’, mobile networks can quickly be switched off by security services. And might imaging devices not conceivably be confiscated to ‘help in enquiries’? In other words, is technology only as powerful as authority allows it to be?
Paranoia? Perhaps. I suppose seeing the innocent dead slandered as liars, hooligans and drunken thugs by those in power for 23 years can challenge one’s perspective on reality.
Still, I’d have given anything for a mobile phone that day. If only so I could have called home to tell my mother that her two sons were still alive. When we left the stadium, every telephone box in Sheffield was crammed with supporters trying to relate their private truth to loved ones. We eventually found a phone box on the high moors outside of Manchester. It was late. My mother was relieved beyond belief to hear us safe. Landline telephony from an earlier age had stepped in to deliver a small mercy.
Somewhere out on those same moors, in that same earlier age, a twisted couple had murdered and buried innocent children. For those victims, there had been no small mercy. Portable telephony did not exist for those vulnerable children to carry when danger threatened. No SMS could leak the truth of their horror. The suffering and the crime were co-located.
Technology has had a long, and sometimes stormy, relationship with truth. Yet technology cannot deliver truth itself, only its representation. The longest and stormiest relationship of all is surely between the truth, and those most advantaged by its distortion. It is, sadly, a dislocation older than the wheel.
I’ve just written a post for Imperial College’s Science Communication blog, Refractive Index, in which I review Steven Soderbergh’s new virus thriller Contagion, and its portrayal of medical practitioners. In particular, I discuss the power of narrative drama to deliver a meaningful science message – not something film is usually credited with doing well.
You can read the post here.
Most scientists, in my modest experience, are pretty critical about how they are portrayed in drama. With some justification. Yet they may be surprised at just how common their complaints can sound. Years spent portraying other people as an actor have taught me that few – if any – professionals in our society are happy to see themselves on screen. Scientists, policemen, lawyers, priests, journalists – you name it, most have collared me at some point to complain that a drama I appeared in bore no relation to “how things really were” in their world. This was often in spite of painstaking script research, on-set advisers and legal checks. Given the amount of criticism, it’s remarkable that drama ever manages to make anyone in society suspend disbelief long enough to tell a story.
Yet it does. Time and time again. How?
Microsoft has unveiled a new-look Windows 8 operating system at the Build developers’ conference in California. The lightweight OS sports the innovative ‘Metro’ interface familiar to users of Microsoft’s latest mobile phone software. This has been designed for a future of tablets and touch-screen interfaces, rather than traditional mouse and keyboard. In the software giant’s endless struggle between revolution and evolution, this is Castro giving Darwin a mighty slap.
Don’t get too excited just yet. This is a very early developer build. By the time Windows 8 is finally released in 2012 (2013?), we may all be using jet-packs, eating astronaut food, and browsing on our new 3D Apple Macs.
But I like Metro. It’s a fresh way to do navigation that doesn’t slavishly ape the iOS paradigm. For years I bemoaned the fact that Microsoft sat on its fat monopolist backside and talked innovation rather than actually doing it. The corporate complacency that produced the obese and sluggish Vista seemed to exemplify the company’s philosophy. Promise much – deliver little and late. Protect the core monopolies of Office and Windows and dictate a glacial pace of technology change for a whole industry.
What a difference real competition makes.
Smoking in movies subsidised by the government undermines efforts by health authorities to combat nicotine addiction in the young. That’s the verdict of new research published by Imperial College’s School of Public Health.
The paper estimates that between 2003 and 2009, British tax credits of £338 million went to US films that featured cigarette use. This is in spite of an earlier World Health Organisation study which showed that young people who are heavily exposed to tobacco smoking in films are about three times more likely to begin smoking than those less exposed.
The authors argue that such films should receive an adult rating and be denied public funding as an incentive to comply with public health initiatives.
Anti-smoking lobbyists have been strongly supportive. Martin Dockrell, Director of Research at Action on Smoking and Health, said: “The research is clear: the more a young person sees smoking in films the more likely they are to try smoking themselves.”
The release of this paper raises interesting issues for me right now, which I’m happy to share. As well as being a science communicator and technology journalist intern, I am also a professional actor. Over the summer, I’ve been filming a medical TV drama series set in 1950s London. The series is destined for peak-time transmission on British television in 2012. I play a general practitioner in an inner-city surgery.
This in itself is nothing remarkable. I sweep through the early years of the National Health Service with a smart leather bag, white coat and stethoscope. My part is structurally important, but peripheral to the main female storylines.
What is relevant is one particular character detail.
My doctor smokes like a chimney.
Last week there was distant thunder on Facebook’s blue horizon. A new social network is being trialled by Google, called Google+. For the moment it’s only available to invited test users. I’ve apparently been deemed worthy. Is this what the Rapture will be like? Fire and brimstone, or an email from Mountain View, California?
First impressions are… impressive. The interface is clean and fast, and there are small innovations that might make it worth shifting from Zuckerberg’s panopticon at some critical mass in the future. A full first-glance review by Wired’s Ryan Singel is available here. It doesn’t strike me as that different from Facebook (if you exclude the sparsely-populated feeling of a nightclub at about 7pm). Yet I feel there’s something much bigger at play here. It’s a deathmatch involving the world’s social tech giants – each leveraging the life out of their consumer data in order to seize a captive audience for their own particular ecosystem. It began with the salesman’s rhetoric of Web 2.0 articulated by Tim O’Reilly, and it ends with a Web characterised by the acreage of its walled gardens.