My friend Jessamy Carlson, an archivist and researcher at the UK’s National Archives, has just posted a fascinating blog post entitled ‘No hope without Penicillin’. Jess and I share an interest in the administration and production of penicillin in the late stages of World War 2 – which heralded the dawn of the antibiotic age of public medicine.
My interest began as a very personal one: my father’s life was saved by the use of penicillin on his wounded body in the D-Day invasion. Its early production in 1944 was confined exclusively to the war effort. Only when peace came did the general population begin to know the extent of this medical miracle – and its ultimate limits.
Jess focuses on particular UK government correspondence in those weeks leading up to the first widespread use of the drug. What’s fascinating is that the rumour machine is in full swing – the medical community knows that something big is coming, and their desperation for this new wonder drug is evident. The papers report that inquiries are flooding in from medics far and wide, begging access to this new drug for treating their desperately ill patients. In one moving telegram, a doctor pleads for the life of a child with peritonitis and septicaemia, aware that penicillin is the only thing that might save them. Yet the authorities must withhold the drug for battlefield use.
Another interesting aspect is that knowledge of penicillin’s efficacy has not yet been fully established – with some believing it might be effective in treating diseases like cancer or leukaemia. Yet for me the most arresting item is the prescient remark made by Sir Edward Mellanby (of the Medical Research Council), who wrote on 7th December 1943:
“No substance can be more easily wasted, especially if it is poured into patients by systematic treatment”
Mellanby was talking about the risk of antibiotic resistance. Even before the antibiotic age had begun, there were those who worried that we might come to regret squandering this rarest of medical miracles…
Take a look at Jess’s post here:
I was pointed to a thought-provoking article on National Geographic this morning, entitled “How Indiana Jones Actually Changed Archaeology”. The article accompanies an exhibition at the National Geographic Museum in Washington that celebrates the famous Spielberg movie franchise and – interestingly – both its contribution to, and debt to, real-world archaeology research.
Also interesting were the Twitter comments of the person who posted the link to this article. It was Professor Francesca Stavrakopoulou, Head of Theology and Religion at Exeter University in the UK, who said – “As a kid, Raiders of the Lost Ark & Temple of Doom definitely played a part in shaping my love of ancient religion.”
She wouldn’t be the first eminent academic to identify a popular cultural text that fired an early enthusiasm for their specific field. My work in Science Communication has led me to many conversations with science-fiction-inspired scientists – or medics who never missed an episode of their favourite TV hospital drama. Yet the subtle symbiosis between popular creative culture and the academic work it both inspires and feeds off is, in my view, too rarely acknowledged or accommodated.
Recently I chatted with an academic about my belief in the power of popular narrative to allow our society to ‘think out loud’ about human problems or aspirations that have an academic dimension. As I explain in this recent article for the JRSM, I believe popular cultural narratives – even highly fictionalised ones – can enable an emotional engagement with academic meanings beyond the simple presentation of academic facts. The academic smiled kindly and, with a gentle wave of one hand, suggested that popular fiction or dramatic entertainment was tolerable as a basic means to “get the public through the door” of a serious institution or museum – after which (I assumed) the proper work of real-world education could begin.
This, I felt, was not simply dismissive of the public as a sophisticated citizenry, capable of extracting valid meanings from popular cultural outputs, but also worryingly naive. It doesn’t just patronise – it fails to grasp the essential human interplay of public imagination and material reality, of emotional engagement and academic imperative, that defines the human relationship with reality.
As a society we dream, and create, and pray, and love, and pretend, and imagine, and dramatise. As the very same society we also study, and construct, and cure, and deduce, and calculate, and think. The idea that these essential human activities operate separately in our brains and in our society under a form of medieval feudalism – rationality and education the noble squire under which a serf-like popular imagination toils – is itself an imaginative fiction. Popular culture is not simply the child of an imagined public ignorance. It is, as Professor Stavrakopoulou suggests, the parent of our academic motivations.
We dream before we specify. We imagine before we do. But we do both – together and separately. We also do not ‘ascend’ from the consumption of popular cultural myth to hermitic rational wisdom. We extract insight from a shared ore of human experiences. The driest of our academic insights are distilled from the same human force that painted its hands onto the walls of the Cueva de Las Manos. That joyous bricolage of hands that the paint reveals made the first tools upon which our modern world was built.
Look at those hands. No individual statuses to be discerned amongst its subjects. Serving no clear evolutionary purpose. Inexact. Illogical. Unnecessary. Yet as immortal as the rock itself.
‘We are here. We feel. We dream.’
Perhaps the most poignant message in these walls is that the cultural distraction of these ancient people has now become the subject of serious research by our modern world’s academics. What was popular and shared has become the food for a detailed, rational analysis. Yet the hands of the experts who analyse are indistinguishable from the wild hands they study.
The following message marks a one-off break with my normal blog practice. It is not written by me, but by my young nephew, Harry – a busy student nurse in a hospital in the Midlands of the UK. Harry felt moved to pen a message on the day of voting in the UK General Election. I felt his words deserved to be seen more widely – and this was the simplest place to make that happen. I have edited and changed nothing. The words are all his.
I hope whoever emerges as the victor of today’s voting has the dignity, honour and pride to acknowledge – truthfully – what our NHS is. I hope they will endeavour to put pilfered resources back into it – and place it back into the hands of the public.
As a soon to hopefully qualify student nurse, even in my comparably short time training I’ve held a hand of life as it passed, both peacefully and tragically. I’ve watched it begin, both dramatically and beautifully. I’ve watched it break, and then helped it heal. All while under the same word: NHS. And I’m not the only one. Just on my course alone I’ve met countless incredible student nurses and training doctors – all of whom will go on to be inspiring and admirable practitioners. We all want our NHS; and we all want to fight for it. For you. But we need your help, too.
If you’re voting today, just remember that you may one day be at its doors with your loved ones and your life may well be at it’s lowest point; but we, nurses, doctors, healthcare assistants, physiotherapists, occupational therapists, cleaners, caterers, countless other healthcare workers, us fellow human beings; will do our utmost best to make things right again for you. To make things normal again. Even if ‘normality’ is not an option, we will do our best to make it as comfortable and pain free as possible.
Don’t let the NHS be turned into a business.
We are not for profit. We are for life. Human life.
On Friday I had the great and terrifying pleasure of presenting a keynote speech to the Imperial College Graduate School’s Annual Research Symposium.
I used the opportunity to share thoughts on a subject that intersects both my academic studies in science communication and my long career as an actor in TV, theatre and film. The subject was science communication in its widest public sense – in mass public culture, and through mass media.
I called the talk ‘science in the public eye’ – rather than ‘science in mass media’, ‘popular science’, or any other variation – because I believe that the concept of a mass-cultural “public eye” – somewhere where science is increasingly seen through communicators like Brian Cox or Stephen Hawking – has its own character and challenges, easily overlooked by those who wish to see science communicated most widely.
A few days ago I enjoyed the first episode of a wonderful new documentary series on BBC2 called ‘Rise of the Continents’. It’s hosted by geologist Professor Iain Stewart – a highly engaging and skilful presenter. The programme sets out to explain the geological formation of Earth’s great continents from the ancient supercontinent of Pangaea, and the immense subterranean forces in the Earth’s core that drive geological change over aeons.
Geology, like other sciences, has the potential to be televisually rather dry. However, ‘Rise of the Continents’ uses clever animated graphics to add context and immediacy to the landscapes that Stewart visits. The climactic revelation of the opening programme was the presence of a massive ‘mantle plume’ beneath the Great Rift Valley in North East Africa – a phenomenon that would eventually rip the ancient landmass apart.
Mantle plumes are colossal upwellings of hot rock, extending from near the Earth’s core towards the upper mantle and the crust. These plumes, explained Stewart, exert huge volcanic pressure at surface “hot spots” that eventually thin and break the crust – tearing the continent into new land formations. The programme ended with Stewart drawing lines on a map with a black marker to illustrate potential rupture points in the African landmass.
Epic stuff, and great factual TV. I have no knowledge of geology, so I watched as an enthusiastic amateur – thrilled by the grandeur, and the mind-boggling facts imparted with calm assurance by the professor.
Did I say facts? Aah…
History, when viewed closely, can look quite mundane.
Twenty years ago this week, a short document was released by CERN, the European particle-physics laboratory – builders of the Large Hadron Collider. It was a simple, dull document. But on page 2 there was a short paragraph that changed everything.
“CERN relinquishes all intellectual property rights to this code, both source and binary form and permission is granted for anyone to use, duplicate, modify and redistribute it.”
They were talking about the rights to software code that drove a new way for computers to talk to each other. They were talking about the World Wide Web.
This week I have been disturbed by two vivid memories, twenty-three years apart. Both memories are from sporting events that I attended. Both were memories of the crowd that attended with me. And both, in their way, concern technology.
The first memory, only weeks before, is of an athlete powering towards the finishing line to claim Olympic gold. Many in the crowd scream for joy. Many more stand mute – arms raised to faces or above heads – viewing the spectacle vicariously through the pale LCD displays of camera phones, or through the huge screens on the stadium roof. Others type furiously onto keypads. Remote cameras on high wires or on high-speed tracks fly after the athlete, relaying the action to countless unseen places beyond the stadium. As the athlete crosses the line, the truth of it crosses continents – the images simultaneously encoded onto the storage cards of countless individual mobile telephones. This crowd is not co-located, but attached by technological threads to a network beyond the stadium walls. The devices in the hands of these people enable testimony to leak through terraced walls like gamma radiation.
The second memory, twenty-three years before, is of a football match in Sheffield, England. I stand watching a tight row of policemen, tense and confused, who have been ordered to guard the area in which I sit, in expectation of some non-existent violence. Behind their backs, on the football pitch, a scene of genuine tragedy and horror takes place. Crushed and suffocated corpses of civilians are being assembled on the turf. A father, shocked dumb, carries his young child, limbs hanging, face bloated with asphyxiation. Denim-clad teenagers weave between the line of policemen to tear the advertising hoardings from the terrace walls to use as makeshift stretchers. The policemen, under orders, stay completely still – the desperate screams of the public they are charged to protect audible behind them. The teenagers dash back through their ranks to attempt a rescue of the still-living. The ambulances, like the policemen, do not come.
It is 1989. In the crowd, there are no digital cameras. Nobody is texting. The mobile telephone is the absent, oversized preserve of the wealthy. The crowd is locked into a purely co-located horror. The only testimony that leaks beyond these walls is through highly-regulated broadcast television, or the radio signals of the police. The CCTV camera footage in the stadium would soon, mysteriously, disappear.
These two memories diverge in one crucial respect. The first event is celebrated as a joyous, global truth. The second, until last week, officially never existed in the way that I’d witnessed it.
* * *
Technology has had a long, and sometimes stormy, relationship with truth. It embraces both the civil enlightenment of Caxton’s printing press, and the doctored photographs of Stalin. Modern mobile communications technology is both blessed and cursed for its communicative power. To some it is a democratic guardian, as demonstrated in Tahrir Square, while to others it is a brain-melting teenage curse. Yet what was striking about the mobile telephony in the hands of the Olympic crowd was the fracturing of a single event into countless channels of digital information – transmitted to a thousand places, or recorded in multiple media. The truth is not simply witnessed any more: it is encoded as deferred or mediated experience.
So why does this observation, and the two contrasting memories, cause me to lose sleep?
Because I can’t escape the thought that if the Hillsborough tragedy had occurred years later, then mobile telephony might have borne witness to a truth that the authorities would have been unable to suppress. A thousand images, texts, calls and Facebook posts from supporters inside the stadium would have scattered the message to the world before a cover-up could be concocted. Technology might have dignified the innocent and the young dead in a way that the authorities were unwilling to do. The powerful might have been persuaded of the futility of their deceit, and reminded of their duty of care. Technology might have summoned medical help. And some of the innocent might still be alive.
Or would they? Am I naïve to think that digital technology would rescue such a truth? In many large public events, it is impossible to receive a decent phone signal due to overcrowded cell networks. And would information be allowed to flow so freely if the circumstances did not suit those who hold power? After all, under ‘emergency protocols’, mobile networks can quickly be switched off by security services. And might imaging devices not conceivably be confiscated to ‘help in enquiries’? In other words, is technology only as powerful as authority allows it to be?
Paranoia? Perhaps. I suppose seeing the innocent dead slandered as liars, hooligans and drunken thugs by those in power for 23 years can challenge one’s perspective on reality.
Still, I’d have given anything for a mobile phone that day. If only so I could have called home to tell my mother that her two sons were still alive. When we left the stadium, every telephone box in Sheffield was crammed with supporters trying to relate their private truth to loved ones. We eventually found a phone box on the high moors outside of Manchester. It was late. My mother was relieved beyond belief to hear us safe. Landline telephony from an earlier age had stepped in to deliver a small mercy.
Somewhere out on those same moors, in that same earlier age, a twisted couple had murdered and buried innocent children. For those victims, there had been no small mercy. Portable telephony did not exist for those vulnerable children to carry when danger threatened. No SMS could leak the truth of their horror. The suffering and the crime were co-located.
Technology has had a long, and sometimes stormy, relationship with truth. Yet technology cannot deliver truth itself, only its representation. The longest and stormiest relationship of all is surely between the truth, and those most advantaged by its distortion. It is, sadly, a dislocation older than the wheel.