A strong, clear signal in a world of noise
• Part 2 of Chris’ 2016 blog is here.
14 June 2016: Microsoft to buy LinkedIn: My take
Microsoft has announced its intention to acquire the social business platform LinkedIn for $26 billion. The huge cash deal – a bubble in its own right – values LinkedIn at $196 a share, or $60 per user. Jeff Weiner will stay on board as CEO of the network, reporting to Microsoft chief Satya Nadella.
Since LinkedIn will retain its brand and independence, the sheer size of the deal suggests that Microsoft sees the promise of a financial return in each of LinkedIn’s 433 million users, despite the value of LinkedIn’s online ad revenues plummeting this year. And of course, $26 billion buys the data of 433 million people – who gain nothing from being turned into a Microsoft asset.
Worrying for Microsoft must be the fact that only one-quarter of LinkedIn’s members use it monthly. For most it is something to keep ‘topped up’, rather than a core communications tool. However, registered user numbers are rising by about 20 million a quarter – impressive – and the trend has been upward from day one.
In recent months, LinkedIn has improved its interface and expanded into a blogging platform, and offers its premium users a range of business, analytics, and networking tools. So it’s a safe bet that Microsoft plans to expand the network’s functionality still further – as it has done with its other main platforms, turning them into cloud-based application suites.
Given the prevalence of unified communications and collaboration tools within Microsoft’s services, and its shift of emphasis towards enterprise comms in the cloud, a conceivable future for LinkedIn might be as a corporate communications dashboard and collaboration tool. Certainly, the acquisition says that Microsoft’s focus is now on business users, not on consumers.
Less appealing for the average user would be a future as a noise- and lead-generator for opportunistic marketers – not to mention for the soft-selling of Microsoft products that is a constant presence on its other public-facing platforms. Certainly, LinkedIn is seen by many users as a ‘self-centred’ messaging tool, as much as it is a networking platform. Stand by for a torrent of spam.
And herein lie both the advantage and the risk to the world’s largest software company, and to LinkedIn: Redmond’s deep-rooted tendencies to over-engineer solutions, to replace simplicity with clutter and unwanted functions, and to mistake its own (often clumsy) desire for business advantage for a human-friendly interface.
Goodwill is core to a social network, and Microsoft is very bad at generating it: witness the nagware pushing people towards Windows 10, for example. Online, the only people pushier than Microsoft are spammers and SEO consultants.
Indeed, spending $26 billion represents a special kind of pushiness: it’s money that shouts: “We’re still here. We’re still relevant, even though we’re nowhere in mobile or the consumer space. We can spend $26 billion on buying someone else’s idea and customers.”
But under Nadella, there has at least been a recognition that the “more is more, and louder is better” approach of his predecessor no longer works. Sometimes ‘less’ really is the answer, and complexity is best hidden – especially in a world in which users like to feel in control of their own destinies and technologies.
That is certainly the case on LinkedIn, where users manage their own careers, too. So let’s hope that ‘simple, focused, open, and extensible’ remain the watchwords for LinkedIn, rather than ‘complex, broad, closed, and pushy’.
9 June 2016: Surveillance review “a stitch up”, says MP
The government has been slammed by an MP over its dogmatic attitude to the revised Investigatory Powers Bill, which incorporates the so-called Snooper’s Charter.
The Greens’ Caroline Lucas, one of the few MPs to oppose the Bill, claims that while Home Secretary Theresa May has listened to objections from the House of Commons, civil liberties campaigners, and the IT industry itself, she has taken little action in answer to the criticisms.
She told me, “Despite claims from the Home Secretary that the government has listened and learned, its latest proposals do not offer proper protection against unwarranted surveillance.
“The proposed blanket collection and retention of personal data is a breach of fundamental rights; there’s insufficient clarity about the role of the Investigatory Powers Commissioner; and any legislation of this nature should include regular evaluation, including to ensure it reflects the most up-to-date evidence.”
The government has announced a review of its proposed powers, but Lucas has added to the mounting criticism of how that review will be implemented. She said, “Appointing a former Director of Technology and Engineering at GCHQ to be one of the three people looking into our surveillance laws doesn’t fill me with confidence. On the contrary, it suggests the whole thing is likely to be a stitch up.
“It’s encouraging that the Home Secretary has recognised the problems with bulk retention powers, but I am not wholly persuaded that the announced review, reporting in the summer, will address the important questions that need to be answered.
“When governments want to undertake large-scale infringement of the individual’s right to privacy – for example, by allowing records of all our communications to be kept as a matter of course – we need effective scrutiny more than ever.”
She added: “If this review is genuinely aimed at enabling such scrutiny, and allowing the development of an evidence-based approach to surveillance, the Home Secretary needs to guarantee that the process will be fair – and she really ought to be halting the further progress of the Investigatory Powers Bill until we know the outcome of her review.”
The revised Investigatory Powers Bill is flawed and misguided, will have the reverse effect of its claimed purpose, and is driven by an apparent belief that context-free data can reveal criminal intent [see this standalone report for more on this]. Significantly, many technology and communications companies oppose the plans, not least because they will be pushed into the frontline of enforcement, putting their own customer relationships and data protection policies at risk.
As separate private enterprises, IT suppliers’ responses will inevitably be piecemeal and inconsistent, while the recent hack of TalkTalk, which compromised the data of thousands of accounts, demonstrated that many service providers are ill-prepared for the task. If the Bill becomes law, it will be open season on citizens’ browser histories and private communications – an invitation, in effect, to a rise in cybercrime.
Moreover, if the proposed transatlantic trade deal, TTIP, is approved, it will give any transnational corporation the power to sue the UK government if its business, reputation, customers, or products are in any way compromised by the Bill’s technical requirements and obligations.
Multibillion-dollar lawsuits are a real possibility. Apple’s tussles with the FBI and Microsoft’s recent decision to sue the US government over private data requests are evidence that IT suppliers will defend their products, services, and business models against state intrusion. Earlier this year, a number of major IT companies formed a coalition to fight government surveillance.
While tech companies have little choice but to pay lip service to Whitehall’s plans, many are now actively pushing their customers towards the use of encrypted communications in order to strengthen their privacy and data security. This alone suggests that the Bill may end up being an expensive waste of time. The estimated cost has already soared from £150 million to over £1 billion. The final cost could be ten times higher than that.
A potential Brexit adds another dimension to the debate: if the UK leaves the European Union, many citizen protections and rights will be stripped away, along with essential judicial oversight.
With the EU’s incoming General Data Protection Regulation, which is designed to strengthen and unify data protection across the continent, a Brexit could leave the entire UK data protection environment in turmoil. Factor in the British government’s surveillance powers coming into effect soon after, and it is hard to see why anyone would see the UK as a safe place for digital business.
MPs had another chance to vote on the Investigatory Powers Bill earlier this week, and the Greens’ Lucas was – again – one of only a handful of MPs to oppose it. “I was deeply disappointed that Labour MPs abstained en bloc and did not join me in opposing the government’s plans to snoop on you,” she said.
“Earlier this week, I also backed a number of amendments to try to improve the legislation if it does pass – which, sadly, is very likely, given only a small number of MPs have been prepared to speak out in opposition.” She has previously said that it is already too late to stop the Bill from becoming law.
One amendment won by opposition parties this week was to exempt people from any surveillance caused by their membership of a trade union, but the fact that this had to be won as an exemption is surely evidence of legislation that could be highly political in purpose and execution.
History has proved time and time again that surveillance powers in the control of dogmatic, ideology-driven governments will always be used for political ends. This plus a jingoistic, nationalist-driven Brexit represent dark days indeed for the UK.
The Bill is now being considered by the House of Lords.
23 May 2016: Media Training and Public Speaking Skills
From the end of June, I’ll be co-running a series of two-day courses that redefine media training. For years, identikit media training courses have presented the same basic set of ideas and skills. They’re great, but miss an entire skill set: presentation and public-speaking skills, and the ways in which media training and public speaking merge in a world of TED Talks, conferences, events, social media, blogs, viral videos, and more.
This new set of courses will be co-hosted by myself (media training for the digital age) and by a Guildhall-trained actor and trainer, who will focus on public speaking, presentation, and stage skills in a linked, two-day course that aims to make you the best that you can be, regardless of the skills that you believe you have in terms of stage craft and public speaking.
Remember: all the world’s your stage.
9 May 2016: IBM puts quantum computing in the cloud
But is the future on, off, or both at the same time?
Enterprise behemoth IBM stepped into the quantum age last week by making a quantum processor available in the cloud as an on-demand service for experimental applications.
The ‘IBM Quantum Experience’ combines the new machine with a dynamic user interface in the cloud, demonstrating that quantum computing is no longer a theoretical concept, but a working technology with a roadmap for scalable development in the decades ahead.
The processor itself is housed at IBM’s TJ Watson Research Center in New York, where it is kept in a highly controlled, supercooled environment. This is because quantum systems and quantum information are extremely sensitive and error-prone, and can be affected by vibration, radiation (including heat), and other local interference.
So how does it work? Here’s the science bit.
Bits of the truth
The processor comprises five superconducting quantum bits (qubits). Qubit-based computing is a radically different concept from classical, silicon-based systems, in which a switch (transistor) is either on or off. By definition, there’s no state between 1 and 0 in a binary system, but a qubit can exist in a blend – a superposition – of both states at once, according to Dr Stefan Fillip, quantum researcher at IBM Research in Zurich.
Such processors also exploit the principle of quantum entanglement, in which measuring one of a pair, or group, of subatomic particles instantly determines the state of the other(s), meaning that the particles behave as a single integrated system.
The higher the number of quantum bits, the higher the number of simultaneous calculations that can be made by the processor – a form of highly efficient parallel computing.
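The correlations that entanglement produces can be mimicked on an ordinary computer for a small system. Here is a minimal classical sketch in Python (not IBM’s actual stack, purely an illustration) of measuring an entangled two-qubit ‘Bell’ state: each qubit reads out randomly, yet the two readings always agree.

```python
import random

# Amplitudes of the Bell state (|00> + |11>)/sqrt(2),
# indexed over the four basis states 00, 01, 10, 11.
BELL = [2 ** -0.5, 0.0, 0.0, 2 ** -0.5]

def measure_pair(amps):
    """Sample one joint measurement; each outcome's probability
    is its squared amplitude."""
    probs = [a * a for a in amps]
    r = random.random()
    cumulative = 0.0
    for index, p in enumerate(probs):
        cumulative += p
        if r < cumulative:
            return index >> 1, index & 1  # (first qubit, second qubit)
    return 1, 1  # guard against floating-point rounding

# The two qubits are individually random, but always correlated:
outcomes = [measure_pair(BELL) for _ in range(1000)]
assert all(q0 == q1 for q0, q1 in outcomes)
```

Note that the sketch only reproduces the statistics of one entangled state; it is the exponential cost of tracking such amplitudes for many qubits that classical machines cannot escape.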
IBM’s Fillip explains that the five-qubit machine is a small but significant milestone on the road towards creating a universal quantum computer. He says: “Yes, it is a true quantum computer, but a very small one. Working with superconducting qubits […] gives us the opportunity to play around and test quantum mechanics and what you can compute with the technology. It’s a stepping stone towards a large quantum computer.”
But what is a quantum computer for? Says Fillip: “The application is probably not geared towards personal use… [but] what IBM is doing is fostering more applications for the technology. Making this publicly available will help to answer the one big question of what it is for and develop the field of quantum computing.”
In a sense, therefore, IBM is outsourcing and open-sourcing the questions ‘why?’ and ‘what do we do next?’ But there’s a caveat, explains Fillip: the processes of a five-qubit quantum machine can be simulated on a classical binary device, so “there’s no claim that there’s a ‘quantum advantage’ as yet,” he says.
However, in five to 10 years’ time, IBM predicts that 50-100 qubit systems will be up and running. Much beyond that, and quantum processors would pass the point at which binary systems could keep up with them without access to impossibly vast memory resources.
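The memory wall is easy to estimate: simulating n qubits classically means storing 2^n complex amplitudes. A back-of-envelope sketch (assuming 16 bytes per double-precision complex number) shows why IBM’s five-qubit machine is trivially simulable while a 50-qubit one is not:

```python
def statevector_bytes(n_qubits, bytes_per_amplitude=16):
    """Memory needed to hold the full state vector of n qubits:
    2**n complex amplitudes at 16 bytes each (double precision)."""
    return (2 ** n_qubits) * bytes_per_amplitude

# 5 qubits (IBM's current machine): 512 bytes - trivial.
print(statevector_bytes(5))   # 512
# 50 qubits: ~18 petabytes - beyond any classical memory today.
print(statevector_bytes(50))  # 18014398509481984
```

Every additional qubit doubles the requirement, which is why the crossover IBM describes arrives somewhere in the 50-qubit range rather than gradually.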
So the Moore’s Law-defined era of binary processors may be drawing to a close, says IBM. But is it, given that the nature of the computations would seem to change with the nature of the computer? Do quantum computers have day-to-day or enterprise applications?
The long-term answer is yes: a core advantage of quantum processors is their ability to analyse large amounts of unstructured data. Quantum computing and big data (or the types of supermassive data that will be gathered by the SKA Programme – see blog entry below – and similar projects), would seem to be made for each other.
And arguably, the less structured data becomes, the less likely that research will be prone to confirmation bias, opening up ever more esoteric areas.
Other real-world applications are fast emerging. Increasingly, R&D is taking place at subatomic (quantum) level across many different industries. For example: the search for, and manufacture of, new materials with extraordinary properties, such as super strength, resistance, or superconductivity, and the quest for new medicines.
Fillip says: “Nature is quantum, in that you have to deal with the laws of quantum mechanics to describe nature and to simulate what happens in nature, such as whenever you want to understand or develop new types of material or processes, or chemicals and drug designs. Classical computers are not as good at simulating the natural world.”
Another application is security, says Fillip. In so-called ‘blind quantum computing’ the computer has no ‘idea’ of what type of data it is processing, unlike in binary systems, in which data are typically described.
However, given the extreme sensitivity of quantum systems, it’s highly unlikely that there will be a quantum computer on your desk in the next decade. But Fillip adds: “In the long term, there is no known obstacle to making them more compact and personal. We are at the beginning of this stage.”
To the future…
Even quantum physicists don’t fully understand the strange and counter-intuitive world of quantum mechanics, in which single particles move like waves or communicate with each other instantaneously (in apparent defiance of Einstein’s universal speed limit). But the scientists know one thing: quantum mechanics works, and some of the most bizarre theories have been tested under lab conditions.
While the comparatively slow speed of cloud communications would seem to strip away at least one benefit of quantum computing, the advantage of ‘qubits in the cloud’ is the unique processing power of those quantum states. Over the next 10 to 20 years, that potential will become increasingly obvious, even as the technology becomes more accessible.
So our on/off romance with silicon may soon be replaced with a younger, less predictable model: a force of nature, no less, and one that’s super small and supercool.
• An earlier version of this article was first published on Diginomica last week.
27 April 2016: The biggest Big Data project in the universe
Fifty-five years ago, cosmonaut Yuri Gagarin became the first man in space, four years after the launch of Sputnik. Since the dawn of the Space Age (and the first radio telescopes in the 1930s) we’ve gathered more data about the universe than in all the rest of human history, via technology in space and on the ground.
Cosmological data is Big Data; the biggest there is. And one organisation knows more about planning for Big Data and how to process it when it arrives than any other enterprise on the planet. Indeed, we may need to coin a more appropriate phrase for what it gathers in the decades to come: supermassive data. [See below for some astonishing facts and figures.]
The Square Kilometre Array (SKA) is the biggest science project on or off Earth: the building over the next two decades of a series of giant radio telescopes in remote parts of Australia and southern Africa, to create a globe-spanning dish (in effect) with a surface area over 200 times larger than the Lovell Telescope at Jodrell Bank.
This UK-centred international programme – headquartered at Jodrell Bank – is designed to understand aspects of fundamental physics on a universal scale, such as gravity and magnetism, all the way out to more traditional astronomy topics, such as supermassive black holes, the origins and evolution of the universe, dark matter, and the dark energy that scientists believe is driving the universe’s accelerating expansion.
A universal time machine
Professor Phil Diamond is Director General of the SKA Organisation. He says: “In many ways, you can think of the SKA as a time machine, as we’ll be able to look back in time and make movies of the evolving universe. We’ve recently published our science case. It comes in two volumes, totalling 2,000 pages, and when dropped on a minister’s desk, the nine kilograms make a resounding thump – which is the principal aim of the printed copy!
“I haven’t mentioned SETI [the Search for Extraterrestrial Intelligence], but we will be the ultimate SETI machine, too. It’s not one of our main aims, it will be a byproduct, but if we do detect that little signal then I think that would address some of the funding issues we might have.”
The UK has committed £200 million to the SKA to date, and the Australian government A$300 million, but over the next few years the project will need billions of dollars of investment, the case for which the organisation is building today. Currently, it is a not-for-profit UK company, limited by guarantee, but it will eventually become a treaty organisation and intergovernmental project, similar to CERN.
The cosmos is unimaginably vast: light travels at 671 million miles an hour, yet the edge of the currently observable universe lies some 46.5 billion light-years away – and the universe is constantly expanding, its multitude of galaxies rushing away from each other. In it, there are more stars than there are grains of sand on all the beaches in the world, but the cosmos is so vast that the light from most of them has not had time to reach us during the universe’s 13.7-billion-year history. This is why the sky at night is dark, rather than filled with constant light.
Using the most common element, neutral hydrogen, as a tracer, the SKA will be able to follow the evolution of the universe all the way back to the cosmic dawn, a few hundred million years after the Big Bang itself.
But over billions of years the wavelength of those ancient hydrogen signatures becomes stretched by cosmological redshift, until it falls into the same range as the terrestrial radiation emitted by mobile phones, aircraft, FM radio, and digital TV. This is why the SKA arrays are being built in remote, sparsely populated regions, says Diamond.
“The aim is to get away from people. It’s not because we’re antisocial – although some of my colleagues probably are a little – but because we need to get away from radio interference, phones, microwaves, and so on, which are like shining a torch in the business end of an optical telescope.”
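The stretching is easy to quantify. Neutral hydrogen emits at a rest frequency of about 1420 MHz (the famous 21cm line); after redshift z, it arrives at 1420/(1+z) MHz. A quick sketch (illustrative numbers, not an SKA calculation) shows why signals from the cosmic dawn collide with FM radio:

```python
REST_FREQ_MHZ = 1420.4  # 21cm neutral hydrogen line

def observed_freq_mhz(z):
    """Observed frequency of the hydrogen line after redshift z."""
    return REST_FREQ_MHZ / (1 + z)

# Around redshift 13 - roughly the cosmic dawn - the line lands
# squarely in the FM radio band (87.5-108 MHz):
print(round(observed_freq_mhz(13), 1))  # 101.5
```

A local FM transmitter is therefore indistinguishable, in frequency, from the oldest hydrogen in the universe – hence the empty deserts.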
Eventually there will be two SKA radio telescopes. The first, consisting of 130,000 two-metre dipole low-frequency antennae, is being built in the Shire of Murchison, a remote region about 800km north of Perth, Australia – an area the size of the Netherlands, but which has a population of less than 100 people. Construction kicks off in 2018. In the long term, there may be up to a million of these smaller antennae.
In Phase 2, says Diamond, the SKA will consist of half a million low- and mid-frequency antennae, with arrays spread right across southern Africa as well as Australia, stretching all the way up from South Africa to Kenya – a multibillion-euro project on an engineering scale similar to the Large Hadron Collider.
The supermassive data challenge
Which brings us to that supermassive data challenge for what, ultimately, will be an ICT-driven facility.
Diamond says: “The antennae will generate enormous volumes of data. Even by the mid-2020s [Phase 1 of the project], we will be looking at 5,000 petabytes – five exabytes – a day of raw data. This will go into huge banks of digital signal processors, which we’re in the process of designing now, and then into high-performance computers and an archive for scientists worldwide to access. Our archive growth rate will be somewhere between 300 and 500 petabytes a year: that’s science-quality data coming out of the supercomputer.”
But those volumes are only for SKA Phase 1, says Diamond. “For the full SKA, the figures will go up by a factor of 100. But that’s in the 2030s. We’re designing now for the 2020s, but in the following decade, the data problem will be much worse.
“To put this in perspective, worldwide annual Google searches generate about 100 petabytes of data. Facebook is about twice that. Global business emails generate about 3,000 petabytes of data. But the raw data from SKA Mid, we estimate, will be 62 exabytes (62,000 petabytes).
“We will have to design equipment to handle something that’s 20 times larger than global email traffic. Total global internet traffic is one zettabyte. Ultimately [when the full SKA is up and running, and in the long term], we’ll have five zettabytes within our internal systems alone.”
All of this means that each stage of SKA will need supercomputers that don’t even exist yet – “with a speed of approximately 300 petaflops,” according to Diamond. The fastest supercomputer in the world is currently China’s Tianhe-2, which runs at 33.86 petaflops, so the SKA will need access to a computer that processes data almost 10 times quicker than the fastest machine on Earth today.
But none of this bothers Diamond: “The IBMs and Intels of this world tell us that this is entirely within their forecast capability. In fact, I’m pretty sure that the NSA already has something a little faster, but they won’t tell us.”
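Diamond’s figures are easier to digest with the units written out. A quick sanity-check of the quoted numbers (decimal units assumed throughout):

```python
PB = 10 ** 15  # petabyte, in bytes
EB = 10 ** 18  # exabyte
ZB = 10 ** 21  # zettabyte
assert ZB == 1000 * EB  # "one zettabyte" of global internet traffic

phase1_raw_per_day = 5000 * PB       # Phase 1 raw data per day
assert phase1_raw_per_day == 5 * EB  # "five exabytes a day"

ska_mid_raw = 62 * EB                # quoted SKA Mid raw estimate
global_email = 3000 * PB             # quoted annual global email traffic
print(round(ska_mid_raw / global_email))  # ~21: "20 times larger"

# And the compute gap: a ~300-petaflop requirement against
# Tianhe-2's 33.86 petaflops is a factor of roughly nine.
print(round(300 / 33.86, 1))  # 8.9
```

The quoted ratios hold up well, which is reassuring for numbers of this size.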
The four Vs
And as with all big data, the supermassive SKA data will be defined not only by its volume, velocity, and variety, but also by an all-important fourth ‘V’: value.
Diamond explains: “What we then have to do to these enormous volumes of raw data is detect and amplify them [extract the signal from the noise], digitise them and line them up, correlate them and integrate them, process them, and then create sky images, which the scientists will use. The SKA will be providing science-ready data products, calibrated and quality controlled.
“Traditional radio astronomy goes through this process many times, but we will only be able to do it once. We won’t be able to store all the raw data, it’s a one-pass system. So we have to understand our systematics much better than any existing facility on Earth.
“For us, the main principles are scalability, affordability, and maintainability, but we also have to maintain innovation. We have bright people throughout the world developing the algorithms to process this data, but we’ve got to be able to replace them too, straightforwardly, as new ideas emerge.”
The new space age
In many ways, it’s clear that the golden age of space exploration wasn’t the 1960s, it’s today. Over the next few years, four great observatories for the 21st Century will add to our knowledge of the universe. The Atacama Large Millimetre Array – 66 antennae positioned at high altitude to get above most of the water vapour in the Earth’s atmosphere – is already in operation in the Chilean High Andes. Soon it will be joined by the European Extremely Large Telescope, an optical telescope with a 39-metre primary mirror that’s currently under construction nearby; the James Webb Space Telescope, due for launch in 2018 as the long-term replacement for Hubble; and the SKA.
That means four supermassive data projects – and the European Space Agency’s Euclid programme will be a fifth. Euclid aims to map the entire dark (unseen) universe, as opposed to the patch of sky currently visible to Hubble, which in terrestrial terms is about the size of a fingernail held at arm’s length.
In itself, the SKA is an awe-inspiring project, and Diamond hopes that the intergovernmental organisation to drive and develop it to its full potential will be in place by 2017.
So let’s hope that local politics don’t derail this extraordinary international effort. Might the UK’s potential exit from the European Union imperil this and other ‘big science’ programmes? And has the SKA Organisation considered deeply enough the physical (as opposed to data) security of its continent-spanning arrays?
In the end, much will come down to how important people consider such scientific programmes to be, and what their long-term terrestrial applications might be.
As a species, we should appreciate space exploration much more than we do. As Canadian astronaut Dr Robert Thirsk pointed out at the Space Innovation Congress in London in April 2016 [day 2 of which was co-hosted by myself], the space programme didn’t give us teflon and velcro, as many people believe, but it did give us some things that are much more important…
Thirsk said that he wished there was a button that could switch off all of the applications we get from space, just for a few seconds, so that we could all see their benefits. He explained: “We would lose long-distance voice and data communications, GPS, weather forecasting, disaster management, management of our natural resources, telemedicine, search and rescue, satellite communications, and even ATMs, which rely on GPS data.”
So let’s hope that the SKA succeeds and reaches its full potential. The local benefits alone could be enormous: the biggest amount of data ever gathered and processed, passing through the UK and managed by a UK team for the benefit of all mankind, unlocking the secrets of matter – and, potentially, antimatter? Let’s hope we can all see that big picture.
• Professor Diamond was speaking at the Space Innovation Congress in London, 7-8 April, 2016. Chris Middleton was co-host of day two of the event.
• A version of this article was first published on Diginomica, immediately after the conference.
21 April 2016: Victoria Wood
Even in a year of unexpected deaths, Victoria Wood’s came out of nowhere: sudden, impossible, somehow deeply unfair.
She was one of those vanishingly rare people who created a whole world, instantly recognisable but inimitable, even down to a turn of phrase (“If something went wrong ‘down below’, you kept your gob shut and turned up the wireless”). Our finest, foremost comedienne, a brilliant sketch writer, playwright, performer, actress, pianist, composer, songwriter, lyricist… the list goes on.
She was a great observer of the North/South divide (“We’d like to apologise to viewers in the North. It must be awful for them.”), but most of all she was an unparalleled writer for, and about, women. Her focus on the undocumented lives of working-class and middle-class women has no equal, and it is hard to imagine who might take her place.
But while we think of her comedy as being as warm as a winceyette nightie, and laugh at the cross-channel swimmer whose parents go to London to see a show, or the couple wanting a test-tube baby because they live in a maisonette, there was often a tremendous loneliness in her characters, a truth about the unobserved lives of millions who are battling on and making do. Yet there was also a sense that she loved every one of her creations, and was celebrating, rather than satirising, their lives.
“Good morning, Miss Jones? And how are we feeling?”
“Not too bad, apart from the agonising pain.”
15 April 2016: Why cut-and-paste journalism lets us all down
Many of us saw the story this morning about a man who permanently deleted all the data from his hosting business – and his customers’ businesses – with an errant line of server code. “Man deletes entire company!” shrieked the headlines. Even the backups were gone, they claimed, it was an insoluble problem!
The story was picked up by major news outlets across the globe, in each case using near-identical text – evidence that the story had been largely cut-and-pasted from whichever one ran it first.
Reports about reports about reports, all disseminated as bylined news at near-lightspeed: how very 2016. But it took me two minutes this morning – before I even got out of bed – to establish that it was nonsense, a non-story at best, simply by looking at the original tech forum that caused all the fuss. [The thread has since been deleted by its host, Serverfault.com.]
First, the non-story. On the tech support thread in question, the original poster (OP) later added: “Luckily we recovered almost all data!”, something not mentioned in any of the reports. It’s a safe bet that none of the journalists checked the source, or wanted to undermine a good yarn with some inconvenient details.
But given the claimed seriousness of the disaster – described in apocalyptic terms by expert respondents worldwide (it’ll take weeks to fix!, you’ll need a specialist data recovery company!, you’ve killed your business!, find a lawyer!, and so on) – the fact that he recovered so quickly proves one of two things: that it was a minor glitch, or that this ‘unfixable problem’ never occurred in the first place.
Indeed, some respondents suggested that the claimed mistake would have been all but impossible for a hosting company or programmer to make, while others suspected that the OP was simply an attention-seeking troll – the most likely explanation. As a coder friend put it to me this morning, “I assumed it was a Daily Mash article, because it was so obviously bollocks.”
Or perhaps the OP was a clever hacker seeking info about the failsafes on a specific technology set? Who knows? Certainly none of the reporters. As with most such forums, the OP neither identified himself, nor his company.
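For what it’s worth, the class of bug the reports described – a delete command pointed at a path assembled from variables that turn out to be empty – is real enough, even if this instance probably wasn’t. A hypothetical Python guard (the function name and logic are invented for illustration, not taken from the story) shows the kind of defensive check that prevents it:

```python
def cleanup_path(root, subdir):
    """Build a deletion target, refusing any path that collapses
    to the filesystem root when a variable is empty or unset."""
    path = f"{root}/{subdir}"
    if path.strip("/") == "":
        # Empty root and subdir would yield "/" - i.e. everything.
        raise ValueError("refusing to target filesystem root")
    return path

print(cleanup_path("/srv/backups", "old"))  # /srv/backups/old
```

Any competent hosting operation has guards like this (plus offline backups), which is one more reason to doubt the “entire company deleted” version of events.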
So take a bow, The Telegraph, The Independent, The Express (well…), The Times of India, Trusted Reviews (!), International Business Times, The New York Daily News, The Metro, and the many others that ran the story – cut and pasted, stated as fact, and yet ultimately sourced from some anonymous pixels via whichever journalist spotted them first.
If journalists never look beyond page one of Google or ‘below the fold’ on their screens, they’re not in the information business; they’re random noise generators. Because if they can’t even spend two minutes standing up or knocking down the simplest details of a story, then how can they be relied on to report anything complicated – like Panamanian tax affairs, the Brexit, politicians’ views on Israel, or the standoff between Russia, the US and Europe?
Russian planes buzz a US warship! Yes… less than 70km from a Russian naval base. Jeremy Corbyn has taken millions from the taxpayer! That’s right: if you add up his salary over 33 years, like any public servant of his age on a middling-to-good income. It’s a world of garbage, the journalistic equivalent of grey goo.
But what can we learn from this? That if one outlet spots a story – in a world where information is valued by its speed, not its veracity – then others will copy it without checking the source? Big deal.
Well, yes. Because while, individually, each example might be trivial, collectively they create a serious long-term problem: a world of misleading surface, defined by information’s speed and volume, in which most people never check or question the ‘facts’.
Put another way, these are the same networked Chinese whispers that gave us the Millennium Bug and other non-stories, and which gifted some portals their entire business models of noise, clickbait, and cats.
That culture is now turning the internet into a billion channels of static – a tragedy, given the World Wide Web’s beginnings in academic peer review. Noise sells, so make more noise! It’s an outlook that’s turning mainstream public debate into a shouting match of gibberish.
We have public servants who routinely lie in Parliament, because they know their statements will be reported as fact by a compliant and complicit press. At its extreme, you can blame the phenomenon for Donald Trump’s presidential campaign: the ultimate noise-generator that millions listen to, but few are prepared to question in the US. The Brexit campaign? Yet another charade in the ‘post truth’ world in which we find ourselves.
Page views are everything, says conventional wisdom. But since when is wisdom conventional? Why join a race to the bottom? Why jump off the cliff of mankind’s accumulated wisdom into an ocean of stupidity?
As more and more respected titles fire their experts and pay graduates to tweet press releases* and ‘repurpose vanilla copy’ (yes, that’s a thing), such a world is a liar’s dream, because that first page hit, that first response, those all-important eyeballs, are the only things that count.
Say anything, say it quick, say it loud, and move on. That way you’ll really stand for something! It just doesn’t stack up when you put it like that, does it?
*: Press releases are not intrinsically bad. They remain a standard means of telling journalists that something has happened – and perhaps one in 100 is useful, especially to trade titles. But it’s best practice to move the story on through first-hand research, commentary, analysis, and context-creating links to related articles. (‘Organisation X has released details of Y, explaining a, b, and c. So what are the implications? And how does this link to Z?’) However, most broadsheets now rehash Press Association stories almost verbatim, while many B2B portals reproduce corporate press releases without comment, question, or analysis in bylined articles. That’s what happens when speed and clicks are given primacy over depth, ‘voice’, and standards. But if you really have no choice but to publish a press release, then tell people it’s a press release. ‘A has released B. “We’re very excited about it,” said CEO X in a written statement from the company this morning.’ Where’s the harm in that?
4 April 2016: Space Innovation Congress, London, 7-8 April
A new conference takes place in London on 7-8 April: the Space Innovation Congress, which will be hosted at 200 Aldersgate, near St Paul’s.
The event investigates and celebrates how space technologies, space exploration, and pushing the limits of human experience, endeavour, and endurance are benefiting mankind here on earth.
The Congress brings together many of the world’s leading space experts, including scientists, technologists, and representatives of NASA and the Space Agencies of the UK, Europe, Canada, and France – including several astronauts and the innovators behind some specialist startups.
I’m delighted to be playing a small part in several of the sessions. On Day 2 in the morning, I’ll chair the Big Data thread, including presentations on the Copernicus and SKA programmes – the latter of which, when it goes live, will gather more data than the entire internet combined.
Then, in the afternoon, I’ll be hosting the second half of the ‘Space Collaboration and Facilitating Life on Earth’ track, which culminates in perhaps the highest-profile event of the two-day conference: ‘the Astronauts panel’, which will focus on the lessons that space explorers have learned from zero gravity.
The panel will feature:
NASA astronaut Dr. Kjell Norwood Lindgren, who launched to the International Space Station (ISS) on July 22, 2015, and returned in December after a 141-day mission. He’s also spent more than 15 hours walking in space.
Jean-François Clervoy, a member of the ESA’s Astronaut Corps, based at the European Astronaut Centre in Cologne. Jean-François flew twice on the Space Shuttle Atlantis and once on Discovery, spending a total of 675 hours in space.
Dr. Robert Thirsk, the first Canadian astronaut to make a long-duration spaceflight. Over two missions, he spent nearly 205 days in space, 188 of which were on the ISS.
Libby Jackson, the Astronaut Flight Education Programme Manager for the UK Space Agency. Among other things, she’s responsible for helping everyone in Britain to find out about Tim Peake’s time on the ISS.
And Karin Nilsdotter, a future astronaut, CEO of Spaceport Sweden, and one of the most inspiring and influential women in science today.
The organisers are also talking to a bona fide space legend.
It’s hoped that Buzz Aldrin, second man on the moon and ‘the grand old man of space’, will appear via a US link-up before or after the astronauts’ panel.
Buzz is currently deep into a US tour for his latest book, but he’s determined to have a platform at the event. If it’s at all possible for him to speak to delegates, the small matter of being in the US on a packed touring schedule will be no obstacle for a man who travelled 240,000 miles to land on the moon.
*: Disclosure. I am co-hosting Day 2 of the event, but have no financial stake in the venture, and have offered my services free to help get it off the ground.
30 March 2016: Why surveillance + TTIP = sleepwalking to litigation
The UK’s national surveillance plan has the potential to be a Whitehall IT disaster to rival the NHS National Programme for IT, Universal Credit, and other botched, late, and vastly over-budget schemes.
The likely cost of the programme has already soared from initial estimates of £150 million to over £1 billion. Add another zero to that, and you’re probably closer to the truth in the long term.
For one thing, there are the potential legal ramifications. Litigation is a very real prospect if companies are forced to weaken their products and any of them can then prove that this led to security breaches, loss of trade, or financial damage to them or their customers.
The government may be making a rod for its own back: the secretive EU/US trade agreement, TTIP (designed to “liberalise one-third of global trade”), is widely believed to hand greater powers to transnational enterprises to sue any governments that restrain trade or lose them money, while sweeping aside some market protections and regulations.
Put another way, it appears to be a wholesale transfer of power to global enterprises. And TTIP is supported by a British government that claims to oppose any loss of national sovereignty that comes from being an EU member! Staggering.
In this context, a clumsy, ill-advised national surveillance programme is akin to sleepwalking into a multibillion-dollar legal minefield. If any company’s business, security, products, or reputation are in any way compromised by the government’s surveillance scheme, then TTIP may give them the powers to sue the UK.
That’s a truly bizarre state of affairs, and one that suggests a lack of joined-up thinking at Cabinet level, where the same ministers have pushed for both the surveillance powers and the new trade agreement. But at least we’ll know who to blame.
Anyone thinking that technology companies would refrain from suing governments, take note: Microsoft – often seen as the most compliant provider in terms of government requests – has filed a lawsuit against the US Department of Justice as a direct result of snooping requests, which the company believes violate the US Constitution.
In the UK, Whitehall has a lamentable history of IT programme disasters that have resulted in prolonged legal action. Factor in TTIP to the government’s surveillance plans – proposals that may force multibillion-dollar suppliers to weaken data security technologies and standards – and the financial risk becomes vast.
A vicious circle
It should be stressed that there are a great many respected IT experts in the public sector overall and many stories of exemplary innovation. However, all central-government IT disasters have shared the same elements over the years: ideology overcoming common sense; short-term political expediency; poor specification; little understanding of technology or the supplier community; unrealistic budgets and timescales; busloads of expensive consultants; and slow-moving bureaucracy – not to mention constant political change, interference, and (under Tony Blair, at least) an obsession with being ‘modern’.
Censure by the Public Accounts Committee usually comes just before the lawsuits start, the court cases cost more money than the programmes have saved, and by the time a scheme limps into the public arena in much-reduced form, the technology is usually a decade out of date. But this time, we may all have to pay the price of a catastrophic misjudgement.
Part of the problem is that the British government is placing all the burden of its surveillance plans on private companies, and the 2015 hack of TalkTalk suggests that some just aren’t ready to be in the front line of plans that, in most cases, they strongly oppose.
By leaving it up to the market to decide how to react to a centralised programme of national intelligence gathering, the UK government is taking a dangerous ideological gamble with national data security. And because suppliers are dozens of separate private-sector entities, their response will be piecemeal rather than a co-ordinated national strategy.
The revised Investigatory Powers Bill may also force some technology companies to break the laws of the countries in which they are incorporated, as Apple CEO Tim Cook explained to the Home Secretary in 2015. O2, Vodafone, EE, and 3 were among others to express their disquiet.
Since then, the FBI has paid $1.4 million to hackers to break the iPhone’s security, revealing the ‘damned if you do, damned if you don’t’ problem facing any provider that refuses to weaken its products on government orders. This is a dangerous state of affairs that benefits no one and does nothing to enhance national security, or confidence in either government or enterprise technology.
Everyone supports the need to tackle terrorists, organised criminals, and abusers. However, governments on both sides of the Atlantic ignore an important fact by pushing through blunt-instrument surveillance plans against the advice of technology experts: they’re not just declaring war on terror, as they claim, they’re also declaring war on IT suppliers’ business models and on their customers – the FBI’s payment to hackers is simple proof of that.
All of this may have long-term impacts on the world’s digital economy and the security of many sectors – including digitised public services. Post-Snowden, the lesson must be that surveillance programmes never achieve what their sponsors claim they will, which is to reinforce trust; their effect is always the opposite.
Many vendors are already telling customers to adopt greater privacy controls via encrypted email platforms such as SafeGmail and Tutanota, and devices such as the BlackBerry Priv. Meanwhile, Google is threatening to name and shame suppliers that fail to support encrypted communications. Overall, there has been a big uptick in business for providers that focus on privacy and encryption.
The conclusion is obvious: IT suppliers, telcos, app providers, and more, may be paying lip service to the government’s aims, but most fundamentally disagree with them and are actively taking steps to counter the proposals.
All of this means one thing: years of endless government compromises to the scheme, until it becomes a useless fudge that does little more than signal that the UK is no longer a safe place to do digital business.
Add in TTIP, and the government may be facing a multibillion-dollar fiasco, and humiliation on the world stage.
• A version of this article was first published on UCInsight.com.
The Snooper’s Nightmare
4 March 2016: Peter Gabriel vs. The School of Life
To Westminster [on 1 March, 2016] to eavesdrop on a conversation between music legend, activist, and tech pioneer Peter Gabriel, and philosopher Alain de Botton.
The event was hosted by The School of Life, a venture that uses philosophy and therapy to offer grown-ups “some of the life lessons that school never taught them”. In a way, it is part TED, part gift shop, and part self-help group: a very English solution to developing emotional intelligence from its Swiss co-founder, de Botton.
His purpose is to make philosophy more relevant to people’s daily lives, which in today’s world of surface, speed, and fleeting sensation can only be a good thing – if you’re able to afford the school’s services.
The evening was billed as ‘Peter Gabriel: Life Lessons from a Rock Star’ and de Botton embarrassed his guest by introducing him as “one of the great men of the century”, causing the self-effacing singer to cackle with laughter – “Say something good!” he shouted.
So what is it about music – that axe to the frozen sea within us* – that still drives Gabriel more than his other pursuits? “We like to thump our chests, dance, and find a mate,” he said. “When you’re a shy and spotty boy and you’re in a rock band, the world opens up in mysterious ways.
“I was sent to a public school [Charterhouse], and it was just on the cusp of suffering but it was still pretty oppressive, and for me music was a retreat… I would turn it up as loud as I could and just dance… It was a release.” (As an ex-independent school boy myself, I remember rock bringing an explosion of colour, noise, sex, and outrage to a grey and silent Home Counties world.)
This may be why the Gabriel songbook is full of heroic characters who journey through an underworld of tests, ordeals, and misfortunes, to emerge changed but triumphant.
Gabriel the Belieber
On the subject of which, it was a surprise to discover which musician he’s a particular fan of today. “My listening consists mainly of Justin Bieber,” said Gabriel. And he meant it. Bieber’s new album is “very musical, has great rhythms, it’s beautifully produced, and he sings very well. So for those who dismiss him as just teen fodder, I would say open up your ears a little bit. There is a serious artist there.” (I agree about the new record.)
But he added that he has little choice in the matter: Bieber is what his seven-year old demands in the car on the school run every day. “And it’s a very long journey…” sighed Gabriel.
Today, Gabriel’s own musical journey has become a “quest for greater simplicity”, he explained, despite his deep-rooted tendency to explore tangents and musical byways. A new album is in the offing, “September,” he said. “But I won’t say which year… Fuck it, when I’m ready.”
But what other life lessons can we learn from a man who has spent nearly 50 years in the public eye – at least one of them dressed as a flower? Be a renaissance man, perhaps: Gabriel is far more interesting, far-sighted, and entrepreneurial than his reputation for being a dabbler and dilettante might suggest.
Pern you up, Pern you down?
Unlike the Gabrielesque figure in the Brian Pern comedy series – which puts the ‘mock’ in ‘mockumentary’ – few people come across as less self-important in conversation than Gabriel, a warm and engaging man possessed of that quiet power that typifies shy people who’ve mastered stagecraft. Indeed, he’s a legendary “ummer and ahher” (in Phil Collins’ words), despite being driven and successful across a number of diverse fields.
Alongside music and activism – he puts his money and time where his mouth is with projects such as Witness and The Elders – 66-year-old Gabriel has been bang on target with his technology bets and cultural predictions over the years.
In the 1970s, he experimented with bio-feedback systems (“But I couldn’t sell it to the band,” he explained) and predicted that music would one day be sold by phone companies – 30 years before it happened. He set up WOMAD, nearly bankrupting himself in 1982 with an event that became the blueprint for today’s family-friendly boutique festivals.
He was a rich-media advocate in the 90s, mixing music with interactive video and games, and in the Noughties an early co-founding investor in music streaming and online jukeboxes, with OD2 (On Demand Distribution) and We7. In 2003-04 he co-proposed a musicians’ union for the digital age, MUDDA.
Today, he co-owns high-end audio company Solid State Logic (SSL), a canny move with the resurgence of interest in high-end, aficionado tools. Then there’s the Real World record label and studio complex, not to mention the Society of Sound and other ventures – such as his long-cherished dream of building a theme park, where visitors can delve into different levels of experience (that underworld of change and transformation once again).
All of which should give him a reputation to overshadow Damon Albarn’s today, and yet Gabriel is increasingly labelled ‘the Prog Father’ and seen as a voice from the past – possibly because his activities so often take him behind the scenes and away from music.
Entire genres, careers, social movements, governments, and technology platforms have been and gone since his last album of new music, 2002’s ‘Up’ (although he has toured many times since then, and reinterpreted his back catalogue with the ‘New Blood’ project). “The worst thing you can give an artist is too much choice,” he told his audience – apparently without irony.
In 2003, he described the music industry as being “the canary down the coal mine” in the emerging world of peer-to-peer file-sharing, so we shouldn’t dismiss his startling prediction – made at The School of Life – that thought transference systems are less than a decade away from commercial viability, with search engines that will show us pictures based on what we’re thinking.
Gabriel predicts that human beings will become “completely transparent to each other” via this technology (the idea behind his 1977 song ‘Here Comes the Flood’, in fact). He’s experimenting with prototype systems, he said – just as he did with early audio sampler the Fairlight CMI in the early 80s, becoming the first owner of such a device in the UK. Not bad for a man with the bearing of a medieval friar who’s wandered in from the Somerset hills.
While – post Brian Pern – it might seem easy to mock some of his activities, such as making music with Bonobo apes (do watch the video) and experimenting with the inter-species internet – “Will apes like porn as much as we do?” he wondered – there’s a serious point to the things he does in the lengthening gaps between albums.
As he put it: “Why spend so much time looking for intelligent life in outer space when we’ve done so little to understand the other intelligent species here on earth?”
But the ghosts of Gabriel’s past are never far away – for some of his fans, at least (his “little cash machines”, as one of them said recently on Facebook). “I just wanted to tell you that I’ve always loved ‘Wind and Wuthering’!” shouted a Genesis devotee from the audience.
“A technical point: ‘Wind and Wuthering’ was the first album I didn’t appear on,” said Gabriel [wrongly, that was A Trick of the Tail.] But that’s Gabriel’s lot: always living in the future, but eternally dragged back to an imagined past from which people with tin ears never let him escape.
But, since he brought it up himself: Genesis? Gabriel admitted that some of the band’s music “doesn’t speak to him” in the same way anymore, but he still has great affection for “parts of ‘Supper’s Ready’ and ‘The Lamb…'”.
But, he stressed, “The door is open”.
A very Gabriel state of mind.
• If you quote from this article, please credit the source. Thank you.
*: Kafka originally said this of books, and Gabriel has used the phrase to describe music’s effect on human beings.
3 March 2016: A pretty big diversity disaster
Why aren’t more girls pursuing science? asks EDF. The energy giant is behind a campaign to encourage more girls to go into STEM – science, technology, engineering, and maths – careers. The campaign’s website says it’s designed to inspire girls’ curiosity in these subjects, and makes the point that only one in seven adults who work in STEM is female, which is staggering.
On the surface, then, an excellent campaign to target and inspire girls, and one that’s designed to tackle a serious problem in British society: how to get more females interested in science and technology. And this from a company, EDF, that’s also made some exemplary statements about diversity and inclusion. So far, so good.
But EDF really needs to have a word with its PR agency…
For reasons best known to its designers, the campaign is called ‘Pretty Curious’ and it’s spearheaded by some expensive-looking but rather creepy promos – like this one. The videos open with a series of photo library-style shots of different girls looking straight into camera, with each one captioned “I’m pretty”.
Eventually, words like ‘curious’, ‘inventive’, ‘determined’, ‘intrigued’, and ‘focused’ fade in after ‘I’m pretty…’, and the promos make their clumsy point: each young person is pretty curious about robotics, astronomy, or particle physics. And EDF is pretty serious about redressing the gender imbalance – as it should be. A pun-tastic message so badly conceived that it beggars belief.
EDF: why focus on the word ‘pretty’ at all? Why are the girls in the campaign videos depicted as passive observers, rather than actively engaged in science? And why is the pretty focused girl in the video-grab, above, gazing at the camera while putting her hair behind her ear – rather than, say, looking through the telescope at the stars?
I understand that this is a well-intentioned campaign targeted at school-age girls in today’s selfie culture (hey, you can be clever as well as pretty!). The aim is presumably to subvert any preconceptions its young viewers might have that girls should be physically attractive, rather than, say, intelligent, skilled, strong, confident, talented, independent, inquisitive, or ambitious. (Not that a human being can’t be those things and photogenic, of course, just as you can be intelligent, strong and confident and not have a modelling contract.)
The problem is that the execution of the idea is… well, pretty weird, frankly. It’s as if the whole thing was dreamt up in 1963 by Mad Men’s Pete Campbell, while swigging his tenth bourbon and leering at his secretary.
But this isn’t the only cockup of the campaign. The ‘Pretty Curious Challenge’ set out to find the best new science/tech product idea submitted to EDF by one of the many curious/inventive/determined girls that the campaign is designed to reach.
The prize was won by a boy called Joshua, who received an iPad and a day out at a science fair.
Well done, everyone!
• Better work in encouraging girls and women to pursue scientific careers is being done by people like Dr. Sue Black, and others. Do check out her blog.
• This article was first published on diginomica.com.
1 March 2016: Captain Buzz, forever!
On Sunday 28 February, I met my childhood hero: Buzz Aldrin, something that would have made the little boy I used to be very proud and happy. Dr. Aldrin was interviewed at the Science Museum by another hero of mine – Britain’s great populariser of science, Prof. Brian Cox – as part of the institution’s Cosmonauts – Birth of the Space Age exhibition, which explores the Soviet Union’s contributions to the conquest of space.
(Not only did Russia put Sputnik and the first man into space, Yuri Gagarin, but also the first woman, Valentina Tereshkova. Cosmonaut Alexey Leonov was the first man to walk in space, and he might have been the first to set foot on the moon, too, had the LOK/N1 project not been cancelled after Apollo 8’s successful lunar orbit.)
Dr. Aldrin explained how the space ‘race’ had always been a collaborative venture – the Cold War thawed in space long before it did on land. And Leonov told him, years later, how he had crossed his fingers and hoped for a safe landing as Aldrin and Armstrong approached the moon’s surface on 20 July, 1969.
I did too. I was a child of the Space Age, a toddler growing up in a self-assembled private world of astronauts, robots, books, music, radio, Ray Bradbury, Thunderbirds, and the stories I used to tell myself when there was nothing else between me and the cold night sky. My parents sat in separate rooms with their backs to the door, leaving me to assemble a life raft of ideas out of whatever junk I could find – one on which I knew I could escape.
But someone was more important than all of these things to the dream-lost boy that I was: Captain Buzz. But why? (And isn’t he a Colonel, you ask?) Let me explain…
In 1969, as I watched Aldrin clamber out of the Eagle and bound around, testing the lunar gravity, a caption flashed onto the screen. And because our television was a primitive beast, made of wood and tubes and cathode rays, there was a loud buzzing sound as the caption blazed onscreen. “What was that?” I said, turning to my mother. I realise now that she said, “Oh, that’s caption buzz. It’s nothing to worry about, dear.” But what I heard was, “That’s Captain Buzz.”
And so I believed that Buzz Aldrin got his name from his ability to make my television buzz with pleasure from a distance of 240,000 miles.
It was secret-agent code between an astronaut and the little boy who was waving back at him from the lonely blue planet on the moon’s horizon. And the message seemed to say: although there is nothing but dust on the moon of love – not even air – everything will be alright.
Years later I discovered many wonderful things about Captain Buzz: that he was a pilot, an adventurer, a space walker, and a war hero, but also a rocket scientist and a brilliant theoretician – which alienated him from his peers; that, post-Apollo, he’d battled depression and alcohol, and sold cars in the desert before turning his life around and reclaiming his place in history.
And he is a poet at heart. After all, this is the man who said “Magnificent desolation” as he surveyed the moon’s surface. Plus he took the world’s best ever selfie, floating high above the earth.
Today, Aldrin is the driving force behind the ShareSpace Foundation, which is designed to spark children’s literacy in science, technology, engineering, and maths (STEM), but also, brilliantly, the arts. Because he understands more than anyone in history, perhaps, that space exploration can ignite our creative spirit too. (Trust me, I know it can.)
And so it was on Sunday, in a room full of space debris and miracles, that this story finally completed its decades-long orbit and I found myself face to face with Captain Buzz.
So what did I say to him, and what did he tell me?
That, dear reader, is for me to know. Find your own hero!
30 January 2016: The myth of ‘broadband Britain’
A damning report has criticised the broadband quality and speed provided by BT and its standalone infrastructure subsidiary, Openreach, and demanded the cutting of all ties between the two for the good of the UK economy.
The report, ‘Broadbad’, is published by the British Infrastructure Group (BIG), a cross-party MPs’ pressure group dedicated to improving the national infrastructure. According to the BIG, “systemic underinvestment” stemming from the “natural monopoly” of BT and Openreach is “stifling competition, hurting constituents, and limiting Britain’s business and economic potential”.
More, the report claims that the broadband service provided to many customers is so unreliable and slow that it is potentially losing UK plc as much as £11 billion per annum.
“Unless BT and Openreach are formally separated to become two entirely independent companies, little will change,” says the report. “Openreach makes vast profits and finds little reason to invest in the network, install new lines, or even fix faults in a properly timely manner.”
Openreach’s 2015 operating profit was £1.25 billion on revenues of £5 billion, while the overall BT Group reported an operating profit of £3.7 billion (£2.5 billion after tax) on revenues of £17.8 billion.
“We believe that Britain should be leading the world in digital innovation. Yet instead we have a monopoly company clinging to outdated copper technology with no proper long-term plan for the future,” continues the report. “We need to start converting to a fully fibre network so we are not left behind the other nations who are rushing to embrace digital advancement. However, we will only achieve this by taking action to open up the sector.”
BT CEO Gavin Patterson has described the report as “misguided” and claimed that “over 90 per cent of the UK can get superfast broadband today”.
However, Patterson’s claim is contradicted by Openreach, the company responsible for connecting BT customers to fibre networks. According to the most recent statistics published on Btplc.com, over 75 per cent of UK premises have access to fibre broadband, not “over 90 per cent”.
And then there is the question of what constitutes “superfast”. According to 2015 global rankings from Akamai, the UK’s average broadband speed is just 13Mbps. Although 87 per cent of UK broadband customers have connection speeds in excess of 4Mbps – hardly a state-of-the-art service – only 46 per cent have speeds of over 10Mbps, and just 28 per cent have speeds of over 15Mbps (again, not fast by today’s standards).
So the implication is that the national average is being skewed by a small minority of people who have access to much higher connection speeds than others; most are probably London-based enterprises that have paid for private installations.
The UK lags behind South Korea (where the average national connection speed is in excess of 20.5Mbps); Sweden (17.4); Norway (16.4); Switzerland (16.2); Hong Kong (15.8); Netherlands (15.6); Japan (15); Finland (14.8); Czech Republic (14.5); Denmark (14); and Romania (13.1).
However, the average UK connection speed is just above that of the US, according to Akamai – which is hardly surprising, given the vast landmass and rural heartlands of North America. In the UK, few people genuinely live long distances away from a major town or city.
Then there is the type of fibre connection in play. In many cases, BT services are Fibre to the Cabinet (FTTC), with ageing copper connections to premises, rather than Fibre to the Premises (FTTP) connections, which are up to four times faster – up to 330Mbps against up to 80Mbps for FTTC (according to Openreach statistics).
So, with a theoretical fibre broadband top speed of 330Mbps in the UK, the average UK connection speed is actually 25 times slower than that, while the majority of UK customers experience speeds that are over 50 times slower than the fastest FTTP connection – hardly “superfast”, Mr Patterson.
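For anyone who wants to sanity-check those ratios, the arithmetic is a back-of-envelope sketch using only the Akamai and Openreach figures quoted above:

```python
# Back-of-envelope check of the speed ratios quoted above, using the
# Akamai 2015 UK average and Openreach's quoted FTTP/FTTC ceilings.
fttp_top = 330.0   # Mbps: fastest Fibre to the Premises connection
fttc_top = 80.0    # Mbps: fastest Fibre to the Cabinet connection
uk_average = 13.0  # Mbps: Akamai's 2015 UK average connection speed

print(round(fttp_top / fttc_top))    # -> 4: FTTP is roughly four times faster than FTTC
print(round(fttp_top / uk_average))  # -> 25: the UK average is ~25x below the FTTP ceiling
```

In other words, the “superfast” headline figure describes a ceiling that the average customer’s connection doesn’t come close to touching.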
Meanwhile, any copper-based users who are situated a long way from their local exchange will get a poor service indeed, particularly if they’re based in one of the UK’s many ageing buildings. And as everyone knows, there is a postcode lottery for cable or satellite alternatives.
While it may be true that between 75 and 90 per cent of the UK population has theoretical access to broadband services of some kind, widespread anecdotal evidence from across the country suggests that BT’s service is often patchy and unreliable, with speeds nowhere near those that customers pay for* – thanks to those ageing copper connections again. Rural connectivity is especially poor, and ‘notspots’ abound nationwide.
The BIG report itself reveals that while broadband connectivity may be good in London, Manchester, and one or two other centres, it is often poor elsewhere – even in the South East and in a number of major cities*, such as Bristol.
And there is another factor that prevents faster improvements to the UK’s ageing infrastructure: it’s simply not in BT’s or Openreach’s commercial interests to upgrade more parts of the network when they can charge some customers – especially enterprises – a premium rate for private installations.
(BT recently told me exactly that: it “wasn’t in its commercial interests” to upgrade the cabinet in the major shopping, business, and residential street in Brighton in which I live.)
In short, this is yet another example of the fatal tension between short-term shareholder value and long-term taxpayer benefit that affects so much of our national life, especially in sectors such as utilities, finance, and privatised health. Stakeholders in large, arrogant private businesses who believe they have a market sewn up prefer to get rich quick rather than invest in building something that would benefit everyone in the long term (while issuing deeply misleading statements about their progress).
Why are we waiting?
There are other problems for UK plc, too. According to 2015 Openreach figures, businesses face an average wait of 33.49 working days (nearly seven weeks) for broadband installations, or 69.95 working days (14 weeks) if the service requires a new build – hardly ‘utility’ service levels. But that’s no surprise: someone in the chairman’s office of BT once remarked to a colleague of mine, “Broadband is not a utility, and it never will be.” Staggering.
The apparent disconnect between BT, Openreach, customer service staff, and repair and maintenance teams is another grumble for many users, at a time when BT seems more focused on becoming a content player than the infrastructure provider that its customers actually want.
The conclusions are stark: whether separately or together, BT and Openreach must set aside their absurd quest to be media moguls and focus on providing what the UK needs: a genuine superfast network for the 21st Century – network coverage, not multimillion-pound sports coverage. And rather than pour even more millions of pounds into CRM technologies that seem to be making BT ever more remote, aggressive, and sales focused, the company should try something truly radical: treating customers with care and respect.
*: I write from one of the UK’s prime digital hubs, Brighton – home to countless apps, mobile, web, SEO, and games companies – where at my office in the centre of the city, BT broadband speeds are 3.5–5Mbps at best, against a billed-for speed of 20Mbps. Infinity is not available – despite endless sales pitches to this address saying it is – in what is one of Brighton’s (and the UK’s) major retail and business locations.
• A version of this story was first published on UCInsight.
• Further reading: BT Sport contract deemed “a mistake” by UEFA.
11 January 2016: David Bowie
Two things happened immediately after I woke up this morning: I logged onto Google and saw that David Bowie had died, and then, while I was still trying to process the news, the doorbell rang. It was the postman with my limited-edition, transparent vinyl copy of Bowie’s Blackstar LP. A final message from Major Tom… one whose real meaning was now as clear as the disc itself. Since then, I’ve been surprised to find myself in mourning, and I want to try to express why.
For a small boy growing up in an even smaller town and finding himself outside of everything but himself, discovering Bowie was like discovering 1,000 different escape routes – into the self, and into a world of possibilities, the greatest of which was the idea that you could invent not just one life for yourself, but 1,000 or more, and that there were no limits apart from a failure of the imagination.
Bowie was the ultimate outsider, a true alien, the king of the freaks, and queen of the lost boys. So I am sad because some of the people whom Bowie became felt like friends, and today they have all passed away together. And because he was beautiful, outrageous, erudite, sophisticated, and strange. And because he wrote some great songs, and sang all of them so beautifully. And because Major Tom was all alone in his tin can… and as a boy, I knew exactly what that felt like.
But do I mourn one man called David Bowie, or 1,000 people who all shared the same name? That I will never know. And that, in the end, is what elevates his life into a complete work of art, one for which – with typical grace – he was able to write a final chapter.
© Chris Middleton 2016