The Snooper’s Nightmare

Why increased surveillance will put one million children at risk.


Confident?

UPDATED. The British government’s surveillance plans force telecommunications providers and ISPs to retain all customers’ phone and internet data for up to a year, while weakening encryption protocols and setting technology providers against their own customers’ interests.

So, beyond the vital questions of whether the scheme is acceptable, ethical, or viable – which are explored in detail below – a key question must be: are ISPs and telcos ready, given that they’re being placed front and centre of the government’s plans?

In the wake of the 2015 TalkTalk hack, in which unencrypted data from 157,000 customer accounts was stolen, the answer is clear: no.

The hack of the provider’s website demonstrated two things. First, that customer records and account information are irresistible to criminals in a world in which private data is a de facto currency. And second, that there is no legal obligation on internet companies or telcos to encrypt it.

The company’s CEO, Dido Harding, went so far as to cite the Data Protection Act itself as her defence against criticism of account data being stored ‘unsalted’. The Act fails to make data encryption a legal requirement, allowing providers to hide behind its vaguely worded terms.

This all-too-public failure of a major supplier to secure its customer data should serve as a warning to Whitehall: it will be open season on the internet records of UK businesses and private citizens, because their allure will be irresistible to criminals.

ISPs, telcos, mobile providers, and others will be expected to secure all that data from day one; and, as dozens of separate private-sector entities, their response will be piecemeal rather than part of a co-ordinated national strategy.

So, by leaving it up to the market to decide how best to react to a centralised programme of national intelligence gathering, the government’s plan has the makings of being a dangerous ideological gamble with citizens’ personal security.

The revised Investigatory Powers Bill may also force some technology companies to break the laws of the countries in which they are incorporated, as Apple CEO Tim Cook explained to the Home Secretary in 2015. O2, Vodafone, EE, and 3 are among many others to express their disquiet.

So the government ignores a vital fact by pushing through its surveillance plans: it isn’t declaring war on terror, as it claims; it’s declaring war on IT suppliers’ business models (as Apple’s US battle with the FBI demonstrates).

Post-Snowden, the lesson must be that national surveillance programmes never achieve what they set out to do, which is to reinforce trust; their effect is always the opposite. Inevitably, the Bill will drive more people towards encryption, and privacy-shielding apps or devices, while reducing citizens’ trust in their own government. Indeed, the IT community itself is already pushing its customers to adopt greater privacy, as this story reveals. 

Common sense suggests that these facts alone risk increasing cybercrime and decreasing national security, rather than bolstering Whitehall’s genuine fight against terrorism, organised crime, and abusers. At the same time, the government has just forced its own law-abiding citizens to safeguard themselves from the snoopers: an own-goal if ever there was one.

Dido Harding

• Increased risk to children

Away from the TalkTalk boardroom, Dido Harding is Baroness Harding of Winscombe, a member of the House of Lords (and wife of Tory MP John Penrose), who has stated her personal determination to make the Web safer for all, especially for children.

So, will the amended Bill protect children? No. Quite the reverse.

A programme of national surveillance will massively increase the risk of harm to the UK’s children, for three simple reasons.

• First, children’s rights to privacy and free association are protected under Articles 13 to 17 of the UN Convention on the Rights of the Child – the most widely ratified human rights treaty in history. Any national data-mining exercise would gather data about children’s internet activities as well as those of adults, so the British government would have no choice but to seek a national security exemption from the Convention.

• Second, teachers, children’s charities, anti-bullying organisations, and other childcare professionals all tend to agree on one thing: that once children reach adulthood at 18, they should be given the right to have their childhood internet activity ‘forgotten’, so as to prevent their own data from causing them problems later in life. The Bill makes that impossible, and therefore acts against the professional opinion of the very people whose job is to protect children.

• And third, because by far the largest distributors of explicit images of people under the age of 18 are minors themselves. This is the phenomenon known as ‘sexting’ – the sharing of self-made explicit texts, images, and videos – which a number of reports suggest affects up to 25 per cent of all teenagers, with 80 per cent of those involved being under 18.

So let’s do the maths. There are five million teenagers in the UK, according to government statistics. If up to 25 per cent of them are ‘sexting’, as surveys suggest, that’s 1.25 million young people. If 80 per cent of those involved are under 18, then one million children could be at risk of prosecution under anti-p*rn laws as a direct result of national surveillance.
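The sums above can be checked in a few lines. This is a minimal sketch, assuming the figures quoted in this article (five million teenagers, up to 25 per cent sexting, 80 per cent of those under 18) rather than official statistics:

```python
# Reproducing the article's arithmetic. The inputs are the article's own
# estimates, not official data.
teenagers = 5_000_000                # UK teenagers, per government statistics
sexting = int(teenagers * 0.25)      # up to 25% 'sexting' -> 1,250,000
under_18 = int(sexting * 0.80)       # 80% of those under 18 -> 1,000,000

print(f"{under_18:,} children potentially at risk")  # 1,000,000
```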

‘Sexting’ is increasingly common among young people, say reports.

Scaremongering? No. Prosecutions have already taken place in both the UK and the US, where teenagers – including some aged over 16 in legal, consenting relationships – have been convicted under anti-p*rn laws for sending pictures of themselves to their partners.

Any explicit image of an under-18 is illegal, and so the two-year gap between the ages of consent (16 in the UK) and majority (18) is a legal minefield.

In the US, a Carolina teenager was recently found to be both the perpetrator and the victim of this crime, for having an explicit image of himself on his own mobile. There’s no evidence that he sent the image to anyone, but he’s now a convicted criminal.

Under a strict interpretation of UK law, any minor who sexts explicit images of themselves (or of anyone under the age of 18) is no different to the abusers the Bill is designed to catch, and they would be added to the same offenders register. Again, this applies even if they are over 16 and in a consenting relationship.

The legal situation is made all the more alarming by the fact that more and more young people now see sexting as normal behaviour: the figures quoted above are a couple of years out of date, and peer pressure alone means that the real number of sexting teens is almost certainly now much higher.

In the UK, a 2016 survey published in the Independent found that the majority of school pupils who engaged in sexting were underage – including pre-teens – and that situation had significantly worsened over the preceding two years. (Sexting is also known to be a common element in cyber-bullying, which is now by far the most common form of bullying among young people.)

What we, as adults, think about the morality of all this is largely irrelevant, as even the UK’s National Crime Agency (NCA) admits. In a recent interview, Zoe Hilton, head of safeguarding at the NCA’s CEOP Command, said: “With smartphones and tablets, and new apps emerging all the time, this behaviour is becoming quite normal for teenagers.”

• The technical challenges

Then there are the technology challenges of an overt national surveillance programme, given that an internet that facilitates mass surveillance is an internet that is less secure and therefore makes crime much easier to commit.

For example, let’s suppose that an IP address is associated with a criminal act. IP addresses don’t reliably identify people – nor, in many cases, even devices. IP addresses can also be bought and sold, as was evidenced by the British government recently selling off unused IP addresses to, among others, Saudi Arabia.

They don’t even pinpoint a user’s geographical location: IP addresses can be cloaked, and they can be rented off the shelf via (legal) IP-hiding and proxy server platforms that allow a device to appear to be anywhere in the world, from Colorado to Korea. Meanwhile, computers can be absorbed into malicious botnets without users having any idea that it is happening.

Just as significantly, there will soon be untold millions of new IP addresses, thanks to the Internet of Things (IoT): all those smart devices that will be coming online over the next few years under the extended addressing made possible by IPv6. A tsunami of data, in fact, which the police almost certainly lack the resources to deal with.
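To give a sense of the scale involved, the sketch below compares the IPv4 and IPv6 address spaces using Python’s standard ipaddress module. The network used is the IPv6 documentation prefix, not a real allocation:

```python
# Comparing the IPv4 and IPv6 address spaces.
import ipaddress

ipv4_total = 2 ** 32    # ~4.3 billion addresses in all of IPv4
ipv6_total = 2 ** 128   # ~3.4 x 10^38 addresses in IPv6

print(f"IPv4 total: {ipv4_total:,}")
print(f"IPv6 is {ipv6_total // ipv4_total:,} times larger")

# A single routine IPv6 subnet (a /64, as commonly handed to one household)
# already contains more addresses than the entire IPv4 internet:
subnet = ipaddress.ip_network("2001:db8::/64")
print(subnet.num_addresses > ipv4_total)  # True
```

Every one of those addresses is, in principle, a data point the scheme would have to monitor, which is why the volume problem dwarfs anything the police currently handle.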

In short, the technology challenges inherent in any meaningful monitoring and data-mining of internet records to identify criminal behaviour are so vast and complex as to make the whole scheme nonsensical. Indeed, several providers have said that what the government wants may be technically impossible, not to mention vastly more expensive than Whitehall’s paltry estimates. 

Indeed, the surveillance plan has the potential to be yet another Whitehall IT disaster, to follow in the footsteps of the NHS National Programme for IT, Universal Credit, and countless other botched, overdue, and vastly over-budget schemes, some of which have descended into expensive litigation.

• All the signs of failure

There are a great many IT experts in the public sector, and many stories of exemplary innovation. However, all central government IT failures share the same elements: ideology over common sense; short-term political expediency; poor specification; little understanding of technology or the supplier community; unrealistic budgets and timescales; slow-moving bureaucracy; and inept line management – not to mention constant political change and interference.

Censure by the Public Accounts Committee usually comes just before the lawsuits start, and by the time a scheme limps into the public arena in reduced form, the technology is a decade out of date.

Meanwhile, it goes without saying that organised criminals, terrorists, and pornographers are the very people who have the skills to evade blanket surveillance. In fact, most ‘millennials’ already know how to sidestep the snoopers, while anyone can buy a pay-as-you-go phone, or a phablet from a secondhand store, and put in an untraceable SIM.

Google CEO Sundar Pichai: the company plans to name and shame suppliers that don’t support encrypted email.

What’s more, the IT community itself is increasingly providing customers with the means to communicate privately and securely, with encrypted email services such as SafeGmail and Tutanota, and devices such as the BlackBerry Priv. The government has just given its law-abiding citizens the spur to adopt them, and their usage will explode over the next few months.

But there’s another consideration, one that almost no one talks about: automation. Someone in a back room somewhere will be asked to write algorithms to identify ‘suspicious’ patterns of behaviour, or ‘suspicious’ website visits and trigger words, because it stands to reason that human beings will have neither the time nor the resources to monitor an entire nation’s internet activity 24 hours a day, 365 days a year.

Who writes those algorithms, based on what rules, what keywords, and what concepts of ‘suspicious’ behaviour is an important question, and the fact that politicians, not judges, will have the power to hit ‘Enter’ on investigating someone’s activity suggests that decisions may have a party-political dimension. (For a detailed report on the very real dangers of machine-based decision-making, go here.)
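To illustrate how crude such pattern-matching can be, here is a deliberately simplistic sketch. The keywords, threshold, and function are entirely hypothetical – an illustration of the problem, not a description of any real system:

```python
# A deliberately crude sketch of keyword-based flagging. The watch-list and
# threshold are invented for illustration; the point is that rules like
# these flag reading material, not intent.
SUSPICIOUS_KEYWORDS = {"fundamentalism", "fascism", "hacking"}

def flag_visit(page_words, threshold=1):
    """Flag a page visit if it mentions 'enough' listed keywords.

    There is no notion of context here: a student, historian, or
    journalist is flagged exactly as an actual offender would be.
    """
    return len(set(page_words) & SUSPICIOUS_KEYWORDS) >= threshold

# A history student researching 1930s Europe is flagged...
print(flag_visit({"history", "fascism", "europe"}))   # True
# ...while someone reading about gardening is not.
print(flag_visit({"gardening", "allotments"}))        # False
```

However sophisticated the real rules might be, the underlying logic is the same: a list of subjects someone has decided are dangerous to read about.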

Automated ideology is something that any country should resist at all costs. With no digital bill of rights in place to give citizens legal protection against being (to use analyst Ray Wang’s phrase) “oppressed by their own data”, this may create serious problems in any society that’s driven by a powerful political belief system, as the UK currently is.

• The business angle

Now, beyond the alarming lack of preparedness of some ISPs and telcos, few of these unfortunate side effects of the amended Bill will, perhaps, have a direct impact on business.

However, three things certainly will: the rise of Bring Your Own Device (BYOD) schemes and ‘shadow IT’ in business; the growing grey area between ‘business’ and ‘private’ use of technology; and any lack of trust in the UK’s digital economy and in its technology providers.

The first two can be taken together. Increasingly, employees use their own devices for business, and informally use their own cloud applications and storage facilities at work, too. This shifts their private internet usage into the corporate realm. Many of those employees also work from home or wherever there is free wifi. This has the reverse effect: it shifts corporate internet use into public and private spaces.

The enterprise use of encrypted communications, proxy servers, and other tools is central to the successful running of the digital economy, especially in areas such as financial services, defence, and any business that revolves around intellectual property – which, in this day and age, is most business. In short, these technologies are core to the UK’s economic health.

But many private individuals are also ‘companies’: self-employed people, freelancers, and so on – anyone who uses limited company status for business. Will their use of encrypted communications be accepted by the government, or does it only apply to large enterprises? If so, why should large businesses be allowed to encrypt communications, but not anyone else? Are they inherently more trustworthy? And what about all those flexible and remote workers who access enterprise systems at home via secure private networks?

This whole area has so many pitfalls that, again, it may prove to be completely unworkable.

Which brings us to trust.

• Why trust is essential

For our digital economy to succeed, citizens need to trust that digitised governmental services have their best interests at heart. Customers need to trust that their service providers can protect their private data. And private companies need to trust that the UK is a safe place to do business. In each of these areas, the Bill puts trust at risk. The government is tearing up any trust that citizens have in digitised services, especially those provided by the government itself.

The impression is that politicians and inexpert officials have rushed surveillance plans through, rewriting and amending them on the fly, with little consideration of the real-world consequences that may follow. And they are doing so against the advice of most technology providers, let alone civil liberties campaigners.

There’s also the suspicion that because security services’ covert surveillance of UK citizens was probably illegal, the programme has been forced above ground in the form of the Bill. If that is the case, then giving judges the final authority, not politicians, would go some way towards restoring trust.

• The UK is a primitive digital nation, says survey

The impression of expedient measures being rushed through without considering their impact on citizens is only reinforced by a report by Deloitte on the progress of digital government programmes worldwide.

The 2015 report, The Journey to Government’s Digital Transformation, explains that one of the signifiers of an “early stage” digital government – rather than of a “developing” or “digitally maturing” one – is a primary focus on cost-reduction, rather than citizen benefit.

Of the more than 1,200 respondents in government organisations across 70 countries, the survey reveals the British government to be by far the most focused on cost-reduction (and therefore, implicitly, the least focused on citizen benefit). The conclusion is inescapable: the UK is revealing itself to be a primitive digital nation, as a direct result of Whitehall policy.

The report even quotes Mike Bracken on the need for organisations to develop a supportive culture for technology change. At the time he was interviewed by Deloitte, Bracken was presumably still head of the UK’s Government Digital Service (GDS). However, he resigned in August 2015 and has since joined the Co-operative, citing Whitehall’s institutional lack of the culture necessary to effect digital transformation and bring about real citizen benefit.

In this context, the amended Investigatory Powers Bill should cause alarm. An ill-informed, cost-focused, blunt instrument to fight crime – one that’s politically expedient, technologically unworkable, ignores emerging technologies, and is reliant on a disparate group of private providers to secure citizens’ data (even though there’s no legal obligation on them to encrypt it)? Does that sound like a solution?

• It’s just not right

But there’s another dimension to this that we cannot ignore: this is simply a scheme that has no place in any society that values free association, freedom of speech, and freedom of thought.

A culture in which security services police people’s habits online at politicians’ behest would be a catastrophic misstep, as mainstream Western politics swings further and further towards the right.

In a free society, no one follows citizens around bookshops or libraries noting what books they browse, borrow, or buy, or what opinions they read or listen to, because we know that people are not defined by their browsing or reading choices. Indeed, someone once said that a good library should contain something that offends everyone.

Reading something online is no different; just because a text is digitised does not somehow make it dangerous, any more than a piece of paper is inherently dangerous.

We read to learn, to find out about the world, to enquire, to ask questions, to gain different perspectives. And the roots of the World Wide Web lie in the dynamic linkage of information resources to aid the spread of knowledge and education, to the benefit of us all. Click on a Google News or Images link and you might be taken to text from anywhere in the world.

The problem with blanket surveillance is that there is an implicit presumption of guilt: that the simple truth of Bad People’s existence will be revealed by their browsing habits. But it’s nonsense.

Raul Lemesoff’s ‘Weapon of Mass Instruction’: a mobile library in the shape of a tank.

Suppose Person X wanders into a library or bookshop, and sees a book about the history of Islamic fundamentalism, or an analysis of the rise of fascism in 1930s Europe, and decides to read a chapter. Or let’s say he leafs through a book that’s critical of the Iraq War, or of global capitalism, or of Israel’s policy towards Palestine, or of the current Conservative government, or he reads a chapter that’s supportive of a hacking collective that’s taken a government website offline.

That browsing tells us nothing about Person X, because it is without context; we have no knowledge of his intent. He might be a student; an historian; an academic; a journalist; an economist; a theologian; an atheist; a cultural theorist; an artist; a writer; an analyst; a researcher; a politician; a businessman; a school teacher; or a school pupil. Or he might simply want to read more about the things he sees in the papers, to listen to other voices and other opinions. He might even vehemently disagree with everything the author says.

Now replace the word ‘book’ with ‘website’. What’s changed? Nothing.

• To browse is not to condone

The point is this: to browse is neither to agree, nor to condone. The moment we start building assumptions of guilt or ‘danger’ into someone’s reading habits we are ascribing a threat to specific subjects and to the acquisition of knowledge about them. To investigate people based on their internet browsing is essentially to say, “Don’t read about X, Y, or Z”, which would be the death of reason itself.

Yet sooner or later a politician will utter the immortal words, “The innocent have nothing to fear”, but that is no longer the case. (This separate report demonstrates why some citizens have everything to fear in a society based on computer algorithms.)

This plus automation hardly equals a healthy democracy.

As the UK adopts blanket surveillance of citizens’ browsing habits, make no mistake: we are opening the door to a form of digital ‘McCarthyism’ on an unprecedented scale, a ‘them and us’ culture of anger, suspicion, and disintegration. That can only damage our society, damage the economy, damage business, undermine trust, undermine digital programmes, and put the UK out of step with much of the developed world.

The rise of Donald Trump in the US and of right-wing nationalism in many parts of Europe are proof of how quickly societies can divide and turn against themselves, while the current government in the UK – which is pushing through more and more ideology-driven proposals against the tide of public opinion – is not known for its reasoned engagement with alternative viewpoints and policies. Does anyone doubt that in such governments’ – or potential governments’ – hands, mass surveillance might become a tool of political repression?

Yet ultimately, state surveillance is what those ‘other’ people did, those enemies of freedom, democracy, and Western values; those people who represented everything that we stood against during the Cold War. Remember them?

When did we become them?

Further reading:
Investigatory Powers Bill not fit for purpose, say 200 senior lawyers (The Guardian)
Seven reasons you should still be worried about the IP Bill (The Mirror)

© Chris Middleton 2015 and 2016