Published on March 27th, 2015 | by EJC0
New Tools Heat Up Censorship Arms Race
Online censorship may be as old as the Internet, but in today’s high-tech world, it is increasingly a perpetual arms race between censors looking for better ways to probe and muzzle, and a circumvention community looking to help the censored get and send information while protecting their digital identity.
There are several widely-used tools aimed at facilitating online access in closed societies by providing security, anonymity and ways to circumvent national firewalls. They include Virtual Private Networks (VPNs), proxies, peer-to-peer networks, encryption, and anonymous browsing services, among others.
A newer technology gaining ground in circumvention is traffic obfuscation, also known as pluggable transports.
“This means concealing circumvention traffic by making it appear like other kinds of innocuous network activity, in order to slip past censors,” said Josh King, Lead Technologist at Washington’s Open Technology Institute. “This has promise as something that could eventually be introduced into a number of different projects in order to make them much harder to detect.”
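The idea King describes can be illustrated with a toy sketch, purely hypothetical and not a real pluggable transport: circumvention traffic is wrapped so that, on the wire, it resembles an ordinary HTTP upload rather than an identifiable protocol.

```python
import base64

# Toy illustration of the pluggable-transport idea (hypothetical;
# real transports such as obfs4 use far stronger techniques):
# disguise payload bytes as an innocuous-looking HTTP POST.

def obfuscate(payload: bytes, host: str = "example.com") -> bytes:
    """Encode the payload and embed it in an HTTP-shaped request."""
    body = base64.b64encode(payload)
    headers = (
        f"POST /upload HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        f"Content-Type: application/octet-stream\r\n"
        f"Content-Length: {len(body)}\r\n\r\n"
    ).encode()
    return headers + body

def deobfuscate(wire_bytes: bytes) -> bytes:
    """Recover the original payload on the receiving side."""
    _, _, body = wire_bytes.partition(b"\r\n\r\n")
    return base64.b64decode(body)

# Round trip: the wire bytes look like HTTP, but the payload survives.
msg = b"hidden circumvention data"
assert deobfuscate(obfuscate(msg)) == msg
```

A censor inspecting this stream would see what appears to be routine web traffic; defeating it requires deeper analysis than simple protocol fingerprinting.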
Traffic obfuscation makes it harder to detect the use of encrypted network services. In an email interview, King said a network provider doing traffic analysis may be able to detect a particular encrypted network stream, even if it cannot read the contents of that stream.
“If the stream [corresponds] to the IP address of a known circumvention service [for instance, a Tor entry node], or if more sophisticated traffic analysis shows that certain attributes of the stream indicate that it originates from a known circumvention tool, they do not have to read that data in order to know that they should block it,” King said.
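The detection King outlines needs no decryption at all; connection metadata alone is enough. A minimal sketch of that logic, using documentation-range IP addresses (RFC 5737) rather than real Tor relays:

```python
# Hedged illustration of metadata-based blocking: a censor flags a
# stream simply because its destination matches a known circumvention
# endpoint. The addresses below are RFC 5737 documentation examples,
# not real relays.

KNOWN_ENTRY_NODES = {"192.0.2.10", "198.51.100.7"}

def should_block(dest_ip: str, blocklist: set = KNOWN_ENTRY_NODES) -> bool:
    """Decide to block purely from connection metadata, no decryption."""
    return dest_ip in blocklist

assert should_block("192.0.2.10")       # destination is a known entry node
assert not should_block("203.0.113.55") # unknown destination passes
```

This is why unlisted "bridge" entry points and traffic that mimics innocuous protocols matter: both deprive the censor of the metadata this check depends on.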
But King cautioned that no technology is completely secure, even though new circumvention tools and enhancements are always in the works. Censors are always trying to exploit existing technologies, some of which were designed in the first place “to respond to improved censorship techniques, such as creating additional entry points into the Tor network as authorities became more adept at blocking existing ones,” said King.
Rex Troumbley, a researcher at the University of Hawaii at Manoa, shared the sentiment that even the best privacy technology is transient: there are always ways around it.
“Increasingly, what you see happening is a lot of states, for example, creating fake proxies or … they tap into a lot of the free ones that are available,” he said. “So people use them and they think that they are not being monitored, but they are. Or they [i.e., censors] just get the record from the proxy hosts and use that data to track them and find out after the fact.”
Censors have new tricks too
The next generation of censorship might be hard to detect by the Internet users being censored. Troumbley said newer techniques are less about denying people access or preventing them from seeing something and more about redirecting them to other content censors do not find objectionable.
“It’s about suggesting some kind of alternative so that they are not necessarily able to see the other option,” he explained. “So it doesn’t interfere … and it’s not just [in] China or Iran. This happens all the time.”
In these situations, it is hard for users to determine whether they are running up against an algorithmic quirk or a deliberate block on their search. Troumbley used Facebook’s newsfeed algorithm, which automatically sorts what people can see, as an example of this type of filtering.
“And so you are not aware of the fact that you are being curated or that you are being directed toward certain services,” he added. “In Facebook’s case, they do that pretty intentionally because they want you to keep using the Facebook service.”
In the case of Facebook, objectionable content that might otherwise alienate customers is filtered out. But Troumbley said this type of approach is also a way of “putting you in some kind of bubble of algorithmically-curated content that you are not entirely sure is there.”
“You’re not being told we can’t talk about something,” he added. “You’re being told what to talk about, or you are being directed toward certain topics that they would prefer, which are typically about consumption.”
The filter bubble, as it is called, is not necessarily censorship, added Troumbley. “It’s kind of like next-generation censorship … It’s much more like Brave New World” than George Orwell’s 1984.
Adam Fisk, President of Brave New Software Project Inc., a non-profit that builds tools to facilitate open Internet access, agreed that filter bubbles are typically used by companies looking to optimize customer experience, although that sometimes results in what he called “collateral damage.” That means that users might not see content they might disagree with in their social media feeds, for example.
But he said one could argue based on a less strict definition of the filter bubble concept, that “censoring states are attempting to create massive filter bubbles, but where the bubble is not reinforcing a user’s existing beliefs but rather is shaping those beliefs to suit the desires of the censor. In the worst case, and certainly at times in practice, this is so effective that citizens aren’t even aware they’re in a bubble.”
With state censors, Fisk said the bubble is “far more insidious because you’re talking about citizens being completely uninformed about things like massive government negligence bordering on atrocities at times, while in the Facebook case, you might get a link to — minor stuff in comparison.”
Photo: David Goehring