WEI? I’m a frayed knot

Two pieces of string walk into a bar.

The first piece of string asks for a drink.

The bartender says, “Get lost. We don’t serve pieces of string.”

The second string ties a knot in his middle and messes up his ends. Then he orders a drink. 

The bartender says, “Hey, you aren’t a piece of string, are you?”

The piece of string says, “Not me! I'm a frayed knot.”

Google is adding code to Chrome that will send tamper-proof information about your operating system and other software, and share it with websites. Google says this will reduce ad fraud. In practice, it reduces your control over your own computer, and is likely to mean that some websites will block access for everyone who's not using an "approved" operating system and browser. It also raises the barrier to entry for new browsers, something Google employees acknowledged in an unofficial explainer for the new feature, Web Environment Integrity (WEI).

If you’re scratching your head at this point, we don’t blame you. This is pretty abstract! We’ll unpack it a little below - and then we’ll explain why this is a bad idea that Google should not pursue.

But first…

Some background

When your web browser connects to a web server, it automatically sends a description of your device and browser, something like, "This session is coming from a Google Pixel 4, using Chrome version 116.0.5845.61." The server on the other end of that connection can request even more detailed information, like a list of which fonts are installed on your device, how big its screen is, and more. 
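Concretely, that self-description travels in the HTTP "User-Agent" header that the browser volunteers with every request. Here is a minimal, illustrative sketch of how a server might pull the advertised browser version out of that header; the example string and the `parse_chrome_version` helper are our own, not part of any standard:

```python
import re

# An example User-Agent string like the one described above; real strings
# vary by device, browser, and build.
EXAMPLE_UA = (
    "Mozilla/5.0 (Linux; Android 13; Pixel 4) AppleWebKit/537.36 "
    "(KHTML, like Gecko) Chrome/116.0.5845.61 Mobile Safari/537.36"
)

def parse_chrome_version(user_agent):
    """Return the Chrome version a User-Agent header advertises, or None."""
    match = re.search(r"Chrome/([\d.]+)", user_agent)
    return match.group(1) if match else None

print(parse_chrome_version(EXAMPLE_UA))  # 116.0.5845.61
```

The server simply reads what the client chose to say - a point that matters for everything that follows.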

This can be good. The web server that receives this information can tailor its offerings to you. That server can make sure it only sends you file formats your device understands, at a resolution that makes sense for your screen, laid out in a way that works well for you.

But there are also downsides to this. Many sites use "browser fingerprinting" - a kind of tracking that relies on your browser's unique combination of characteristics - to nonconsensually identify users who reject cookies and other forms of surveillance. Some sites make inferences about you from your browser and device in order to determine whether they can charge you more, or serve you bad or deceptive offers.

Thankfully, the information your browser sends to websites about itself and your device is strictly voluntary. Your browser can send accurate information about you, but it doesn't have to. There are lots of plug-ins, privacy tools and esoteric preferences that you can use to send information of your choosing to sites that you don't trust. 

These tools don't just let you refuse to describe your computer to nosy servers across the internet. After all, a service that has so little regard for you that it would use your configuration data to inflict harms on you might very well refuse to serve you at all, as a means of coercing you into giving up the details of your device and software.

Instead, privacy and anti-tracking tools send plausible, wrong information about your device. That way, services can't discriminate against you for choosing your own integrity over their business models.
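Because the self-description is just a header the client composes, sending a plausible-but-wrong one takes only a few lines. A minimal sketch using Python's standard library (the spoofed string below is purely illustrative - this is essentially what user-agent switcher extensions do on your behalf):

```python
import urllib.request

# Build a request that claims to come from desktop Chrome on Windows,
# regardless of what device and browser we're actually running.
req = urllib.request.Request(
    "https://example.com/",
    headers={
        "User-Agent": (
            "Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
            "AppleWebKit/537.36 (KHTML, like Gecko) "
            "Chrome/116.0.0.0 Safari/537.36"
        )
    },
)
# urllib.request.urlopen(req) would send exactly this self-description;
# the server has no way to check the claim against reality.
print(req.get_header("User-agent"))
```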

That's where remote attestation comes in.

Secure computing and remote attestation

Most modern computers, tablets and phones ship from the factory with some kind of "secure computing" capability. 

Secure computing is designed to be a system for monitoring your computer that you can't modify or reconfigure. Originally, secure computing relied on a second processor - a "Trusted Platform Module" or TPM - to monitor the parts of your computer you directly interact with. These days, many devices use a "secure enclave" - a hardened subsystem that is carefully designed to ensure that it can only be changed with the manufacturer’s permission.

These security systems have lots of uses. When you start your device, they can watch the boot-up process and check each phase of it to ensure that you're running the manufacturer's unaltered code, and not a version that's been poisoned by malicious software. That's great if you want to run the manufacturer's code, but the same process can be used to stop you from intentionally running different code, say, a free/open source operating system, or a version of the manufacturer's software that has been altered to disable undesirable features (like surveillance) and/or enable desirable ones (like the ability to install software from outside the manufacturer's app store).

Beyond controlling the code that runs on your device, these security systems can also provide information about your hardware and software to other people over the internet. Secure enclaves and TPMs ship with cryptographic "signing keys." They can gather information about your computer - its operating system version, extensions, software, and low-level code like bootloaders - and cryptographically sign all that information in an "attestation."
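The signing step can be sketched in a few lines. This toy version uses a symmetric HMAC from Python's standard library purely for illustration - real TPMs and secure enclaves use asymmetric keys provisioned at the factory - and `DEVICE_KEY` is a made-up stand-in for a secret the device's owner cannot extract or change:

```python
import hashlib
import hmac
import json

# Hypothetical stand-in for a key burned into the hardware at the factory;
# in a real enclave or TPM, the owner cannot read or replace it.
DEVICE_KEY = b"factory-installed-secret"

def attest(device_info):
    """Sign a description of the device so a server can verify it is unaltered."""
    payload = json.dumps(device_info, sort_keys=True)
    signature = hmac.new(DEVICE_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "signature": signature}

def verify(attestation):
    """Server side: recompute the signature and compare."""
    expected = hmac.new(DEVICE_KEY, attestation["payload"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, attestation["signature"])

att = attest({"os": "StockOS 1.0", "browser": "Chrome 116", "bootloader": "locked"})
print(verify(att))   # True: the unmodified report verifies
att["payload"] = att["payload"].replace("Chrome 116", "MyBrowser 1.0")
print(verify(att))   # False: a report edited by the user fails
```

The second check failing is the whole point of the design: the device owner can't alter the report without invalidating the signature.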

These attestations change the balance of power when it comes to networked communications. When a remote server wants to know what kind of device you're running and how it's configured, that server no longer has to take your word for it. It can require an attestation.

Assuming you haven't figured out how to bypass the security built into your device's secure enclave or TPM, that attestation is a highly reliable indicator of how your gadget is set up. 

What's more, altering your device's TPM or secure enclave is a legally fraught business. Laws like Section 1201 of the Digital Millennium Copyright Act as well as patents and copyrights create serious civil and criminal jeopardy for technologists who investigate these technologies. That danger gets substantially worse when the technologist publishes findings about how to disable or bypass these secure features. And if a technologist dares to distribute tools to effect that bypass, they need to reckon with serious criminal and civil legal risks, including multi-year prison sentences.

WEI? No way!

This is where the Google proposal comes in. WEI is a technical proposal to let servers request remote attestations from devices, with those requests being relayed to the device's secure enclave or TPM, which will respond with a cryptographically signed, highly reliable description of your device. You can choose not to send this to the remote server, but you lose the ability to send an altered or randomized description of your device and its software if you think that's best for you.

In their proposal, the Google engineers claim several benefits of such a scheme. But, despite their valiant attempts to cast these benefits as accruing to device owners, these are really designed to benefit the owners of commercial services; the benefit to users comes from the assumption that commercial operators will use the additional profits from remote attestation to make their services better for their users.

For example, the authors say that remote attestations will allow site operators to distinguish between real internet users who are manually operating a browser, and bots who are autopiloting their way through the service. This is said to be a way of reducing ad fraud, which will increase revenues to publishers, who may plow those additional profits into producing better content.

They also claim that attestation can foil “machine-in-the-middle” attacks, where a user is presented with a fake website into which they enter their login information, including one-time passwords generated by a two-factor authentication (2FA) system, which the attacker automatically enters into the real service’s login screen. 

They claim that gamers could use remote attestation to make sure the other gamers they’re playing against are running unmodified versions of the game, and not running cheats that give them an advantage over their competitors.

They claim that giving website operators the power to detect and block browser automation tools will let them block fraud, such as posting fake reviews or mass-creating bot accounts.

There’s arguably some truth to all of these claims. That’s not unusual: in matters of security, there are often ways in which indiscriminate invasions of privacy and compromises of individual autonomy would blunt some real problems.

Putting handcuffs on every shopper who enters a store would doubtless reduce shoplifting, and stores with less shoplifting might lower their prices, benefitting all of their customers. But ultimately, shoplifting is the store’s problem, not the shoppers’, and it’s not fair for the store to make everyone else bear the cost of resolving its difficulties.

WEI helps websites block disfavored browsers

One section of Google’s document acknowledges that websites will use WEI to lock out browsers and operating systems that they dislike, or that fail to implement WEI to the website’s satisfaction. Google tentatively suggests (“we are evaluating”) a workaround: even once Chrome implements the new technology, it would refuse to send WEI information from a “small percentage” of computers that would otherwise send it. In theory, any website that refuses visits from non-WEI browsers would wind up also blocking this “small percentage” of Chrome users, who would complain so vociferously that the website would have to roll back their decision and allow everyone in, WEI or not.
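The holdback idea can be sketched simply. The 5% rate below is entirely hypothetical - Google's explainer commits to no number, and the choice of number is exactly where the conflict of interest lies:

```python
import random

HOLDBACK_RATE = 0.05  # hypothetical; Google has not committed to any figure

def should_send_attestation(rng):
    """Decide whether this browser instance sends WEI at all.

    A site that hard-requires WEI would lock out roughly HOLDBACK_RATE of
    fully compliant browsers, which is meant to discourage hard requirements.
    """
    return rng.random() >= HOLDBACK_RATE
```

Whether this deters anyone depends entirely on how large the holdback is - a lever Google alone controls.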

The problem is, there are lots of websites that would really, really like the power to dictate what browser and operating system people can use. Think “this website works best in Internet Explorer 6.0 on Windows XP.” Many websites will consider that “small percentage” of users an acceptable price to pay, or simply instruct users to reset their browser data until a roll of the dice enables WEI for that site.

Also, Google has a conflict of interest in choosing the “small percentage.” Setting it very small would benefit Google’s ad fraud department by authenticating more ad clicks, allowing Google to sell those ads at a higher price. Setting it high makes it harder for websites to implement exclusionary behavior, but doesn’t directly benefit Google at all. It only makes it easier to build competing browsers. So even if Google chooses to implement this workaround, their incentives are to configure it as too small to protect the open web.

You are the boss of your computer

Your computer belongs to you. You are the boss of it. It should do what you tell it to. 

We live in a wildly imperfect world. Laws that prevent you from reverse-engineering and reconfiguring your computer are bad enough, but when you combine that with a monopolized internet of “five giant websites filled with screenshots of text from the other four,” things can get really bad.

A handful of companies have established chokepoints between buyers and sellers, performers and audiences, workers and employers, as well as families and communities. When those companies refuse to deal with you, your digital life grinds to a halt. 

The web is the last major open platform left on the internet - the last platform where anyone can make a browser or a website and participate, without having to ask permission or meet someone else’s specifications.

You are the boss of your computer. If a website sets up a virtual checkpoint that says, “only approved technology beyond this point,” you should have the right to tell it, “I’m no piece of string, I’m a frayed knot.” That is, you should be able to tell a site what it wants to hear, even if the site would refuse to serve you if it knew the truth about you. 

To their credit, the proposers of WEI state that they would like for WEI to be used solely for benign purposes. They explicitly decry the use of WEI to block browsers, or to exclude users for wanting to keep their private info private.

But computer scientists don't get to decide how a technology gets used. Adding attestation to the web carries the completely foreseeable risk that companies will use it to attack users' right to configure their devices to suit their needs, even when that conflicts with tech companies' commercial priorities. 

WEI shouldn't be made. If it's made, it shouldn't be used. 

So what?

So what should we do about WEI and other remote attestation technologies?

Let's start with what we shouldn't do. We shouldn't ban remote attestation. Code is speech and everyone should be free to study, understand, and produce remote attestation tools.

These tools might have a place within distributed systems - for example, voting machine vendors might use remote attestation to verify the configuration of their devices in the field. Or at-risk human rights workers might send remote attestations to trusted technologists to help determine whether their devices have been compromised by state-sponsored malware.

But these tools should not be added to the web. Remote attestations have no place on open platforms. You are the boss of your computer, and you should have the final say over what it tells other people about itself and its software.

Companies' problems are not as important as their users' autonomy

We sympathize with businesses whose revenues might be impacted by ad fraud, game companies that struggle with cheaters, and services that struggle with bots. But addressing these problems can’t come before the right of technology users to choose how their computers work, or what those computers tell others about them, because the right to control one’s own devices is a building block of all civil rights in the digital world.

An open web delivers more benefit than harm. Letting giant, monopolistic corporations overrule our choices about which technology we want to use, and how we want to use it, is a recipe for solving those companies' problems, but not their users'.
