There has been more noise about WebRTC making it possible to track users. We have covered some of the nefarious uses of WebRTC, and how to look out for them, before. After reading a blog post on this topic covering some allegedly new, unaddressed issues a week ago, I decided to ignore it after some discussion on the Mozilla IRC channel. But since this has come up in the twitter-sphere again and Tsahi said ‘ouch’, here are my thoughts.
Claims
The blog post (available here) makes a number of claims about how certain Chrome behavior makes fingerprinting easier:
- Chrome recently started caching certificates for 30 days, creating a cookie-like privacy attack surface
- this allows cross-origin tracking of users
- the incognito mode behavior is inconsistent with respect to this
Caching certificates
First, there is a claim that the way Chrome caches certificates changed recently:
In the past, Google Chrome used to generate a new self-signed certificate for every WebRTC PeerConnection. But now (using Chrome 46, or maybe earlier as i did not check) it generates a self-signed certificate which is valid for one month and uses it for all PeerConnections of a particular domain.
The code used to demonstrate this behaviour is rather odd, too. It uses the getStats API to query the fingerprint, which is also available more easily in the SDP.
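For illustration, here is a minimal sketch (not taken from the demo page) of reading the same fingerprint from the offer SDP; it assumes a browser with promise-based createOffer:

// Minimal sketch: read the DTLS certificate fingerprint from the offer SDP
// instead of going through getStats (assumes promise-based createOffer).
var pc = new RTCPeerConnection();
pc.createDataChannel('probe'); // make sure the offer contains an m= section
pc.createOffer().then(function (offer) {
  var fingerprintLine = offer.sdp.split('\r\n').filter(function (line) {
    return line.indexOf('a=fingerprint:') === 0;
  })[0];
  console.log(fingerprintLine);
  pc.close();
});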
Chrome has cached certificates in this way for about two years; this is not real news. One of the reasons for this is that generating the private keys used for DTLS is rather expensive, especially on mobile devices. In the future, there will be more control over this behaviour. Neither Firefox nor Edge currently caches certificates.
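That additional control is shaping up as the certificates option of the RTCPeerConnection constructor together with RTCPeerConnection.generateCertificate. A sketch of how that looks per the spec (availability varies by browser version):

// Sketch of the spec'd control surface: pre-generate a certificate and hand it
// to the PeerConnection; how long the certificate is kept is then up to the app.
RTCPeerConnection.generateCertificate({ name: 'ECDSA', namedCurve: 'P-256' })
  .then(function (certificate) {
    var pc = new RTCPeerConnection({ certificates: [certificate] });
    // ... use pc as usual; reuse or discard the certificate as the app sees fit
  });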
To be fair, the WebRTC team made a serious blunder here. Until Chrome 45, the certificate was not cleared when cookies were cleared, only when all data was cleared. The bugfix for this only appeared in the Chrome 47 release notes:
Issue 510850 DTLS cert should be cleared when cookies are cleared
Cross-Origin Tracking
So this part is not really news. The second claim made in the blog post is that this enables cross-origin tracking:
To test this go to http://www.kapejod.org/tracking/test.html and to http://kapejod.org/tracking/test.html. Open the network tab of Chrome’s developer console and compare the urls of the requested “tracking.png”. They should contain the same fingerprint, now!
They do. Now, let’s look at this test page:
// make up some random id
var transactionId = 'xxxxxxxx-xxxx-4xxx-yxxx-xxxxxxxxxxxx'.replace(/[xy]/g, function(c) {var r = Math.random()*16|0,v=c=='x'?r:r&0x3|0x8;return v.toString(16);});
var fragment = document.createDocumentFragment();
var div = document.createElement("DIV");
div.innerHTML = '<iframe src="http://kapejod.org/tracking/identify.html?'+transactionId+'" width="1" height="1" style="display:none;"/>';
fragment.appendChild(div);
document.body.insertBefore(fragment, document.body.childNodes[document.body.childNodes.length - 1]);
It includes the URL http://kapejod.org/tracking/identify.html. Let’s look at the code there as well. It executes the code shown above and logs the fingerprint to the console:
console.log('your fingerprint is: ' + fingerprint);
Now why is the fingerprint the same? Well, the iframe is always included from kapejod.org, which means the Javascript is executed within the context of that origin. So Chrome can use the persisted fingerprint, as well as any cookies and localStorage data. The attack surface here is no worse than setting a cookie.
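To make the equivalence concrete, here is a hypothetical version of identify.html that uses a plain cookie instead of the certificate fingerprint; the tracking capability is the same (the cookie name is made up for illustration):

// Hypothetical identify.html using a cookie instead of the DTLS fingerprint.
function getTrackingId() {
  var match = document.cookie.match(/(?:^|; )trackingId=([^;]+)/);
  if (match) {
    return match[1]; // returning visitor: reuse the stored id
  }
  var id = Math.random().toString(16).slice(2);
  // persist for 30 days, mirroring the certificate lifetime
  document.cookie = 'trackingId=' + id + '; max-age=' + 30 * 24 * 3600;
  return id;
}
// request the tracking pixel with the persistent id, just like the demo does
new Image().src = 'http://kapejod.org/tracking/tracking.png?' + getTrackingId();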
Another thing related to this (and I am surprised it has not been mentioned yet) is the deviceIds returned by navigator.mediaDevices.enumerateDevices. Those are also persisted with the same lifetime as cookies. The W3C mediacapture specification has a paragraph about the security and privacy considerations of this:
The identifiers for the devices are designed to not be useful for a fingerprint that can track the user between origins, but the number of devices adds to the fingerprint surface. It recommends treating the per-origin persistent identifier deviceId the same way other persistent storage (e.g. cookies) is treated.
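For reference, a small sketch of how those deviceIds are read; run it twice on the same origin and you should see the same identifiers until cookies are cleared:

// List the per-origin persistent deviceIds (same lifetime as cookies).
navigator.mediaDevices.enumerateDevices().then(function (devices) {
  devices.forEach(function (device) {
    console.log(device.kind, device.deviceId);
  });
});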
Again, WebRTC and other HTML5 techniques increase the fingerprint surface. But by design, this is not worse than cookies or equivalent techniques like localStorage.
Incognito Mode
Last but not least, the blog post makes claims about incognito mode:
But to make it generate a new one you have to close ALL incognito tabs. Otherwise you can be tracked across multiple domains.
Again, this behaviour is consistent with how incognito mode handles things like localStorage, in both Chrome and Firefox. In incognito mode, open a site and set something in localStorage. Open another tab, then close the first one. Navigate to the same site and check localStorage. Boo!
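If you want to try it yourself, a snippet along these lines pasted into the devtools console makes the point (the key name is made up):

// Run once in an incognito tab, then again in a new incognito tab after closing
// the first one: the value survives as long as any incognito tab stays open.
if (localStorage.getItem('incognitoTest')) {
  console.log('survived: ' + localStorage.getItem('incognitoTest'));
} else {
  localStorage.setItem('incognitoTest', String(Date.now()));
  console.log('stored a new value');
}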
tl;dr
There is no real news here. In Germany, we call this ‘olle Kamellen’: old news.
{“author”: “Philipp Hancke”}
Aswath Rao says
Granting that the attack surface identified in the referenced articles is no more than what can be done with cookies, the conclusion shouldn’t be that there is no news here. Instead, the question to be answered is whether regulations regarding cookies should now be extended to WebRTC use cases. Am I wrong?