Hacker News
Improving privacy and security on the web (chromium.org)
121 points by migueldemoura on May 7, 2019 | 60 comments


Wow, if this works, this is basically the end of CSRF. Essentially: CSRF relies on an HTTP POST to VICTIM.COM triggered by HTML on EVIL.COM, and that request carrying cookies. Today, even though SameSite exists, the default --- SameSite=None --- maintains that longstanding status quo. But after the change, the Chrome default will be SameSite=Lax, and while EVIL.COM will still be able to trigger POSTs to VICTIM.COM, those requests will no longer carry cookies.

To get the cookies to work from EVIL.COM, VICTIM.COM's developers will have to explicitly set SameSite=None on their session cookies. Which nobody will do, because nobody sets SameSite at all today.

Better still: 99 out of 100 CSRF exploits (maybe 999 out of 1000) target endpoints for which SameSite=None isn't needed; they're cookies nobody ever uses cross-site to begin with. There are only limited cases where anyone needs the behavior to change, and those cases don't track the most sensitive cookies.
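
For readers who want to see the difference concretely, here is a minimal sketch using Python's stdlib `http.cookies` (cookie name and value are made up):

```python
from http.cookies import SimpleCookie

# Today: a typical session cookie with no SameSite attribute.
# Pre-change browsers treat this like SameSite=None and attach it
# to cross-site POSTs -- which is exactly what CSRF relies on.
legacy = SimpleCookie()
legacy["session"] = "abc123"
print(legacy.output())  # Set-Cookie: session=abc123

# After the change, Chrome treats the header above as if the server
# had explicitly emitted:
lax = SimpleCookie()
lax["session"] = "abc123"
lax["session"]["samesite"] = "Lax"  # requires Python 3.8+
print(lax.output())  # Set-Cookie: session=abc123; SameSite=Lax
```

The Lax cookie is still sent on top-level GET navigations to VICTIM.COM, but not on the cross-site POSTs that CSRF depends on.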

As a vulnerability researcher for whom exploitable bugs mostly exist to spark joy: good riddance to CSRF. It was a dumb bug class, and never, ever fun to exploit.


Before you start emitting SameSite=Lax, be warned that WebKit/Safari on iOS 12 has a weird implementation: if you set the cookie in the response to a POST and then redirect to a GET (like most default OAuth explicit-grant login flows), the cookie will not be sent, because they consider POST pages "unsafe". See https://bugs.webkit.org/show_bug.cgi?id=188165#c12 for more details.

Edit: it looks like people have found other issues since I last looked at this bug:

> It's a serious issue affecting many common user flows, including the flow of visiting a website coming from a GMail link. If the user comes from GMail, it reaches the destination website without any cookies, thereby breaking functionalities that depend on session/login cookie and CSRF cookie. Only fix for now seems to be removing the Lax flag from cookies. (https://bugs.webkit.org/show_bug.cgi?id=188165#c47)

At this point, it looks like the safest approach is to only emit SameSite if the browser isn't Safari.


We did just clarify this in the spec: https://github.com/httpwg/http-extensions/commit/49bcb4fddb8.... Hopefully that, plus the tests we'll add in https://bugs.chromium.org/p/chromium/issues/detail?id=960375 will help other vendors align their behavior with developer expectations.


Is a SameSite=Strict cookie sent if VICTIM.COM does a redirect?

If so, it sounds like the next trick would be a kind of "redirect gadget" that makes SameSite=Strict exploitable again; in that case, maybe it's only the end of CSRFs without a gadget.

E.g. a "redirect gadget": CSRF via VICTIM.COM/open-redirect?url=/update-password...


I read half of your comment, and thought: wow, this guy really explains things very well. Incredibly well. It's that simplicity and clarity that you only find in great teachers with very deep technical skills.

And then I saw your username... Ah, of course! :)


This is a continuation of a long arc of convergent work [1][2][3][4][5] by various people over several years; I've been following along [6].

The innovation of this proposal is to make cross-domain cookie transmission less insecure by default, by eventually making the current, unrestricted behavior opt-in.

This shifts developers' incentives: presumably those whose sites require cross-domain acceptance of cookies will modify their sites accordingly, while those whose sites don't, or who haven't thought about the issue, will see fewer incidences of the most egregious POST-based CSRF.

[1] https://www.microsoft.com/en-us/research/publication/atlanti... [2] https://bugzilla.mozilla.org/show_bug.cgi?id=795346 [3] https://github.com/mozmark/SameDomain-cookies/blob/master/sa... [4] http://homakov.blogspot.com/2013/02/rethinking-cookies-origi... [5] https://tools.ietf.org/html/draft-west-first-party-cookies-0... [6] https://news.ycombinator.com/item?id=13689697#13691022


This is great - the `SameSite=lax` attribute is arguably how cookies should have worked in the first place, and I'm quite pleased that it's an existing RFC and not a proprietary change being done just in Chrome. Hopefully other browsers follow suit.

What worries me is the vague commitment to stop browser fingerprinting - not a lot of detail there and I'm fearful that useful features might be getting crippled. I don't think I'm as convinced that browser fingerprinting is as big of an issue as CSRF (prevented by the cookie changes here). Time will tell I suppose.


The reason this is related to browser fingerprinting is that cross-site cookies aren't _just_ used for CSRF, they're also a way to track users across sites.

With this change, developers will have to _explicitly_ declare when they're using cookies for that purpose (by setting SameSite=None), which makes it easier for browsers to identify cookies used for tracking and give users control over them.
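
To make that concrete, here's a rough sketch (all header values invented) of how a browser or extension could mechanically flag cookies that declare cross-site use:

```python
# Once cross-site delivery must be declared via SameSite=None, flagging
# candidate tracking cookies becomes a simple attribute check.
# The Set-Cookie header values below are made up for illustration.
def declares_cross_site(set_cookie: str) -> bool:
    attrs = [part.strip().lower() for part in set_cookie.split(";")]
    return "samesite=none" in attrs

headers = [
    "session=abc123; Secure; HttpOnly; SameSite=Lax",
    "tracker_id=xyz789; Secure; SameSite=None",  # declared cross-site
]
flagged = [h.split("=", 1)[0] for h in headers if declares_cross_site(h)]
print(flagged)  # ['tracker_id']
```

A browser setting like "clear cross-site cookies on exit" would then have an unambiguous population of cookies to act on.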


Interesting. So apparently Chrome is going to stop sending cookies in cross-site requests unless they're created with `SameSite=None` and the page is loaded over HTTPS?


We're proposing treating cookies as `SameSite=Lax` by default (https://tools.ietf.org/html/draft-ietf-httpbis-rfc6265bis-03...). Developers would be able to opt-into the status quo by explicitly asserting `SameSite=None`, but to do so, they'll also need to ensure that their cookies won't be delivered over non-secure transport by asserting the `Secure` attribute as well.

https://tools.ietf.org/html/draft-west-cookie-incrementalism spells out the proposal in a bit more detail.
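
The combined rule can be sketched as a small acceptance check (the helper name and logic are a paraphrase of the two proposals, not Chrome code):

```python
from typing import Optional

def cookie_accepted(samesite: Optional[str], secure: bool) -> bool:
    # Proposal 1: a cookie with no SameSite attribute defaults to Lax.
    effective = samesite if samesite is not None else "Lax"
    # Proposal 2: SameSite=None is only honored alongside Secure, so
    # opted-in cross-site cookies never travel over plaintext HTTP.
    if effective.lower() == "none":
        return secure
    return True

print(cookie_accepted(None, secure=False))    # True (treated as Lax)
print(cookie_accepted("None", secure=False))  # False (rejected)
print(cookie_accepted("None", secure=True))   # True (explicit opt-in)
```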


This is exactly the information I was looking for when I opened the Chromium blog post. Technical and to the point. Is there a reason this couldn't be appended to the blog post?


Because you aren't really the audience for that post, and they can safely assume you'll dig deeper anyways?


If we're not the audience then who is? This was made to the Chromium open source blog, which is typically a developer heavy blog (with previous topics like "Hint for low latency canvas contexts"). Throwing a few reference links at the bottom shouldn't harm their message with the less technically savvy.


I'm just guessing. Something else that sparks joy for me: the fact that Google will never give any of their announcements the titles they could justify, like "OMG, WE KILLED CSRF!", and that I'll have to dig in a bit to see how big a deal what they just did is. It's like every "Improving privacy and security on the web" is a little gift I get to unwrap. It's like Justin Schuh and Mike West's version of "one more thing".


Good feedback. Chromium blog content varies from product announcements to technical summaries. We did link to this article in the Chromium blog:

<https://web.dev/samesite-cookies-explained/>

which does go into a good deal of technical detail. A challenge is that even experienced web developers didn't know much about SameSite prior to this announcement.


A few more reference links might not hurt, true.

As for who the audience is, perhaps people who were alarmed by the WSJ's scaremongering: https://www.wsj.com/articles/googles-new-privacy-tools-to-ma...

Someone leaked the news to the WSJ, which tried to spin this as an anti-competitive move.


Have you done some crawling, or checked by other means, how many web pages with a login your change would break?


Unfortunately, crawling isn't a terribly effective way of evaluating breakage, as the crawler doesn't sign in, and therefore doesn't attempt to federate sign-in across multiple sites. That's part of the reason that we're not shipping this change today, but proposing it as a (near-)future step.

To that end, we've implemented the change behind two flags (chrome://flags/#same-site-by-default-cookies and chrome://flags/#cookies-without-same-site-must-be-secure) so that we can work with developers to help them migrate cookies that need to be accessible cross-site to `SameSite=None; Secure`.

Ideally, we won't unintentionally break anything when we're confident enough to ship this change.


I think this is a well thought out proposal. Thank you for going in that direction. Privacy and security should come first.

I would even go so far as to substitute "should" with "must". But it is a first step.


The easiest way to improve cookie privacy is to block 3rd party cookies by default. Adding new polices is not the right solution. 3rd party cookies are completely unnecessary.


This will never happen. Google's entire ecosystem is built around advertising, so everything is an illusion of privacy. Things like 3rd-party cookies will remain the norm and Google won't block them, although Firefox has started to head in that direction.

Chrome mobile, for example, could support add-ons and privacy features, but the risk to ad revenue prevents Google from making them available. Chrome being the default browser on Android means most people won't switch, and they have roughly 80% market share.


Safari's Intelligent Tracking Prevention effectively blocks third-party cookies, at least for resources that are used in a third-party context on multiple sites. This Google Chrome proposal also does so, but provides a dead-simple workaround for trackers: sending SameSite=None and Secure with the cookie.


Let me guess: Google will tie the new proposal to a new Chrome setting which effectively identifies tracking cookies according to this new identifier, and will regularly purge them.


Is this another way for Google to prevent you from clearing their cookies via the 'Clear Cookies' option?

It's a step in the right direction with enforcing SameSite cookie scoping, but we must be cautious that Google doesn't use this to force you to always be logged in. Google has a long way to go to rebuild trust after that last browser login debacle. I don't trust 'em.


It's a way to fix one of the biggest security mistakes of the web (being able to send an _authenticated_ request - i.e. with cookies - to any domain from any other domain, for example from evil.com to youremailprovider.com with the payload "delete all emails"), that was kept on by default for two decades due to backward compatibility.

For a long time it required annoying workarounds (CSRF tokens) to mitigate this security hole, then just an opt-in flag on the cookies, but as usual, most companies don't know/care about it, so having protection by default is the natural solution (although it _will_ probably break quite a few legacy websites, but for the greater good).
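
For reference, the CSRF-token workaround looks roughly like this, a minimal stdlib-only sketch (function names and the session id are invented):

```python
import hashlib
import hmac
import secrets

SERVER_KEY = secrets.token_bytes(32)  # per-deployment secret (illustrative)

def issue_csrf_token(session_id: str) -> str:
    # Bind the token to the session so it can't be replayed across users.
    return hmac.new(SERVER_KEY, session_id.encode(), hashlib.sha256).hexdigest()

def verify_csrf_token(session_id: str, submitted: str) -> bool:
    # Constant-time comparison to avoid timing side channels.
    return hmac.compare_digest(issue_csrf_token(session_id), submitted)

token = issue_csrf_token("session-abc123")
print(verify_csrf_token("session-abc123", token))     # True
print(verify_csrf_token("session-abc123", "forged"))  # False
```

Every state-changing form has to embed the token and the server has to reject requests without a valid one; SameSite=Lax by default makes this boilerplate unnecessary for the common case.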


Does Chrome support automatically clearing cookies at shutdown yet? I seem to remember it didn't but I haven't used it recently.

Edit: I searched for it, and it seems they have added the feature, but maybe not the related feature of clearing browsing history at shutdown.


Isn't that exactly what Incognito mode does?


Google will absolutely not do anything in the name of trust and privacy as we define them, because their business model is 100% about selling ads to more people.


Why do cross-domain requests need cookies at all? Honest question, why couldn't we just stop sending them ever?


At my company, for example, we run services on multiple domains that are all authenticated by a single backend. It could probably be solved by some re-architecting, but changing cookie behaviour would definitely break existing sites.


Is there a timeline for this change?


Really disappointed in you Google for not addressing fingerprinting much sooner.


Be disappointed all you want, but they're not really "addressing" fingerprinting. It is exceptionally difficult --- computer-science Hard difficult --- to prevent fingerprinting; all you can really do is break popular libraries people use today. It's an arms race, and a much harder arms race than exploit-hardened runtimes are.


Is it? As I understand it, fingerprinting relies on properties and headers, both of which could be altered (e.g. adding headers for software that doesn't exist, changing their order only some of the time, etc.). You probably can't do it in a way that prevents websites from seeing the browser type, and you might not want to.


No, fingerprinting is much deeper than that. Look at how DNS cookies work, for instance; you can create a serviceable fingerprint of a device just from how resolvers randomize multiple A records for the same label.


It's actually pretty easy overall... just don't use JavaScript and don't send a User-Agent string that reveals browser/OS information.


No web browser is going to disable JavaScript by default and Google is one of the worst offenders when it comes to User-Agent sniffing (you can experiment with this on your own by setting a custom User-Agent and attempting to use various Google websites).

And even then, that's far from enough to stop fingerprinting. The ordering of HTTP headers has been used to fingerprint browsers. The <picture> element can be used to leak browser screen size. CSS can leak information in @media and @supports queries by requesting specific images. It's even possible to create "DNS cookies".

More:

HTTP/2 passive fingerprinting: https://www.akamai.com/us/en/multimedia/documents/white-pape...

Fingerprinting servers based on header order: http://www.net-square.com/httprint_paper.html#httpheader

List of CSS media queries, including vendor-specific ones: https://browserleaks.com/css

DNS cookies demonstration: http://dnscookie.com/


DNS cookies, by the way, are awesome.


User-agent strings should have been abolished long ago. It may not do much against sniffing, but it would spare us monstrosities like "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/64.0.3282.140 Safari/537.36 Edge/17.17134"


I was surprised to see how many popular websites rely on things like the User-Agent or Referer header to deliver functionality. As soon as you start tinkering with them, sites stop working, claim you're using an older browser, etc.


Back in the day, Safari 2.0.4 encountered a large amount of site breakage because sites were using userAgent.indexOf("4") != -1 as the test for "are you Netscape 4", and if you were then they'd use Netscape's layer APIs (Netscape's alternative to CSS). Any changes to the user agent are very scary; that's kind of how they ended up being as terrible as they are.

Yet sites still continue to use the user agent to gate access. The same sites also like to complain about how bizarre UA strings are.


Unfortunately websites break if you stop sending it or dramatically change it. Most browsers can't afford the risk of a dramatic change here.


Breaking things is (sometimes) the only way to progress.


You will be part of the rare minority that uses that configuration, which by itself is a fingerprint.


The point, probably, was to make that configuration the default.


That configuration will not be the default. You might just as productively argue that the best way to defeat fingerprinting would be to default to Gopher.


You can do a lot of things with just HTML and CSS, more than what Gopher allows.

And for what you can't, a banner asking for permission to run JavaScript, like we have/had for Java/Flash/ActiveX.


Can a user meaningfully determine the correct answer to that question?

The experience is “I clicked no and nothing worked” vs “I clicked yes and the site worked”.

I get that you don't like it, but the reality is that the web is a platform that includes JS as a core technology. The reason for limiting Java and ActiveX was that they had catastrophically terrible security properties, more or less by design.

Even Flash had problems, but was sensible enough to correct many deficiencies and defer to the browser for interaction with anything outside of its view, which is why you weren't asked about running Flash on every website you went to. JS and the various web/HTML/DOM APIs all have much, much stricter constraints than anything Flash had: they are designed to be safe in spite of all content being untrusted.

Moreover, dialogs like that are largely recognized among browser developers as a form of blame shifting: a regular user has no reasonable way to determine whether or not saying "yes" is safe. The purpose of asking them is so that if something does go wrong you can say "they shouldn't have clicked yes".


> Can a user meaningfully determine the correct answer to that question?

> The experience is “I clicked no and nothing worked” vs “I clicked yes and the site worked”.

I agree; but the extra click may be an incentive for web developers to try not to use JS.

> Which is why you weren't asked about running flash on every website you went to.

Firefox did ask about running Flash, because "attackers can also use the security flaws in Flash": https://support.mozilla.org/en-US/kb/set-adobe-flash-click-p...

> they are designed to be safe in spite of all content being untrusted

But they have flaws, like Flash.


Just as long as we're all clear that this is not a real debate, and a serious programming language connected to the DOM is not going anywhere; you are stuck with that design.


> just HTML and CSS,

Yes: @media queries, for example, trivially let the site fingerprint you again.


That would be pretty nice in many ways.


Browser and OS information is not the biggest info leak, it really isn't.

Compared to cookies in the request.

Timestamps in the request and the TLS header.

IP address, etc. etc.

People vastly overestimate the value of the data in the UA.


There's a lot of fingerprintable info in HTTP headers other than User-Agent and in parts of the User-Agent string other than the browser/OS versions.


This is a completely unrelated aspect of the modern web. Fingerprinting is a giant mess of edge cases to manage. For all my feelings on Chrome's privacy track record, this is an incredibly big improvement to security across the web. More specifically, this change to cookie policy is a "fix an entire class of security bugs" change, versus piecemeal fixes to individual instances of a bug class.


Not that any step towards additional privacy protections isn't a good thing, same for security. But Google has got to be one of the major contributors to the erosion of privacy.

How about Chrome nagging you to sign in? How about their very own ad networks?


You mean like Firefox Sync and Pocket? The so-called founder of JavaScript was the CEO of Mozilla, and that is the biggest cause of erosion of privacy on the web.


Absolutely. I should have the option to use a browser, be it Firefox or Chrome, without unwanted integrations or 'features'.

I'd like a browser that just browses the internet and respects my privacy. But apparently that's too much to ask.


Surf / Suckless: https://surf.suckless.org


You mean like Safari? ;D



