Application-monitoring company Sentry recently wrote about their experience removing cookies on their site, which allowed them to drop their cookie banner. I'm glad they wrote this up! But it also illustrates why many sites have cookie banners even when they don't seem to be doing anything risky.
I was curious whether their site actually did avoid setting any cookies, and visited it in a private browsing window with third-party cookies enabled. As I browsed around the site I noticed several cookies being set:
- Loading changelog.getsentry.com I received first-party cookies `_GRECAPTCHA`, `ph_phc_UlHlA3tIQlE89WRH9NSy0MzlOg1XYiUXnXiYjKBJ4OT_posthog`, and `_launchnotes_session`, plus third-party cookie `_GRECAPTCHA` on recaptcha.net.
- Loading try.sentry-demo.com I received first-party cookies `sentrysid`, `sc`, and `sudo`.
- Loading sentry.io/auth/login I received first-party cookies `__stripe_mid`, `__stripe_sid`, `session`, and `sentry-sc`, plus a third-party cookie `m` on m.stripe.network.
- Loading docs.sentry.io/product/performance/performance-video I received third-party cookie `__cf_bm` from player.vimeo.com.
It's possible there are others; I didn't run any sort of exhaustive search.
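If you want to check more exhaustively, here's a rough sketch of one way to automate it. This isn't how I checked (I just clicked around in a private window), and it assumes Playwright for Python is installed; a real browser engine matters because several of these cookies are set by JavaScript rather than by Set-Cookie headers, so a plain HTTP client wouldn't see them.

```python
# Sketch: load each page in a scripted browser and dump whatever cookies
# end up in the jar. Assumes `pip install playwright` and
# `playwright install chromium`.
from playwright.sync_api import sync_playwright

PAGES = [
    "https://changelog.getsentry.com",
    "https://try.sentry-demo.com",
    "https://sentry.io/auth/login/",
    "https://docs.sentry.io/product/performance/performance-video/",
]

with sync_playwright() as p:
    browser = p.chromium.launch()
    for url in PAGES:
        # Fresh context = fresh cookie jar, like a new private window.
        context = browser.new_context()
        page = context.new_page()
        page.goto(url)
        # Give client-side scripts a moment to set their cookies.
        page.wait_for_timeout(5000)
        for cookie in context.cookies():
            print(f"{url}\t{cookie['name']}\t{cookie['domain']}")
        context.close()
    browser.close()
```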
Now, Sentry does say:
For clarification, Sentry has removed all cookies, other than essential cookies that do not require site visitor consent. Work with your legal team to better understand which of your cookies qualify as essential cookies under the laws that apply to you.
And from the perspective of removing cookie banners this is right: you don't need to completely stop using cookies; you just need to limit your use of cookies to cases that are "strictly necessary in order to provide an information society service explicitly requested by the subscriber or user". For example, per the official guidance, some things that are ok include setting a cookie when someone logs in, chooses a setting like "dark mode", or adds an item to their shopping cart: you can't do what the user asked for without cookies. But even then, you have to be quite careful to stay within the narrow limits of this exception: the guidance clarifies that you should generally use an expiration of a few hours or configure the cookie to be deleted when the user closes their browser.
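To make that concrete, here's a minimal sketch of what the "strictly necessary" pattern can look like in practice. Flask is used purely as an example framework (it has nothing to do with Sentry's stack), and the route name is made up; the point is just that the user explicitly asked for something, and the cookie either dies with the browser session or expires after a few hours.

```python
# Minimal sketch of a "strictly necessary" cookie: the user explicitly asked
# for dark mode, and the cookie is short-lived. Flask is an arbitrary example.
from flask import Flask, make_response

app = Flask(__name__)

@app.post("/settings/dark-mode")
def enable_dark_mode():
    resp = make_response("dark mode on")
    # Option 1: a session cookie. With no max_age/expires it's deleted when
    # the user closes their browser.
    resp.set_cookie("theme", "dark", secure=True, samesite="Strict")
    # Option 2 (alternative): keep it for a few hours at most.
    # resp.set_cookie("theme", "dark", max_age=4 * 3600,
    #                 secure=True, samesite="Strict")
    return resp
```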
Looking over the cookies they're currently setting, it doesn't look to me like they fall within this exception:
- I don't see any reason why the changelog page would need to set cookies. When I asked, they said this was a known issue that they were working on fixing.
- While it's possible that the Sandbox implementation does fundamentally require cookies, because it's simulating a complex application you'd normally log in for, none of that is accessible when you first load the sandbox. It has a modal dialog asking for your work email address, and without it none of the functionality works. They should at least postpone setting the cookies until you've submitted the form.
- I don't see why a login page requires setting any cookies: while you do need to set a cookie if the user actually logs in, I just loaded the page. When I asked, they brought up CSRF prevention, but that's a bit of a strange one. CSRF is an attack where an attacker site directs the browser to submit a form to a victim site. If the victim site isn't taking steps to fight CSRF then it won't understand that the user didn't actually initiate this request, and the attacker can use this to take actions as the victim. But a CSRF vulnerability on a login page would imply that the attacker had already successfully phished the user, at which point they can already initiate any action they wish on behalf of the user.

  Even if these cookies are needed to prevent CSRF in a way I'm not seeing, I don't see why you would need one like `sentry-sc` that has a one-year expiration and a `SameSite=Lax` opt-in to being shared in third-party contexts. And since none of these use `Path` to scope themselves to just the login form, once you've visited that page you'll be sending cookies on every future pageview anywhere on the site. (There's a sketch of a more tightly scoped alternative after this list.)
- The Vimeo third-party cookie `__cf_bm` is more debatable. It's documented as an essential cookie "which is part of Cloudflare's Bot Management service and helps mitigate risk associated with spam and bot traffic." I think this is a grey area: I can't find any explicit official guidance on whether using cookies to detect bots fits within the e-Privacy exemptions.
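Here's the sketch promised above: roughly what a more tightly scoped CSRF cookie for a login form could look like, again using Flask as an arbitrary example and with made-up names for the cookie and route (this is not Sentry's actual setup). The point is just the attributes: `Path` limits the cookie to the login form, `SameSite=Strict` keeps it out of third-party contexts, and leaving off the expiration makes it a session cookie instead of a one-year one.

```python
# Sketch: if a login form really did need a CSRF cookie (double-submit
# pattern), it could at least be scoped narrowly. Names are illustrative.
import secrets
from flask import Flask, abort, make_response, render_template_string, request

app = Flask(__name__)

LOGIN_FORM = """
<form method="post" action="/auth/login/">
  <input type="hidden" name="csrf" value="{{ token }}">
  <input name="email"><input name="password" type="password">
  <button>Log in</button>
</form>
"""

@app.get("/auth/login/")
def login_page():
    token = secrets.token_urlsafe(32)
    resp = make_response(render_template_string(LOGIN_FORM, token=token))
    resp.set_cookie(
        "csrf", token,
        path="/auth/login/",   # only sent back to the login form, not site-wide
        samesite="Strict",     # never sent on cross-site requests
        secure=True,
        httponly=True,
        # no max_age/expires: deleted when the browser closes, not in a year
    )
    return resp

@app.post("/auth/login/")
def do_login():
    # Double-submit check: the form token must match the cookie set above.
    if request.form.get("csrf") != request.cookies.get("csrf"):
        abort(403)
    return "logged in (not really; this is just a sketch)"
```

And a fully cookieless alternative exists too: embed an HMAC-signed, time-limited token in the form and verify it server-side on submit, with no cookie involved at all.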
Overall I have a lot of sympathy for Sentry here. They're trying to be careful in how they use cookies, and I think the cookies they have are mostly very reasonable and shouldn't require a cookie banner. On the other hand, they do seem to me, as a lay person, to be out of compliance with the e-Privacy directive.
I think this is a good demonstration of why companies generally do choose to stick with cookie banners even though they're annoying. Technically-inclined users will say that as long as you're not doing anything nefarious you don't need to ask for consent, but the exceptions the regulations set out are really quite narrow and it's easy to go wrong.
(I'd love to see the regulations changed here: there's no reason to single out storing data on the client for special treatment. The general protections in the GDPR offer a much more consistent approach to data privacy, and I'm not convinced e-Privacy adds anything useful anymore. And then, of course, a regulation that leads to cookie banners and other consent walls everywhere that people mostly click through without reading is clearly not working well.)
Fair enough, although I put a little less weight on the undesirable precedent because I think that precedent is already largely being set today. (Once we have precedents for regulating specific functionality of both operating systems and individual websites, I feel like it’s only technically correct to say that the case for similar regulation in browsers is unresolved.)
Also, the current legal standard just says that websites must give users a choice about the cookies; it doesn’t seem to say what the mechanism for that choice must be. The idea that the choice must be expressed via the website’s interface and cannot be facilitated by browser features is just one interpretation, and I’d argue against it. I don’t see why browsers couldn’t create a ‘Do-Not-Track’-style preference protocol today for conveying a user’s request for necessary cookies vs all cookies vs an explicit prompt for selecting between types of optional cookies, nor any reason why sites couldn’t rely on that hypothetical protocol to avoid showing cookie preference prompts to many of their users (as long as the protocol specified that browsers must require an explicit user choice before specifying any of the options that can skip cookie prompts; defaulting users to “necessary cookies only” or the all-cookies-without-prompts setting would break the requirement for user choice).
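Purely for illustration, here's roughly what relying on such a signal could look like on the server side. The `Sec-Cookie-Consent` header name and its values are invented for this sketch; no browser sends anything like it today.

```python
# Hypothetical only: "Sec-Cookie-Consent" is an invented header standing in
# for the kind of browser-mediated consent signal described above.
from flask import Flask, request

app = Flask(__name__)

@app.get("/")
def home():
    consent = request.headers.get("Sec-Cookie-Consent")  # e.g. "necessary", "all"
    if consent in ("necessary", "all"):
        # The browser already collected an explicit user choice,
        # so no banner is needed; just honor the selection.
        show_banner = False
        allow_analytics = (consent == "all")
    else:
        # No signal (or an explicit "prompt" value): fall back to a banner.
        show_banner = True
        allow_analytics = False
    return f"show_banner={show_banner} allow_analytics={allow_analytics}"
```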
But we don’t see initiatives like that, presumably in large part because browsers don’t expect to see much adoption if they implement such a feature, especially since it’s the type of feature that requires widespread adoption from all parties (browser makers, site owners, and users) before it creates much value. Instead, lots of sites show cookie banners to you and me while we browse the web from American soil using American IP addresses, seemingly because targeting different users with different website experiences is just too sophisticated for many businesses. They evidently see this as a compliance requirement to be met at minimal cost rather than an opportunity to prioritize the user experience. I don’t see how the current dynamic changes as long as websites see this purely as a compliance cost to be minimized and each website still needs to maintain its own consent implementation.