I started collecting my thoughts on this topic at the start of 2020, a time just a short while and a whole era ago. Matters of heated discussion back then have made way for whole other concerns, but as you know firsthand, more and more of our daily lives is moving online, keeping privacy and tracking top of mind across the whole internet-based industry, whether or not you’ve already developed online privacy fatigue (which I get!).

Online privacy is one of those topics that have been boiling for quite some time now, but there’s so much going on under this umbrella term that it’s hard to tell what’s what. Between privacy scandals from the likes of Facebook, massive data leaks, new regulation developing and taking shape across the world, malicious state and non-state actors, and moves by the major browser makers (Google with Chrome, Apple with Safari, and Mozilla with Firefox), I’ll admit that even the most tech-savvy are honestly getting pretty confused here.

Having been part of Dynamic Yield, an experience optimization platform, since its early days, I’ve witnessed firsthand the developing concerns among our customers and across the martech industry. We often needed to clearly articulate the differences among reality, rumor, regulation, and sentiment. A vendor cannot survive merely on being legally in the right; you have to respond to sentiment from the field to stay relevant. As a vendor, you will at some point put your abstract ideas of ethics to the test by making clear decisions on what you won’t do for profit.

My goal here is to provide you with a clear look at this topic, focusing on the web:

  • Where things stand with the major browser makers, and (in my opinion) why
  • Which considerations vendors and brands should pay attention to in order to become future-proof

Meet the players: Google Chrome
Back in January, one hot topic in many publications covering martech and adtech was the then-imminent release of Chrome version 80. Most significantly, it put increased constraints on third-party cookies, in line with a few privacy-minded announcements coming from Google execs in the preceding months. 

Here is what the message was:

Within two years, third-party cookies would hopefully go the way of the Dodo as new, safe, and secure standard mechanisms emerge to replace them. The bad actors would have a much harder time, while legit ad revenue would continue flowing unhindered. Of course, all that is just as long as we let Google lead the way, with Chrome 80 being a concrete milestone in getting there.

Looking at what was actually delivered, however, points to something else. Indeed, Google has made an important step in terms of security, protecting logged-in user accounts from malicious hackers exploiting what’s known as Cross-Site Request Forgery (CSRF) attacks. It closed a gap that should never have been there from the get-go, had the birth of the web been less haphazard than it actually was. Any respectable website already has countermeasures in place, yet we know that small-but-destructive developer mistakes are made even in the best of families.

Overall, it’s a good change: cookies created by mybank.com are no longer attached to requests initiated from a malicious phishingport.com page, unless they are explicitly marked for cross-site use. Who inherently needs such behavior, you may ask? Well, mostly trackers of all kinds, of which there are probably thousands. Chrome does not actually prevent anyone from creating such cross-site cookies, just like in the good old days; you just have to explicitly mark them as such, thus providing better security around all the rest of your cookies. And for good measure, Google also forced such cookies to be delivered over encrypted connections only. But do these changes, in themselves, provide any privacy?
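The new rule can be sketched in a few lines. This is a simplified Python model of Chrome 80’s behavior, not browser code; the cookie names and values are purely illustrative:

```python
# Simplified model of the Chrome 80 rule: a cookie is attached to
# cross-site requests only if explicitly marked SameSite=None AND Secure.

def parse_attrs(set_cookie):
    """Parse the attribute part of a Set-Cookie header (lower-cased keys)."""
    parts = [p.strip() for p in set_cookie.split(";")]
    attrs = {}
    for part in parts[1:]:  # parts[0] is the name=value pair
        key, _, val = part.partition("=")
        attrs[key.lower()] = val or True
    return attrs

def usable_cross_site(set_cookie):
    """True if Chrome 80+ would attach this cookie to cross-site requests."""
    attrs = parse_attrs(set_cookie)
    return str(attrs.get("samesite", "")).lower() == "none" and "secure" in attrs

# A tracker cookie that keeps working cross-site after Chrome 80:
tracker = "uid=abc123; SameSite=None; Secure; Max-Age=31536000"
# A legacy cookie with no SameSite attribute: Chrome now defaults it
# to SameSite=Lax, so it is no longer sent cross-site.
legacy = "session=xyz789; Path=/"

print(usable_cross_site(tracker))  # True
print(usable_cross_site(legacy))   # False
```

Note that marking a cookie `SameSite=None` without `Secure` is rejected outright, which is how the encrypted-connection requirement is enforced.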

Well, not really. Google does promise to deliver extra controls that would let you clear or block these promiscuous cookies (while occasionally breaking something unintended?), but it’s not clear when or how that would be available. My guess: buried three levels deep inside “Settings,” where similar stuff appears today — at least until some cookie-replacement feature is widely adopted.

What I think Google is attempting here is conflating privacy and security, or hacking and tracking.

By capitalizing on its good track record with security, Google is trying to assure us that it’ll also take care of our privacy, in a form of industry self-regulation. Google has a lot invested in making the web more secure, notably with its relentless push to make all websites support encryption via HTTPS. A lot of Google’s business hinges on us feeling safe enough on the web. They do have a bunch of good-though-intricate ideas on how to keep targeted advertising and conversion measurement working in a more privacy-respecting manner. The key to their approach, however, lies in something they repeatedly mention: we need to do it in a responsible manner.

In Google’s own words: “Some browsers have reacted to these concerns by blocking third-party cookies, but we believe this has unintended consequences that can negatively impact both users and the web ecosystem. By undermining the business model of many ad-supported websites, blunt approaches to cookies [emphasis mine] encourage the use of opaque techniques such as fingerprinting (an invasive workaround to replace cookies), which can actually reduce user privacy and control.”

True to their gentle “don’t rock the boat” approach, on April 3 Google announced a temporary rollback of the cookie changes already rolled out globally, so that nothing breaks unintentionally for all of us now working and shopping from home.

Who, then, are these unnamed irresponsible browsers they mentioned?

With a 36% market share in the United States, and about half of that worldwide as of Feb. 2020, it’s Apple Safari first and foremost. Apple has been moving through a few iterations of its ITP (Intelligent Tracking Prevention) since 2017, unilaterally putting new limitations on cookies, then reversing or tweaking them, with little visibility into their decision-making process.

Since ITP 1.0 was launched, and up to the current version (2.3), Apple has been in a cat-and-mouse game with cross-site trackers, with every new iteration aimed at fighting the latest methods of circumventing Apple’s set rules. As one markedly “blunt” case in point, not only do they block third-party cookies by default, but they also rolled out a change limiting the lifetime of some first-party cookies to only seven days.

As they’ve found out, many tracking scripts loaded by websites relied on the ability to set cookies from the client side (rather than in a server-side response from that third party), effectively making such cookies “first party.” Since Safari has no way to know which cookies set in the client are “really” first-party, they basically capped ’em all. Now, the only long-lasting cookies are those set by servers in the website’s own domain in their response to the browser.
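The capping behavior can be sketched roughly like this. It’s a deliberately simplified Python model, assuming the seven-day figure described above; real ITP has more cases (shorter caps in some scenarios, and further rules added in later versions):

```python
# Simplified model of ITP's lifetime cap on script-set first-party cookies.
SEVEN_DAYS = 7 * 24 * 3600

def effective_lifetime(requested_seconds, set_by_script):
    """Lifetime (in seconds) Safari would actually grant a first-party cookie."""
    if set_by_script:
        # Written via document.cookie by any JavaScript running on the page,
        # including third-party tracking scripts: capped at seven days.
        return min(requested_seconds, SEVEN_DAYS)
    # Set-Cookie header in a response from the site's own server: honored as-is.
    return requested_seconds

ONE_YEAR = 365 * 24 * 3600
print(effective_lifetime(ONE_YEAR, set_by_script=True) // 86400)   # 7
print(effective_lifetime(ONE_YEAR, set_by_script=False) // 86400)  # 365
```

The key point: Safari cannot distinguish a tracker writing `document.cookie` from the site’s own code doing so, so both get the same cap.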

Apple has undoubtedly collected comprehensive statistics on how cookies are used across a very large corpus of websites and found that most users probably won’t notice anything breaking.

No one outside Apple knows what the next iteration of ITP might bring, and which websites would need to hastily tweak their code to avoid some features breaking. One thing I am sure of: Apple does not intend to “break the web” in a way that would motivate people to switch browsers. Any changes they are making are calculated to (mostly) hit their intended targets. If Apple were to outright ban server-set first-party cookies as well, the web would basically break as most website login mechanisms are based on them.

Ideology and realpolitik
Ideology always works best when aligned with self-interest; I think this holds true for Google, Apple, and virtually any commercial actor (and arguably, people at large). It doesn’t automatically label anyone evil, though. In my humble opinion, what matters is abiding by clear ethical guidelines exactly in those moments when it’s particularly tempting to bend them.

What’s in Apple’s interests, then? While they don’t want to break your browsing, they have a clear interest in you using native apps and them getting their revenue cut from paid apps and in-app purchases. However, keeping the ad revenue stream ongoing for makers of ad-supported apps has also been important to them — hence the rules still seem to be different and more lenient for apps than for the web.

That old Identifier for Advertisers (IDFA) that uniquely identifies any iOS device and is visible to all apps on it? It was on by default until iOS 14 was released last month. And despite Apple’s policies, some of the most popular third-party SDKs used by apps have been known to also collect user data for the benefit of the SDK maker. Apple is probably leading in privacy, but certainly not with the single-minded devotion it may have led us to believe in.

While I cannot predict what Apple will do in the future, I think both competing corporations will remain very careful with ad revenue, in the specific channels where it matters to each company. The degree to which they’ll agree on new privacy standards for the web remains to be seen.

Looking forward: The role of browsers
Putting on my old software architect hat for a minute, it’s clear that the privacy problem on the web goes back to the HTTP protocol itself. Unlike the tightly controlled walled gardens of the App Store or Facebook, the web wasn’t shaped by corporate interests. The band of experts who came together to work on the specs paid attention to hundreds of details; apparently, they simply did not foresee privacy as the big future pain. Lacking a spec, ad-hoc solutions sprang up to fill the gap.

To build a more private web, I think that privacy negotiation as a fundamental part of any client-server connection would be key. Looking at Apple and Google’s models for app permissions, it did take quite a few iterations of tweaking and tuning each model to make it clearer, simpler, and more “sane” to the user. A similar solution must evolve for the web as well, even with so many stakeholders and interests at the table.

What does this mean for brands and vendors?
As a brand, it’s safe to assume that you’ll have to do more to integrate third-party tools in order to ensure they work reliably across devices. Specifically, you will increasingly need to tweak your own codebase to let these tools work “in your name,” with your first-party cookies.
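As a hypothetical sketch of that “in your name” pattern: instead of letting vendor JavaScript set its ID cookie client-side (where Safari caps it at seven days), your own server can set it in its response, making it a long-lived, server-set first-party cookie. The cookie name `_vendor_uid` and the helper below are made up for illustration:

```python
# Hypothetical example: your server sets the vendor's ID cookie as a
# first-party, server-set cookie, so ITP's script-set cap doesn't apply.
from http.cookies import SimpleCookie
import uuid

def vendor_cookie_header(existing_uid=None):
    """Build the Set-Cookie value for a long-lived vendor ID cookie."""
    uid = existing_uid or uuid.uuid4().hex
    cookie = SimpleCookie()
    cookie["_vendor_uid"] = uid
    cookie["_vendor_uid"]["max-age"] = 365 * 24 * 3600  # one year
    cookie["_vendor_uid"]["path"] = "/"
    cookie["_vendor_uid"]["secure"] = True
    cookie["_vendor_uid"]["samesite"] = "Lax"  # first-party use only
    return cookie.output(header="").strip()

header = vendor_cookie_header(None)
print(header.startswith("_vendor_uid="))  # True
```

The trade-off is exactly the extra effort described above: this logic now lives in your codebase, under your domain, and your team owns it.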

Given that extra effort, and the level of trust you need to have in such tools, here are a few things you should do:

  1. Pick your vendors judiciously and strive to use fewer tools. The age of dropping dozens of third-party scripts onto your homepage is coming to an end. Any external tool must have a clear value proposition and the reputation to back it up and earn your trust.

  2. Educate yourself! Read up and be informed, so that you’re able to ask your vendors the real hard questions.

  3. Consider a gradual shift from client script-based integrations to using vendor APIs, to have complete control of when and how you use a vendor’s product.
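A server-side (API) integration like the one in point 3 might look roughly like this. The endpoint URL, payload fields, and auth header are all hypothetical, stand-ins for whatever your vendor’s actual API reference specifies:

```python
# Hypothetical sketch of calling a vendor's server-side API directly,
# instead of dropping their script into the page. All names are made up.
import json
import urllib.request

def build_event_request(api_key, user_id, event):
    """Build (but don't send) a POST request reporting an event to the vendor."""
    payload = json.dumps({"userId": user_id, "event": event}).encode()
    return urllib.request.Request(
        "https://api.example-vendor.com/v1/events",  # hypothetical endpoint
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer " + api_key,
        },
        method="POST",
    )

req = build_event_request("secret-key", "user-42", "purchase")
print(req.get_method())  # POST
# urllib.request.urlopen(req) would actually send it; omitted here.
```

With this approach, nothing runs in the visitor’s browser on the vendor’s behalf: you decide exactly when the call happens, what data it carries, and can drop the vendor without touching client code.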

If you’re a vendor, play it safe and adhere to current best practices. The customer (and their IT & security folks) will have to be more involved, and some of the out-of-the-box functionality you’re now offering may just have to go until a modern replacement is viable. In such a case, make sure the core of your value proposition is still in place, and start educating customers early.

Note that we did not cover the aspect of evolving regulation in this post (GDPR, CCPA, etc.), which is a whole other complex field. In my opinion, this is another driver pushing brands to cut down on the jungle of tools and invest more in making their core tools work well. Believe it or not, any vendor is also in a similar position, so we’ve felt the same need.

For a more private web to materialize, brands, vendors, and browsers would all have to proactively take part. The old adage of “no free lunch” still applies.