July 1, 2022

The future of data collection

Final post in a six-part series on digital privacy, data collection, and the erosion of informed consent.

In early 2020, Google announced it would phase out support for third-party cookies in its Chrome browser. The announcement was called, somewhat dramatically, “the death of the cookie.” Firefox, Safari, and Microsoft Edge had already made third-party cookie blocking the default, but Chrome accounts for more than half of all web traffic on both mobile and desktop. When Chrome moves, the industry follows.

Apple’s counter-move

Apple took a different route with the rollout of iOS 14.5, which introduced App Tracking Transparency: the “Ask App Not to Track” prompt requires apps to request explicit permission before tracking user activity across other apps and websites. iOS 15 went further with Mail Privacy Protection, which masks IP addresses and blocks third parties from tracking email opens. A “Hide My Email” feature lets users create disposable addresses when signing up for services, keeping their real email private while still receiving communications. For email marketers, this was not a minor inconvenience. Open-rate tracking, one of the foundational metrics of email marketing, suddenly became unreliable for a significant portion of their audience.
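To see why Mail Privacy Protection matters, it helps to see how fragile the underlying mechanism is. The sketch below shows the basic shape of pixel-based open tracking, with a hypothetical tracker domain and token scheme (not any real vendor's API): an invisible one-pixel image is embedded in the email with a per-recipient URL, and the "open" is recorded when the recipient's mail client fetches it.

```python
# Minimal sketch of email open tracking. The domain, URL scheme, and
# recipient tokens are illustrative assumptions, not a real service.
import re

# A 1x1 transparent GIF: the smallest possible image, invisible in an email body.
TRACKING_PIXEL = (
    b"GIF89a\x01\x00\x01\x00\x80\x00\x00\xff\xff\xff\x00\x00\x00"
    b"!\xf9\x04\x01\x00\x00\x00\x00"
    b",\x00\x00\x00\x00\x01\x00\x01\x00\x00\x02\x02D\x01\x00;"
)

def pixel_tag(recipient_id: str) -> str:
    """Build the per-recipient image tag embedded in the email's HTML body."""
    return (f'<img src="https://track.example.com/open/{recipient_id}.gif" '
            f'width="1" height="1">')

opens: set[str] = set()  # recipient IDs whose mail client fetched the pixel

def handle_request(path: str) -> bytes:
    """Server side: when the image is fetched, record the open for that recipient."""
    match = re.fullmatch(r"/open/(\w+)\.gif", path)
    if match:
        opens.add(match.group(1))  # the "open" event, tied to one recipient
    return TRACKING_PIXEL  # serve the invisible image either way
```

Mail Privacy Protection breaks both halves of this scheme: images are fetched through Apple's proxy servers rather than from the reader's own IP address, and they may be preloaded regardless of whether the email was actually read, so the recorded "open" no longer corresponds to a human opening anything.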

These are not minor adjustments. They represent a structural shift in how the two most powerful platform companies position themselves relative to user data. The advertising industry has responded accordingly: email marketers are rethinking open-rate metrics that depended on tracking pixels, and digital advertisers are searching for alternatives to the cross-site tracking infrastructure they have spent two decades building. The infrastructure described in the earlier posts of this series (the cookies, the fingerprints, the invisible pixels) is being partially dismantled by the same companies that built it or profited from it.

Will DeKrey, HubSpot’s Group Product Manager of Campaigns, framed it this way:

Buyers get to be in charge of the data they share, not sellers. And big corporations shouldn’t get to create markets for tracking and selling personal data, giving them an information advantage over smaller businesses. This means that each individual company, large or small, will need to get better and better at building trusted relationships with their audience, earning the right to learn who they are and what they’re interested in.

The sentiment is appealing. It describes the world as many of us would like it to work: companies earning trust, users making real choices, the relationship between business and audience built on something other than extraction. But it is worth examining what is actually happening beneath it.

Privacy as competitive positioning

It is tempting to read these developments as a turning point, the moment the industry recognized that the current model was unsustainable and began correcting course. The reality is less clear. Google’s cookie deprecation has been delayed multiple times and has taken a form that keeps Google’s own first-party data ecosystem largely intact. Google still knows what you search for, what you watch on YouTube, where you go on Google Maps, and what you read in Gmail. Deprecating third-party cookies restricts the ability of other companies to track users across the web. It does not restrict Google’s. Apple’s privacy features, similarly, double as competitive positioning against advertising-dependent rivals. The iPhone becomes the privacy device, the walled garden becomes the safe garden, and the competitive advantage is real and measurable. The motivations are mixed, and the outcomes will depend on what replaces the old infrastructure as much as on its removal.

Consolidation, not liberation

There is a third possibility that receives less attention: that smaller companies and independent publishers, unable to compete with the first-party data empires of Google and Apple, will be pushed further toward surveillance-based models, not away from them. If cross-site tracking becomes the province of companies large enough to own the entire ecosystem, and smaller players lose access to the behavioral data that funded their operations, the result may be more consolidation, not more privacy. The question is whose privacy improves and at whose expense.

The pattern is familiar from the history of regulation described earlier in this series: the industry absorbs the constraint and reorganizes around it, often emerging more concentrated than before. Cookie consent mechanisms became design problems to be optimized. Privacy legislation created compliance surfaces that were gamed. And now the deprecation of third-party cookies may simply shift the advantage from networks that track across sites to platforms that own the sites themselves.

The two-sided market model that has financed the web since its commercialization is not going away. Someone will pay for content, either the user directly or an advertiser indirectly, and the advertiser will want to know something about who they are reaching. The question is not whether data will be collected but under what terms, with what transparency, and with how much genuine control on the part of the person whose data it is.

What changed for me

This is where my own position has shifted over the course of writing this series. I started my career in digital marketing treating data collection as a neutral tool: a way to understand audiences, improve targeting, and measure results. The research behind this series forced me to see the system from the other side, to understand not just how the tracking works but what it means for the people being tracked.

That shift is not comfortable. I have placed tracking pixels. I have configured cookie consent flows. I have sat in meetings where we reviewed click-through rates and conversion data without once asking where that data came from or whether the people it described knew they were being measured. The system was the water we swam in, and questioning it felt like questioning whether the work itself was legitimate. The research answered that question more clearly than I expected. The discomfort is productive. It should be.

The system is the model

At each stage of this trajectory, from the first banner ad to the current state of behavioral surveillance online, the asymmetry between what users know and what is done with their data has grown. Regulation has struggled to close the gap, technical solutions have been met with technical workarounds, and dark patterns have turned the mechanisms of consent into mechanisms of compliance. The system is not broken. It is working exactly as designed. The tracking infrastructure, the data economy, the privacy erosion, the manipulation of consent: these are not failures of an otherwise sound model. They are the model.

The question is whether the design is acceptable. And it is a design question, not only a policy question. Every interface, every default, every flow that makes data surrender easier than data protection is an argument about what the system values. Those arguments are being made thousands of times a day, in button colors and toggle positions and the number of clicks between a user and their own preferences. The people making those arguments know what they are doing. The people affected by them, mostly, do not.

Whether the current shifts by Apple and Google represent genuine change or a repositioning of power within the same extractive logic remains to be seen. What is clear is that the current system does not respect user privacy in any meaningful sense, and that the industry knows it. The interesting question now is not whether the model will change but who will shape what replaces it, and whether users will have any real say in that process.