What is Google’s FLoC technology?

Photo: David Ramos (Getty Images)

About two weeks ago, millions of Google Chrome users were enrolled in an experiment they never agreed to take part in. Google had just launched a trial of Federated Learning of Cohorts – or FLoC – a new kind of ad-targeting technology designed to be less invasive than the average cookie. In a blog post announcing the trial, the company said it would only affect a “small percentage” of random users in ten different countries, including the US, Mexico, and Canada, with plans to expand globally as the tests go on.

These users probably won’t notice anything different when they open Chrome, but behind the scenes, the browser is keeping a close eye on every site they visit and every ad they click. Their browsing habits will be profiled, packaged, and shared with countless for-profit advertisers. Sometime this month, Chrome will give users a way to opt out of the experiment, according to a post on Google’s blog – but for now, their only option is to block all third-party cookies in the browser.

That’s if they even know these tests are happening in the first place. In my own reporting on FLoC so far, the strongest voices I’ve seen on the subject are either marketing nerds, policy wonks, or policy wonks who work in marketing. That may be because – aside from a few blog posts here and there – the only breadcrumbs Google offers people who want to learn more about FLoC are dense documentation pages, an inscrutable GitHub repo, and a dry mailing list. Even if Google had bothered to ask for consent before enrolling a random slice of its Chrome user base in this trial, chances are those users wouldn’t know what they were agreeing to.

(For the record, you can check whether you were enrolled in this initial test using this handy tool from the Electronic Frontier Foundation.)

Because Google doesn’t have a great track record of being upfront about its privacy practices, we decided to write up the basics of this technology, how the trial works, and why FLoC’s privacy promises aren’t, in fact, all they’re cracked up to be.

“WTF is a FLoC?”

In Google’s own words, it’s a “privacy-preserving mechanism for interest-based ad selection.” In plain human terms, it’s a way to track web users for ad targeting that’s friendlier to privacy than the cookies and code advertisers have relied on until now – at least, that’s what Google says.

“How is it supposed to work?”

It’s a bit complicated. When someone hops from one site to another on the web using a FLoC-enabled browser, that browser uses an on-device algorithm to assign them to a fitting “interest cohort,” and those cohorts get recalculated weekly. Each cohort, Google says, is made up of thousands of different users at a time, which makes tracking and targeting any one person’s specific browsing history nearly impossible for even the sleazier types in adtech.

As an example: I’m renovating my apartment, which means I spend a good two hours a day clicking around sites for stores like West Elm, Target, IKEA, and the like. In this situation, my browser could (quite accurately) peg me as a home decor nerd and lump me into a cohort of thousands of other people who also spend hours browsing couches.

In FLoC, each cohort gets an ID that’s a mix of letters, numbers, or both, so let’s call the home decor cohort HGTV, after the legendary channel of the same name.
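For the curious, here’s a rough, hypothetical sketch of the idea behind that cohort assignment. Google’s trial reportedly used a SimHash-style locality-sensitive hash over the domains a browser had visited; the snippet below is a heavily simplified TypeScript illustration of that concept, not Google’s actual implementation – the real system’s features, hashing, and cohort-size enforcement all differ.

```typescript
// Simplified, hypothetical SimHash-style cohort labeling. Not Chrome's code;
// it only illustrates how similar browsing histories can collapse into the
// same short label without the history itself ever leaving the device.
import { createHash } from "crypto";

// Hash a domain into a fixed-length bit vector.
function domainBits(domain: string, bits: number): number[] {
  const digest = createHash("sha256").update(domain).digest();
  const out: number[] = [];
  for (let i = 0; i < bits; i++) {
    out.push((digest[i >> 3] >> (i & 7)) & 1);
  }
  return out;
}

// SimHash: sum each bit position across all visited domains, then keep only
// the sign. Histories that overlap heavily agree on most positions, so they
// tend toward the same (or nearby) cohort labels.
function cohortLabel(visitedDomains: string[], bits = 16): string {
  const counts: number[] = new Array(bits).fill(0);
  for (const domain of visitedDomains) {
    domainBits(domain, bits).forEach((bit, i) => {
      counts[i] += bit ? 1 : -1;
    });
  }
  const bitString = counts.map((c) => (c > 0 ? "1" : "0")).join("");
  return String(parseInt(bitString, 2));
}

// Two heavy-on-home-decor histories agree on many bit positions:
console.log(cohortLabel(["westelm.com", "ikea.com", "target.com"]));
console.log(cohortLabel(["westelm.com", "ikea.com", "wayfair.com"]));
```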

The next time I visit a site for tips on, I don’t know, reupholstering my couch, that site can ask my browser which cohort I belong to. Upon learning that I’m part of the HGTV cohort, the site can track my on-site behavior and the couch ads I inevitably click, then aggregate that data with data from the other people in my cohort as it trickles in.
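In practice, during Chrome’s origin trial that “ask” was a single JavaScript call: an experimental document.interestCohort() method exposed to pages. Here’s a small sketch of how a site might have used it – treat the exact shape of the resolved value as approximate, since the API was experimental and never a stable web standard.

```typescript
// Hypothetical usage of the experimental document.interestCohort() API from
// Chrome's FLoC origin trial. It isn't in the standard DOM typings, hence
// the cast and the defensive checks.
async function readCohort(): Promise<void> {
  const doc = document as any; // experimental API, absent from lib.dom.d.ts

  if (typeof doc.interestCohort !== "function") {
    console.log("FLoC is not available in this browser.");
    return;
  }

  try {
    // During the trial the promise resolved to something like
    // { id: "14159", version: "chrome.2.1" }.
    const { id, version } = await doc.interestCohort();
    console.log(`Visitor belongs to cohort ${id} (version ${version}).`);
    // A site, or the ad scripts it embeds, could then attach this id to
    // its ad requests, as sketched a bit further down.
  } catch {
    // The call rejects when no cohort is available, e.g. if the user has
    // blocked third-party cookies or is browsing in incognito mode.
    console.log("No cohort available for this visitor.");
  }
}

readCohort();
```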

From time to time, aggregate data about what the HGTV cohort is into (reupholstering couches! Removable wallpaper! Granite countertops!) gets uploaded to whatever ad networks a given site works with.

Let’s just say the network in question is Google Ads, since nearly every site uses it. If I happen to browse an ad-supported news site – like the one you’re reading right now – after bingeing on couch content, that news site will also ask my browser for my cohort (HGTV).

Once established, my cohort ID gets passed along to the site’s partner ad networks, which naturally include Google’s. Based on the data that ad system has already gathered about this cohort (i.e., that it could probably use a new sofa), it digs through its catalog of ads from the roughly 7 million advertisers waiting to run. The ad platform finds an ad for a new couch and serves it on the news site, where I see it, immediately give up on the idea of reupholstering anything, and click.
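To make that hand-off concrete, here’s a hypothetical sketch of the kind of request a page’s ad code might fire once it knows the visitor’s cohort. The endpoint, parameters, and response shape are all invented for illustration; real ad networks use their own, far more elaborate protocols.

```typescript
// Hypothetical ad request that forwards a cohort ID to an ad network.
// The URL and JSON shape are made up; the general flow is the point:
// the cohort label travels with the request, the browsing history does not.
async function requestAd(cohortId: string): Promise<string> {
  const params = new URLSearchParams({
    cohort: cohortId,        // e.g. "HGTV" from the example above
    slot: "article-sidebar", // where the ad will render on the page
  });

  const response = await fetch(`https://ads.example.net/select?${params}`);
  const { creativeUrl } = await response.json();
  return creativeUrl;        // e.g. a banner for a suspiciously nice couch
}
```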

“How is this different from the tracking we have now?”

The trackers FLoC is meant to replace are known as “third-party cookies.” We have a nice, in-depth guide to how this kind of technology works, but in short: they’re snippets of code from adtech companies that websites can bake into the code underlying their pages. Those snippets monitor your on-site behavior – and sometimes other personal details – before the adtech outfit behind the cookie ships that data back to its own servers.
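For contrast, here’s a hypothetical, stripped-down sketch of what the server side of that third-party tracking can look like. The domain, cookie name, and logging are all made up; the point is that the tracker assigns each browser a persistent ID and learns which page embedded it every time the snippet loads.

```typescript
// Hypothetical third-party tracker endpoint (e.g. serving a 1x1 pixel).
// Every page that embeds it sends along the visitor's tracking cookie and a
// Referer header, letting the adtech server build a cross-site profile.
import { createServer } from "http";
import { randomUUID } from "crypto";

createServer((req, res) => {
  // Reuse the visitor's existing tracking ID, or mint a new one.
  const existing = /uid=([\w-]+)/.exec(req.headers.cookie ?? "")?.[1];
  const uid = existing ?? randomUUID();

  // The Referer header reveals which page embedded the tracker, so every
  // visit gets logged against the same persistent ID on the tracker's side.
  console.log(`visitor ${uid} loaded ${req.headers.referer ?? "unknown page"}`);

  // SameSite=None; Secure is what lets the cookie ride along in third-party
  // contexts, i.e. when this domain is embedded inside someone else's site.
  res.setHeader(
    "Set-Cookie",
    `uid=${uid}; Max-Age=31536000; SameSite=None; Secure; HttpOnly`
  );
  res.writeHead(200, { "Content-Type": "image/gif" });
  res.end(); // a real tracker would return an actual 1x1 GIF here
}).listen(8080);
```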

That’s one of the key differences between FLoC and the cookie hell we currently put up with. With FLoC, my thousands-strong cohort is the only thing an outside advertiser sees. Everything else – like the names of the sites I’ve visited or the details of the couches I’ve clicked on in the past – stays stored locally in my browser. With cookies, all of those details get sent to an outside server, where the company responsible has free rein: it can pawn that data off to other adtech companies, combine it with data from other cookie-slingers, or, in some cases, hand it over to police.

That’s why Google’s pitch sounds semi-appealing. Sure, you’re still being behaviorally profiled in a way that’s undeniably kind of gross, but at least you can’t be picked out of a lineup.

“There must be a catch here.”

The catch is that Google still gets all that juicy user-level data, because it controls Chrome. It’s also free to keep doing what it’s always done with that data: sharing it with federal agencies, accidentally leaking it, and, you know, just being Google.

“No way.”

Way.

“Isn’t that … anti-competitive?”

It depends on who you ask. UK competition authorities certainly seem to think so, as do trade groups here in the US. FLoC has also been swept up in a congressional probe, at least one class action suit, and a massive multi-state antitrust case led by Texas Attorney General Ken Paxton. Their unease with FLoC is pretty easy to understand. Google already controls about 30% of the US digital advertising market, just a bit more than Facebook – the other half of the so-called duopoly – which controls about 25% (for context, Microsoft controls about 4%).

While that dominant position has netted Google billions upon billions of dollars a year, it’s lately been offset by the multiple antitrust investigations mounting against the company. Those investigations have painted a fairly universal picture of Google as the flagrant autocrat of the ad-based economy – one that’s largely gotten away with ugly behavior because smaller rivals were too afraid, or unable, to speak up. That’s why so many of them are speaking up about FLoC now.

“But at least it’s good for privacy, isn’t it?”

Again, it depends on who you ask! Google thinks so, but the EFF certainly doesn’t. In March, the EFF published a detailed piece breaking down some of the biggest holes in FLoC’s privacy promises. If a given website prompts you to hand over certain kinds of first-party data – by having you sign up with your email address or phone number, for example – your FLoC ID isn’t really anonymous anymore.
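The EFF’s worry is easy to express in code. Here’s a hypothetical sketch (all names invented) of what a site can do the moment it holds both a first-party identifier and a cohort ID: store the pairing, at which point the cohort is no longer an anonymous crowd for that site or anyone it shares data with.

```typescript
// Hypothetical illustration of the de-anonymization the EFF describes:
// once a site pairs a first-party identifier with a cohort ID, the cohort
// stops being anonymous for that site and its data partners.
interface VisitorRecord {
  email: string;    // first-party identity from a signup or login form
  cohortId: string; // FLoC cohort read from the visitor's browser
  seenAt: Date;
}

const crmDatabase: VisitorRecord[] = [];

function recordSignup(email: string, cohortId: string): void {
  // This mapping can now be stored, enriched, shared, or sold like any
  // other piece of profile data.
  crmDatabase.push({ email, cohortId, seenAt: new Date() });
}

recordSignup("reader@example.com", "HGTV");
```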

On top of that hiccup, the EFF points out that your FLoC cohort follows you wherever you go on the web. That’s not a big deal if my cohort is just “people who like redoing their furniture,” but it gets genuinely fraught if a cohort ends up modeled around a person’s mental health condition or sexuality based on the sites they browse. While Google has committed to preventing FLoC from building cohorts around these kinds of “sensitive categories,” the EFF again pointed out that Google’s approach is full of holes.

“Behavior correlates with demographics in unintuitive ways,” wrote EFF technologist Bennett Cyphers. “It is very likely that some demographics will visit a different subset of the web than other demographics do, and that such behavior will not be captured by Google’s ‘sensitive sites.’”

“So… is this actually a better alternative to cookies?”

Is it, though?????????

“How can I explain all of this to my estranged uncle/parent/neighbor/nephew who has no technical knowledge but wants to know what the deal with FLoC is?”

Just remind them that this is a privacy product being pushed by Google. Google. That’s all they need to know.
