By Jennifer M. Oliver
Google says cookies are good. They are critical to generating advertising revenue, and they help the company improve its products from both functionality and privacy perspectives.
Competing search engines do more to give users control over how these tiny tracking files monitor them. In April, Google said it would move in that direction. It has since come out strongly against their complete elimination, however, arguing that it would be bad for privacy: without cookies, Google says, riskier tracking methods will proliferate. This increasingly heated debate comes at a time when, just this month, Google’s YouTube agreed to settle federal charges that it illegally tracked children.
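To see why these files matter in the tracking debate, consider how a third-party cookie works in practice. The sketch below is a simplified simulation in Python — the class and field names (`AdServer`, `track_id`) are hypothetical, not Google’s or any real ad network’s code. The same ad server is embedded on two unrelated sites; because the browser re-sends the tracker’s cookie with each request, one anonymous ID ends up linking both visits.

```python
import uuid

class AdServer:
    """Hypothetical third-party tracker embedded on many sites."""

    def __init__(self):
        self.profiles = {}  # cookie ID -> list of pages where this visitor was seen

    def handle_request(self, page_url, cookies):
        uid = cookies.get("track_id")
        if uid is None:
            uid = uuid.uuid4().hex     # first sighting: mint a unique ID
            cookies["track_id"] = uid  # analogous to a Set-Cookie response header
        self.profiles.setdefault(uid, []).append(page_url)
        return cookies

tracker = AdServer()
browser_cookie_jar = {}  # the browser re-sends this to the tracker's domain

# Two unrelated sites both embed the same tracker:
tracker.handle_request("https://news.example/story", browser_cookie_jar)
tracker.handle_request("https://shop.example/cart", browser_cookie_jar)

uid = browser_cookie_jar["track_id"]
print(tracker.profiles[uid])  # both visits, linked to a single ID
```

Blocking third-party cookies, as Firefox and Safari now do by default, breaks exactly this linkage: the second site’s request arrives without the ID, so the tracker cannot connect the two visits.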
Despite Google’s public statements, as Wall Street Journal reporter Patience Haggin put it, a “full-on cookie crackdown isn’t happening.” Haggin also quoted Jonathan Mayer, an assistant professor of computer science at Princeton University, who said, “This notion that blocking cookies is bad for privacy is completely disingenuous.”
Enhanced Tracking Protection
Google’s Chrome isn’t as strict about cookies as Firefox, Safari and DuckDuckGo are, and these competitors know it. On Sept. 3, for example, Mozilla Corp. announced that its Firefox desktop and Android versions will — by default — “empower and protect all our users by blocking third-party tracking cookies and cryptominers.” Firefox’s “Enhanced Tracking Protection” will be the default for all users worldwide. As of June, new users were already enjoying this enhanced protection. Anti-tracking is “core to delivering on our promise of privacy and security as central aspects of your Firefox experience,” Mozilla said.
Apple’s Safari allows Mac users to prevent trackers from using cookies and website data to track them, and has “block all cookies” and “remove stored cookies and data” options, which means “websites, third parties, and advertisers can’t store cookies and other data on your Mac,” according to the company.
Google’s online policies explain what it is tracking: user search terms, videos watched, interactions with content and ads, voice and audio information, purchases, people with whom users communicate, activity on third-party sites and apps, user GPS locations, and information about equipment near mobile devices, such as Wi-Fi access points, cell towers, and Bluetooth-enabled devices.
But the company says it’s trying to be a good guardian of private information. Google is socializing a proposal to “give people more visibility into and control of the data used for advertising,” and says it wants more feedback from industry and other stakeholders.
The company says discussion of the proposal – which Google has posted online – should revolve around three principles: users should have transparency, choice and control over how they are tracked.
1. Users should have transparency. They should be able to easily see and understand how their data is being collected and used for ads.
2. Users should have choice. Their choices about how they experience the web should be respected and any attempts to bypass those choices should be prevented.
3. Users should have control. They should have the ability to adjust how their data is collected and used to tailor the ads they see, including whether those ads are personalized at all.
Beyond Search Engines: Websites Track Us, Too
In a recent post, DuckDuckGo wrote that tracking restrictions cannot end with search engines and called for stricter anti-tracking legislation.
“Our recent study on the Do Not Track (DNT) browser setting indicated that about a quarter of people have turned on this setting, and most were unaware big sites do not respect it,” according to DuckDuckGo. “That means approximately 75 million Americans, 115 million citizens of the European Union, and many more people worldwide are, right now, broadcasting a DNT signal. All of these people are actively asking the sites they visit to not track them. Unfortunately, no law requires websites to respect your Do Not Track signals, and the vast majority of sites, including most all of the big tech companies, sadly choose to simply ignore them.”
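The “broadcast” DuckDuckGo describes is literally a single HTTP request header: browsers with the setting enabled send `DNT: 1` with every request, and nothing in U.S. law obliges a site to act on it. A minimal sketch of what honoring the signal server-side would look like — the function name is an assumption for illustration, not any framework’s API:

```python
def should_track(request_headers: dict) -> bool:
    """Return False when the visitor is broadcasting a DNT: 1 signal.

    A site that respects Do Not Track would gate its analytics and
    ad-tracking code on a check like this; most sites simply never
    read the header at all.
    """
    return request_headers.get("DNT") != "1"

print(should_track({"DNT": "1"}))  # False: user asked not to be tracked
print(should_track({}))            # True: no signal sent
```

The proposed Do-Not-Track Act would, in effect, make ignoring this one-line check a legal liability rather than the industry default.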
In May, DuckDuckGo proposed a Do-Not-Track Act of 2019 that would ban third-party tracking as a default setting and would not allow first-party tracking beyond users’ expectations. Shortly afterwards, Sen. Josh Hawley (R-MO) proposed legislation with language similar to the DuckDuckGo proposal.
YouTube’s ‘Infinite Smorgasbord’
In the midst of these escalating privacy concerns, Politico first reported that the Federal Trade Commission voted to fine Google “$150 million to $200 million” to settle accusations that Google’s YouTube is illegally collecting personal information about children in violation of the Children’s Online Privacy Protection Act (COPPA). On Sept. 4 the FTC announced that the proposed settlement was $170 million, requiring the company to pay $136 million to the FTC and $34 million to New York. The commissioners voted for the settlement 3-2.
The proposed settlement also requires Google and YouTube to “develop, implement, and maintain a system that permits channel owners to identify their child-directed content on the YouTube platform so that YouTube can ensure it is complying with COPPA.” The companies must: notify channel owners that their child-directed content may be subject to COPPA; comply with the law; provide notice about their data collection practices; and obtain verifiable parental consent before collecting personal information from children.
The amount of the FTC fine has been criticized. Politico quoted Josh Golin of the Campaign for a Commercial-Free Childhood as saying that any fine should both “level the playing field” and deter violations of COPPA. “This fine would do neither,” Golin told Politico reporter Margaret Harding McGill. Given how quickly Google and YouTube can make that amount in ad revenue, the fine should have been “at least half a billion dollars,” Jeff Chester, executive director of the Center for Digital Democracy, told McGill. “It sends the signal that you in fact can break a privacy law and get away largely scot-free,” he said.
Dissenting FTC Commissioner Rebecca Kelly Slaughter said that, given this relatively low fine, she was concerned that “the vast universe of content creators” would conclude in their cost-benefit analyses that “the perceived payoff of monetizing child-directed content through behavioral advertising outweighs the perceived risk of being caught violating COPPA.”
She said that unlike the old three-channel world of Saturday morning cartoons, YouTube is a “virtually infinite smorgasbord of content with … more than 23 million channels that upload a combined 500 hours of video every minute.” Much of that content comes from small producers outside the U.S. who would be difficult to investigate, and it will be expensive for them to secure a designation that their content is produced for children, she said. Given how difficult it would be to enforce COPPA globally, she argued, the FTC’s order falls short of its goals.
“The order does not require YouTube to police the channels that deceive by mis-designating their content, such as by requiring YouTube to put in place a technological backstop to identify undesignated child-directed content and turn off behavioral advertising,” Slaughter wrote.
As with all privacy issues, this one comes down to disclosure and consumers’ degree of control. Most browser customers value the services and convenience cookies provide and would opt to keep them if given the option. Who wants to reset a dozen usernames and passwords every week? I would be shocked if cookies were banned altogether.
But, importantly, this episode shows us that, whatever they may claim, the “FANG” gang (Facebook, Amazon, Netflix and Google) does not view privacy assurances as an important competitive benchmark. That is why privacy self-regulation simply won’t cut it in the modern, shareholder-driven digital age. After all, DuckDuckGo may lead the pack on browser privacy, but most of you are reading this in a Chrome browser window.
Edited for MoginRubin LLP by Tom Hagy.