Data Digest № 029

Datawallet
Published in Datawallet Blog · 11 min read · Mar 24, 2020

Welcome to Datawallet’s Data Digest, where I review and occasionally analyze the latest news and the most critical developments in the data industry. In a slightly longer edition, here’s a look at the latest developments:

A Highly Controversial Move: Google “Plans” To Phase Out Third-Party Cookies Within Two Years

Last week, a user on GitHub sparked controversy by pointing out an ostensibly hypocritical move by Google. Google’s recent proposal to the W3C outlined its grandiose plan to phase out third-party cookies within two years. Unlike the take-it-or-leave-it approach of its rivals, which has previously left the advertising and media industries feeling blindsided, Google claims that it will not take any “blunt approaches to cookies”― somewhat lessening the blow. However, for over seven years, Google has had a proprietary feature within its Chrome browser that allows it to profile users on any Google-run site, or any site that uses Google (DoubleClick) to display ads.

Ad tech share prices have “declined in the wake of Apple’s continued efforts to strengthen the Intelligent Tracking Prevention function in Safari,” and “publishers have reported precipitous drops in programmatic ad revenue.” Convincing the ad industry that similarly dismal effects won’t ripple through the ecosystem when Google removes its primary source for tracking and targeting is an ambitious feat. Chrome holds 64% of the global browser market, so any change Google makes will significantly affect the corners of the digital advertising industry it dominates, reinforcing severe skepticism within the publishing industry.

Antitrust investigators are carefully analyzing Google’s ad products and its business rationale for recent acquisitions of ad tech companies such as DoubleClick and AdMob. A Chrome without third-party cookies could strengthen Google’s business at the expense of its partners: the publishers and ad tech companies who rely on user data from third-party cookies for tracking and targeting. Concerns that companies will turn to alternative, more invasive methods such as fingerprinting have underpinned Google’s reasoning for a slow and gentle transition. Meanwhile, the “SameSite cookie update,” which requires publishers and ad tech vendors to explicitly label third-party cookies, goes live next month. Michael Kleber, a privacy and tracking prevention software engineer for Google Chrome, has announced that the web should shift to “more right-sized APIs” that stop the “unfettered tracking” of individuals — debatable.
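For readers unfamiliar with the mechanics, the SameSite update comes down to a single cookie attribute: cookies used in a third-party context must be explicitly labeled SameSite=None (and marked Secure), while unlabeled cookies are now treated as first-party only. A minimal sketch of the two cases, with an illustrative helper that is not a real browser or server API:

```python
def set_cookie_header(name, value, cross_site=False):
    """Build a Set-Cookie header under Chrome's new SameSite rules.
    This helper is illustrative; real servers set these attributes
    through their web framework's cookie APIs."""
    if cross_site:
        # Third-party use must be declared explicitly and sent over HTTPS.
        return f"Set-Cookie: {name}={value}; SameSite=None; Secure"
    # Unlabeled cookies now default to first-party-only (SameSite=Lax).
    return f"Set-Cookie: {name}={value}; SameSite=Lax"

print(set_cookie_header("session", "abc123"))
print(set_cookie_header("ad_id", "xyz789", cross_site=True))
```

In other words, a vendor that does nothing gets its cross-site cookies silently dropped — the explicit label is what the update forces into the open.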

Check out our blog post for an in-depth analysis of why Google’s move is fundamentally flawed.

Another “Big Screw Up” From Google…

Users who exported their Google Photos using the service Google Takeout between November 21st and 25th last year could have received incomplete archives, or archives containing videos belonging to complete strangers. Months after this serious leak of users’ private videos, Google nonchalantly apologized for the “inconvenience” in an email to the affected users (see below). Users affected by the breach, including Duo Security co-founder Jon Oberheide, are understandably angered by Google’s handling of such a large data breach involving some of their most sensitive property. It seems unlikely they will receive the redress they deserve.

Companies Are Dipping Their Toes In The Water With CCPA

The rather dismal results of the implementation of the California Consumer Privacy Act (CCPA) suggest that companies are waiting until July 1st (when the California Attorney General can start handing out penalties) to see how much they can get away with. After fierce lobbying from consumer activists, the CCPA was put in place to protect consumers: allowing them to manage their data better, see with whom their data is shared, and opt out of companies selling it. So far, however, implementation has seemed extremely lax. Data scientist Mark Rabin contacted several companies and data brokers to collect his personal information and said they either “give you a fire hose of information that is almost impossible to interpret, or they give you practically nothing.” He also requested that a company delete his data and had received no response almost two weeks later.

“Compliance is all over the map and will be until the rules are clear and there are actual penalties for noncompliance.” — Mary Stone Ross, associate director of the Electronic Privacy Information Center.

The largest technology companies, which the law attempts to regulate, maintain that many of the online activities they engage in do not constitute a “sale” of consumer data, and that, therefore, no changes to their practices are needed. Whether the California AG agrees remains to be seen. Other companies, such as Amazon and Groupon, previously had no disclosed process for compiling consumer files and are therefore relying on the 45-day window to respond to consumer requests. Twitter sends users a JavaScript file that is difficult to open for anyone without a background in tech, and PayPal appeared to provide an inoperable toll-free phone number. Ultimately, it looks as though most companies are going to make things as difficult and confusing as possible for consumers until the California AG says otherwise.
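Twitter’s export illustrates the point: the archive arrives as JavaScript files that wrap a JSON payload in a variable assignment, which most browsers and spreadsheet tools won’t open directly. A hedged sketch of unwrapping such a file — the wrapper format and sample data here are assumptions based on publicly documented exports:

```python
import json

def parse_export(js_text):
    """Strip the leading variable assignment (everything up to the
    first '=') and parse the remaining JSON payload."""
    _, _, payload = js_text.partition("=")
    return json.loads(payload)

# Invented sample mimicking the wrapper style of such exports.
sample = 'window.YTD.account.part0 = [{"handle": "example"}]'
print(parse_export(sample))
```

Trivial for a programmer, but a meaningful barrier for the average consumer the CCPA is supposed to serve.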

Salesforce Faces The First Data Breach Lawsuit To Cite The CCPA

On January 15th, the high-end children’s apparel store Hanna Andersson announced that hackers had scraped customer names, payment card numbers, and other personal information. A complaint filed against the company claims that the compromised data was subsequently found for sale on the dark web, and that it had been hosted on Salesforce infrastructure infected with the malware that led to the breach. Companies will be watching the case unfold closely to understand how easily consumers can seek damages under the private right of action granted by the new California law.

CalPRA Makes Its Way To The 2020 Ballot

Last fall, Californians for Consumer Privacy, the nonprofit behind the 2018 ballot initiative for the CCPA, announced a new ballot measure and subsequent amendments to significantly expand the CCPA. If enough signatures are gathered, the California Privacy Rights Act of 2020 (CalPRA), will make its way to the November 2020 California ballot. The non-profit needs a minimum of 623,212 signatures by June 25th, 2020, before the initiative can appear on the ballot. As businesses, privacy advocates and consumers push for a federal regulation to set a national standard for online privacy, it’s crucial to monitor privacy developments in California, other states, and at the federal level. The new initiative could radically alter the privacy landscape currently established in California and beyond.

The CalPRA proposes significant amendments to the CCPA, expanding the breadth of the notice, access, and deletion rights currently set under the CCPA, adding new privacy rights for consumers, and establishing an administrative enforcement regime. It also calls for a newly formed privacy agency with the sole responsibility of providing guidance and regulations on various issues. There has also been a concerted effort to give organizations guidance on how to comply with CalPRA before the new law’s operative date, something many feel has been lacking with the CCPA.

Given the organization’s success with CCPA Ballot Initiative “№17–0039”, which immediately attracted wide public support, it is likely to meet the signature threshold. During the push for the CCPA, it obtained twice as many signatures as were needed to put the matter to a referendum― forcing legislators, tech companies, and privacy advocates to act. Let’s see if it can do the same for CalPRA.

The FTC “Needs To Do More To Stop Corporations From Selling Your Private Data”

A letter sent to the Federal Trade Commission by Oregon Senator Ron Wyden, Ohio Senator Sherrod Brown, and California Representative Anna Eshoo alleges that Envestnet and its subsidiary Yodlee have violated the FTC Act by selling user data to a vast number of third parties without making this clear. Envestnet, a financial giant whose tools 15 of the 20 largest US banks use to provide personal financial services to their customers, asserts that the financial data is secure because it is anonymized. However, countless studies have shown that anonymization is ineffective: individuals can be re-identified easily from just a few snippets of information. The lawmakers criticized Envestnet for burying notifications deep in the fine print of its banking partners’ confusing terms and conditions, rather than clearly disclosing its data-selling practices itself.

“Envestnet should not put the burden on consumers to locate a notice buried in small print in a bank or apps’ terms and conditions or privacy policy, and then find a way to opt out — if that is even possible — in order to protect their privacy.”
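The weakness of “anonymization” is easy to demonstrate: a handful of quasi-identifiers, such as ZIP code, birth year, and gender, can often link a stripped record back to a name by joining it against a public dataset. A toy Python sketch, with all data and names invented:

```python
# "Anonymized" records: names removed, quasi-identifiers retained.
anonymized_transactions = [
    {"zip": "97201", "birth_year": 1984, "gender": "F", "purchases": 212},
    {"zip": "97405", "birth_year": 1991, "gender": "M", "purchases": 87},
]

# A public dataset (e.g. a voter roll or scraped profiles) with names.
public_records = [
    {"name": "Jane Roe", "zip": "97201", "birth_year": 1984, "gender": "F"},
    {"name": "John Doe", "zip": "97405", "birth_year": 1991, "gender": "M"},
]

def reidentify(anon_rows, public_rows):
    """Link 'anonymized' rows back to names by matching quasi-identifiers."""
    keys = ("zip", "birth_year", "gender")
    index = {tuple(p[k] for k in keys): p["name"] for p in public_rows}
    return [
        {**row, "name": index.get(tuple(row[k] for k in keys))}
        for row in anon_rows
    ]

for row in reidentify(anonymized_transactions, public_records):
    print(row["name"], "->", row["purchases"])
```

With real datasets the join is noisier, but the studies the lawmakers cite show that a few such attributes uniquely identify most individuals.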

A study discovered that about 60% of FTC employees have financial conflicts of interest with companies they are supposed to be regulating. A quick search shows that FTC employees are associated with companies under their watch. The fact that they have either previously worked for, or are hoping to soon be hired by, some of these companies suggests that there could be some serious corruption going on within the FTC.

Dystopia Ensues: Law Enforcement Embraces Clearview AI

Facial recognition software by Clearview AI has been described by the New York Times as a tool that might “end privacy as we know it.” The firm’s program scrapes social media networks to create a repository of billions of images, naturally all without users’ consent. Over the past year, more than 600 law enforcement agencies have quietly started using Clearview AI to help solve shoplifting, identity theft, credit card fraud, murder, and child sexual exploitation cases. Somewhat unsurprisingly, but also sadly, federal and state law enforcement officers had no clue how the platform actually produced the results that helped crack the cases it was used for. Kashmir Hill, a privacy reporter for the NYT, conducted a months-long investigation into the company and finally released a blockbuster-worthy article on the company and its associates.

Hoan Ton-That, the founder of Clearview AI, claims the company only uses publicly available images, and that if you change your privacy settings on Facebook so that your photos can’t be indexed by search engines, you won’t be in the database. But for those whose images have already been scraped, it’s too late.

“It’s creepy what they’re doing, but there will be many more of these companies. There is no monopoly on math.” — Al Gidari, privacy professor at Stanford Law School.

Perhaps the most concerning question the article raises is whether this type of facial recognition technology will ever get into the hands of the public. Now that the taboo has been broken, any number of companies could be building such a tool, and the ability to walk down the street anonymously could be gone for good. As Prof. Gidari puts it, “Absent a strong federal privacy law, we’re all screwed.”

EU Commission Considers Five-Year Facial Recognition Ban

Concerns over mass surveillance and the potential risks of artificial intelligence have led the EU Commission to consider a temporary ban on facial recognition of three to five years, with the final version of the proposal due to be published later this month. Under the General Data Protection Regulation (GDPR), EU citizens have the right “not to be subject of a decision based solely on automated processing, including profiling.” The draft regulation would prohibit the use of facial recognition by private or public actors in public spaces for a fixed period, during which the EU Commission plans to develop methodologies for assessing the impact and managing the risks of rolling out such technology in the public and private sectors.

Civil rights and privacy activists across Europe and the US have voiced their concerns over how quickly both private industries and law enforcement agencies have adopted the technology.

“Sensible regulation must also take a proportionate approach, balancing potential harms with social opportunities. This is especially true in areas that are high risk and high value.” — Sundar Pichai, CEO of Alphabet and Google.

The Surveillance State Grows While Data Brokers Profit

In his opinion piece for the NYT, Bruce Schneier argues that facial recognition is “just one identification technology among many.” Although facial recognition dominates the data privacy conversation in legislatures and the media, Schneier claims that data identification, correlation, and discrimination play more significant roles in the perils of being tracked today.

“The problem is that we are being identified without our knowledge or consent, and society needs rules about when that is permissible.” — Bruce Schneier, fellow at the Harvard Kennedy School.

Many emerging technologies, such as laser-based systems that can identify heartbeats or gait, or cameras that can read fingerprints and iris patterns, are being used to track and identify people. Further identifiers, such as phone numbers, credit card information, license plates, and MAC addresses, are cross-matched and correlated across different points of data collection, unbeknownst to the individual being tracked.

This build-up of incessant data stalking is contributing to the creation of a surveillance state that funds an entire industry of data brokers. Data about our purchases, lifestyle, income, ethnicity, profession, and interests provides immense riches for data brokers, who aggregate and sell it without our knowledge or consent and are in no way held accountable.

The CCPA includes an amendment that required data brokers to register with the California Attorney General in a public database by January 31st, 2020. Two weeks after the deadline, only 57 data brokers had registered, against the California AG’s estimate of roughly 1,000. As an attempt to rein in an entire industry that continues to operate under a cloak of invisibility, the amendment does little more than apply a band-aid to a bullet wound. To tackle the harmful practices of data brokers properly, we need nationwide regulations that dramatically curb data collection and protect citizens’ data from both private and governmental surveillance.

What I’m Reading:

Best,

Serafin

Originally published at https://datawallet.com.
