Data Digest № 016

Serafin Lion Engel
Aug 19 · 8 min read

Hi there, and welcome to the 16th edition of the Data Digest, where I offer a weekly summary of the most important happenings in the data industry. This week in review: digital phenotyping of your online data, surveys reveal that people are becoming more reluctant to share personal data, controversies over Section 230 continue, Facebook tries to dodge the FTC while secretly listening to your calls, a former FTC director's call on Congress to enact data privacy regulation, Twitter's role in the Hong Kong protests, and a huge data breach in biometric security. Enjoy!

All Your Data Is Health Data

Digital phenotyping, a term coined by Harvard T.H. Chan School of Public Health, assesses people’s well-being based on their interactions with digital devices. Dr. Mona Sobhani told the NYT, “If we were ever to rule that all the data they collect on us is, ultimately, health data and that we have a right to it, then they would need to hand it over by law.” Big Tech companies — Google, Facebook, Amazon and others — currently have a remarkable amount of access and control over our personal data, and as Sobhani states, “all our data is health data.”

For instance, companies often try to gauge whether someone is pregnant by tracking their shopping behaviors. As technology encroaches deeper into our lives and data analysis gets better, more brazen researchers and organizations will try to glean deeper insights. In such a world, Sobhani argues, even some of our most trivial data — the way our eyes move in a video clip — could be thought of as health data.

Medical devices classified by the Food and Drug Administration are subject to the Health Insurance Portability and Accountability Act. Tech devices are not included in this scope. Sobhani says, "[w]ith health data, Hipaa grants you rights to have an easy copy of it. And if tech companies are mining my data for health insights, why don't I have access to it?" It's an important question, and one worth pressing: we should demand not only access to this data but ownership over it, since it is, after all, our most confidential and valuable information.

Research suggests that even early Parkinson's motor symptoms can be detected from typing patterns on keyboards. Studies show that language in social media posts and Facebook likes can accurately predict depressive episodes. Advertisers, health providers and brokers buy such data to predict our behaviours and target us, perhaps at our most vulnerable moments. It's a compelling argument that, if made in a court of law, could extend far-reaching data rights to internet users even without state legislation such as the upcoming California Consumer Privacy Act (CCPA).

Survey Reveals People Are More Reluctant To Share Personal Data

A recent study by the industry group Advertising Research Foundation analyzed consumers' willingness to share 'basic' data with websites. Compared with the same survey conducted a year earlier, users are now less likely to share personal information. The biggest drops in respondents' willingness to share their data from 2018 to 2019 were seen for their home address (-10%), spouse's first and last name (-8%), personal email address (-7%), and first and last names (-6%). Marketers are being advised not only to focus on regulatory compliance, but also to explain why and how personal data is collected in the first place. "I think the industry basically really needs to communicate the benefits to the consumer of more relevant advertising," said ARF Chief Research Officer Paul Donato in an interview with Forbes. "If that becomes one of the mechanisms that people elect into sharing their data." LOL!

Controversies Over Section 230 Continue

Section 230 of the Communications Decency Act of 1996, aptly described in the title of Jeff Kosseff's book as "The 26 Words That Created The Internet", states that internet companies are not legally responsible for the content they host if it was published there by someone else. Kosseff recently highlighted that "[s]omething tech companies have really gotten wrong — they've proceeded for years basically treating Section 230 like it's a right that's enshrined in the Constitution, and I think, frankly, some of the large platforms in particular have gotten incredibly arrogant."

In June, Senator Josh Hawley introduced the "Ending Support For Internet Censorship Act", which would "amend the Communications Decency Act to encourage providers of interactive computer services to provide content moderation that is politically neutral." The plan would mandate a Federal Trade Commission audit every two years, and employees seen to express a political bias would be disciplined or fired. The bill was fiercely criticized for being unclear and too difficult to enforce. Removing Section 230 immunity would upend the entire internet ecosystem and potentially stop moderation altogether. President Trump and other Republican policymakers have frequently voiced concerns that technology platforms hold a systemic bias against conservatives. However, reinterpreting a law that was intended to give tech companies broad freedom to handle content as they see fit could cause significant damage. As the controversies over misinformation and free speech evolve, the fight over whether there is too much or too little moderation continues.

Facebook Prepares For Antitrust Round Two

Facebook's recent $5 billion settlement with the FTC, a slap on the wrist that actually raised Facebook's share price, is approaching round two. Tensions are rising over whether the company can push its luck again. Facebook is currently implementing measures to make its existing acquisitions harder to break up, while refraining from acquiring new start-ups. It's clear the social media company is feeling the pressure and doing everything in its power to shake off federal regulators. Let's see if Washington can impose action that doesn't end up making CEO Mark Zuckerberg $1 billion richer by the end of the day.

Facebook’s Secretly Been Listening In

During the company's battle to appease the FTC, outside contractors uncomfortable with the ethical implications of their work revealed that Facebook had, until recently, been paying them to transcribe clips of audio from users of its services. The Irish Data Protection Commission said it would investigate the activity for possible violations of the EU's privacy rules. Facebook admitted transcribing users' audio with hundreds of paid contractors, and has since said it will no longer do so, following similar scrutiny of other tech giants such as Amazon and Google. In April 2018, Zuckerberg told U.S. Senator Gary Peters, "You're talking about this conspiracy theory that gets passed around that we listen to what's going on on your microphone and use that for ads. We don't do that." Contractors who felt their work was unethical, as users had no idea they were being listened to, have rightly come forward to expose these intimate invasions of privacy.

Data Breach In Biometric Security

Security researchers reported yet another data mishandling: a huge breach in BioStar 2, a web-based biometric smart lock platform. As a centralized application, BioStar 2 allows admins to control access to secure areas of facilities, manage user permissions, integrate with third-party security apps, and record activity logs. Integrated with biometric software, it uses facial recognition and fingerprinting technology to identify users. The research team was able to access over 1 million fingerprint records, as well as facial recognition information. Combined with the exposed personal details, usernames, and passwords, the potential for criminal activity and fraud is catastrophic. Unlike a password, fingerprint and facial recognition data cannot be changed once stolen. An affected individual will potentially be exposed for the rest of their life.

Former FTC Director Calls on Congress to Enact Data Privacy Regulation

Jessica Rich, former Director of the FTC's Bureau of Consumer Protection, took to the Op-Ed section of the New York Times to voice her concerns over the Commission's lack of legal authority to effectively protect consumer data privacy. Rich points out that all of the Commission's work, e.g. the fine against Facebook, relies solely on the Federal Trade Commission Act, which was passed more than 100 years ago, long before personal computers, the internet, social media or mobile phones were invented. The reason, as Rich puts it, is that Congress has repeatedly declined to enact a broad-based federal privacy and data security law setting strong privacy standards, codifying penalties for wrongdoers and allocating the staff and funds necessary to enforce the law nationwide. According to Rich, under the F.T.C. Act, the agency can't set normative privacy standards that all companies must follow, such as requiring them to post a privacy policy, limit the consumer data they collect and retain, refrain from certain uses of that data or give consumers choices about how their data is used. Rich states that "absent clearer authority to order conduct relief and obtain penalties for privacy violations, the F.T.C. constantly faces obstacles in court, leading it to rely, more often than many would like, on the greater certainty of negotiated settlements. A strong privacy mandate from Congress could set clear limits on how consumer data can be used, and give the F.T.C. greater power to enforce these limits in litigation."

Twitter Is Taking Money From Chinese Government Outlets to Show Ads Against Trending Hong Kong Protest Hashtags

Please note: this is not a news article and has not been validated by outside research. However, albeit sparse, the data presented seems to corroborate the claim made by the author. As the title suggests, Chinese government agencies are buying Twitter ads that are shown to people following trending hashtags from the Hong Kong protests. Or, put differently: Twitter is taking money from Chinese governmental agencies to spread propaganda. It is interesting, to say the least, that the practice of state-run media publishing ads to dominate the conversation about protests is seemingly permissible, even though it appears to be in direct violation of the Twitter Rules.

If you're interested in what we're doing at Datawallet, including our all-in-one CCPA compliance product that not only helps you stay ahead of data privacy regulation such as the CCPA but also helps you build profound trust with your customers, check out the Datawallet blog.
