The Datafication of the Contemporary World

Tatiana Feldman
Sep 25, 2021


“If you are not paying for the product, then you are the product” is an expression increasingly echoed in the world of big data and digital platforms. Anything we do within the ecosystem of digital practices is translated into data. Every action we take in our daily lives, both online and offline, can be recorded, processed, analysed, and even monetised. The digital footprints of our actions, including but not limited to our search history, online social interactions, GPS locations, transactions, video streaming preferences, and health metrics, amount to a continuing datafication of our daily life.

The growing number of free apps and platforms available to the general public gives us the tools to live more fulfilling and meaningful lives. But are these tools really free? What price do consumers pay for using free technology?

The answer is that today’s user data is key to unlocking many opportunities for marketers, researchers, venture capitalists, and government agencies. Our digital footprints enable organisations to learn almost everything about our individual and collective characteristics, including our socioeconomic status, interests, religious beliefs, political views, living habits, biometrics, and so on. Until around 2010, this information was largely inaccessible to organisations because of slow internet connections and the limitations of older technologies. Today, the pervasiveness of free platforms and our willingness to contribute our data enable profit-driven companies to target us with personalised advertising for goods and services algorithmically matched to our psychographics. On a global scale, our data allows organisations to make predictions and identify threats, trends, and demand for new products.

Our data is a commodity, often sold or shared without user consent to serve organisations’ profit-driven or political agendas. It is monetised daily by platforms that generate revenue from targeted advertising. Governments, too, have the right to use our data without consent to identify threats to national security and maintain regional stability.

The number of Customer Relationship Management (CRM) platforms grew from 150 in 2011 to 6,800 by 2018, as reported by Econsultancy in its Guide to Digital Marketing Tools. Many contemporary data systems promise organisations a 360-degree view of their customers, with enhanced features for data collection, customer profiling, demographic targeting, personalisation, and analytics.

The dark side of digital marketing is that it has become data-obsessed, demanding more from users than is legally acceptable. The way many platforms collect and share data raises concerns among the public and regulators, whose attention is now focused on data privacy law reform and compliance.

The recommendation engines of social platforms, driven by algorithms that feed on user data, have well-documented flaws that studies have linked to rising anxiety, depression, suicide, violence, and hate crime worldwide.

The widespread use of session cookies across sites and platforms leaves personal information vulnerable to theft when services are accessed from public computers, such as those in schools and public libraries. While a session cookie typically carries only a session identifier rather than personal data itself, it grants access to online accounts that hold names, addresses, and banking details. Session cookies live in the browser, and if a session is not ended after use on a shared machine, whether by logging out or closing the browser, it becomes a threat to data security.
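To illustrate how developers typically reduce this risk, here is a minimal sketch of hardening a session cookie in a hypothetical Flask application. The route names, the stored identifier, and the secret key are assumptions made for the example, not a prescription.

```python
from flask import Flask, session

app = Flask(__name__)
app.secret_key = "replace-with-a-real-secret"  # hypothetical key used to sign the session cookie

# Standard Flask settings that limit how the session cookie can leak.
app.config.update(
    SESSION_COOKIE_SECURE=True,    # send the cookie over HTTPS only
    SESSION_COOKIE_HTTPONLY=True,  # hide the cookie from JavaScript
    SESSION_COOKIE_SAMESITE="Lax", # restrict cross-site requests from carrying it
)

@app.route("/login")
def login():
    session["user_id"] = 42        # hypothetical identifier kept in the signed session cookie
    return "logged in"

@app.route("/logout")
def logout():
    session.clear()                # ends the session, so a shared machine keeps no usable cookie
    return "logged out"
```

Even with these flags set, logging out on shared computers matters as much as any cookie setting, since an active session is what an opportunistic user at a public terminal exploits.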

Our personal data often becomes the focus of quantitative research, enabling organisations to link an individual to a group with matching variables, learn about groups’ behaviour within society, and identify common characteristics related to genetics, socioeconomic status, biometrics, and so on. In quantitative studies, insights are only meaningful when they derive from analysing the entire cohort rather than each subject in isolation.

For instance, an average smartwatch continuously tracks its user’s health metrics, including heart rate, blood oxygen levels, and exercise frequency. This data is held by the smartwatch company, yet it could be extremely valuable to researchers wanting to establish a link between living habits and obesity. The data is therefore de-identified and sold or shared with researchers, and during analysis it is split into smaller datasets, each relating to specific characteristics such as users’ age bracket, gender, or medical conditions.
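To make the de-identification and splitting described above concrete, here is a minimal sketch using pandas. The column names, values, and age brackets are hypothetical, chosen only to mirror the smartwatch example.

```python
import pandas as pd

# Hypothetical smartwatch export: column names and values are illustrative only.
df = pd.DataFrame({
    "user_id":   ["u1", "u2", "u3", "u4"],
    "age":       [24, 37, 52, 61],
    "gender":    ["F", "M", "F", "M"],
    "avg_hr":    [62, 71, 78, 83],          # average resting heart rate (bpm)
    "steps_day": [9500, 4200, 6100, 3000],  # average daily step count
})

# De-identify: drop the direct identifier and coarsen exact age into brackets.
df = df.drop(columns=["user_id"])
df["age_bracket"] = pd.cut(df["age"], bins=[0, 30, 50, 120],
                           labels=["<30", "30-49", "50+"])

# Split and aggregate by characteristic, so conclusions describe groups, not individuals.
summary = df.groupby(["age_bracket", "gender"], observed=True)[["avg_hr", "steps_day"]].mean()
print(summary)
```

The point of the aggregation step is that the released figures describe cohorts (an age bracket and gender combination) rather than any single wearer, which is what makes the dataset useful to researchers while limiting re-identification.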

Consent to share such information is not always sought in a clear form. On most occasions, there is a clause on data sharing buried in the Terms of Use shown after the purchase. By then, the user has already invested hundreds of dollars in a smartwatch, and it is too late or too inconvenient to opt out and return the watch to the retailer.

Security of health data is paramount, as there is always a risk of it falling into the wrong hands and being used to inform unethical practices.

Data collected from non-human living entities and objects continues to drive scientific research, the discovery of new products, and predictions across all industry sectors, including healthcare, environmental, agricultural, and mining.

While there is no one universal Code of Ethics covering every sector of the economy, every professional field, including Medical, Accounting, Legal, and Media, is bound by codes that derive from its respective professional authorities. For instance, in the accounting industry, a Chartered Accountant’s professional actions must align with the CA Code of Ethics issued by The Institute of Chartered Accountants (CA). Likewise, in mining, a member of The Australasian Institute of Mining and Metallurgy (The AusIMM) must align their actions with The AusIMM Code of Ethics. However, while professionals affiliated with governing bodies can rely on guidance from these agencies, affiliation is not compulsory, which leaves many professionals outside the reach of such codes.

In addition to Codes of Ethics, legislation including the Privacy Act 1988, the Spam Act 2003, the Spam Regulations, and several other industry-specific instruments addresses user privacy and consent in Australia. Any organisation doing business in the EU must also comply with the General Data Protection Regulation (GDPR).

Due to the rise of ubiquitous computing, a term closely associated with the Internet of Things (IoT) and the presence of Artificial Intelligence (AI) in data technology and everyday life, the last two decades have seen the ethics of privacy, consent, acquisition, ownership, security, and secondary use of data become the focus of research and law reform across the IT and science sectors.

Australian data laws and the GDPR have much in common and provide a framework for ethical considerations and decision making within data-driven environments. However, in day-to-day business operations, where executive decisions affect both the wellbeing of data subjects and the profits (or outcomes) derived from their data, ethical dilemmas are often resolved not only by adherence to codes but also by the moral agency of senior staff. While data stewards and custodians are experts in the technical field of data management, their skill sets are often not enough to drive executive decisions in circumstances where data-driven practices threaten the wellbeing of groups or individuals.

In such cases, the moral agency of the whole organisation is driven by a group of decision-makers, including executives, lawyers, financial controllers, scientists, and representatives from governing bodies.

Manufacturers, for instance, constantly face such dilemmas: avoiding strategies that contribute to deforestation, pollution, climate change, and habitat loss while still generating significant profits is a challenge many organisations grapple with today.

The essence of data ethics is finding solutions that maximise benefits and minimise harm. The ever-evolving digital landscape and the multifaceted nature of data science present many challenges to contemporary ethical thinkers, researchers, policymakers, and governing bodies, and continue to underscore the need for better processes, practices, and regulations worldwide.
