What the technology sector could learn from anthropology and its work with people, society and data
● Digital connectivity initiatives should take anthropology seriously — social and cultural factors have a tremendous impact on the extent to which technology may be a driver of social impact and digital equality.
● Digital connectivity and data mediate culture, systems and life today. Failing to take into account the importance of “small data” in a world of big data risks boxing people into categories of belonging which inaccurately represent their lives, hopes, fears and desires in this world.
● Technology companies should learn from anthropology’s history and adopt approaches to understanding the impact of innovation and policy on social and cultural life.
Anthropology is the study of culture. Anthropologists study culture to better understand how it shapes people’s identities, actions, beliefs and values — and how people, in turn, shape culture. Through anthropology’s primary research method — ethnography — anthropologists study culture by participating in and observing people’s everyday experiences, and by analysing how larger political, economic and social phenomena affect people’s lives in local contexts.
As digital technologies seep into our lives, disrupting political, economic and social systems, we’re at a vital time in history where the technology sector needs anthropology to reflect on what it means to be human in a digital age. Here’s why.
A short history of anthropology, its data and its decision-making power
Although current anthropology is grounded in a deep appreciation for human diversity and cultural preservation, this was not always the case. Anthropology was a tool of colonialism, applied to develop theories of cultural superiority and evolution. Oxford University’s Edward Tylor (the founder of cultural anthropology) introduced a “theory” of cultural evolution in the late 1800s which maintained that societies pass through ‘stages of development’: from savagery to barbarism to civilisation. It was in this context that Western anthropologists studied “Other” people, an encounter based on the position of the anthropologist as an unchallenged authority. In their lectures and writings, communities were presented as homogenous and ahistorical, “incarcerated in place and in modes of thought” (Abu-Lughod 2005). Representations of communities went uncontested, as communities had no voice in the matter. Anthropological “knowledge” was often politically motivated, used by colonial regimes to upend local cultures and peoples based on contrived hierarchies of human progress and the mission towards ‘modernisation’. The creation of “modernity” through the systemisation of Western governance, economic and social orders served to discredit local ways of knowing and being, thereby facilitating colonial regimes’ economic, political and social ‘mechanisms of control’.
Anthropology went through a period of vicious self-criticism in the 1960s, in the wake of independence and postcolonial movements, when anthropologists were forced to come to terms with the political implications of their work. Given the history of the discipline and the political and epistemological crisis it found itself in, there was a period of self-doubt in which the best bet seemed to be to scrap it and make do with the other social sciences. Indeed, as Diane Lewis documents in her paper on Anthropology and Colonialism (1973), “the fieldworker….may be forced to pose as an economist or sociologist in order to gain acceptance…”
But anthropology went through a transformation in the 1980s, driven in large part by those who proposed doing away with anthropology’s objectivity and authoritative claims about non-Western cultures (read: dehumanisation). The historic Writing Culture (1986) publication presented new approaches to understanding people and culture. It proposed (among other things) that:
- research participants become an authoritative voice determining how their communities are represented and historically placed and understood.
- anthropologists insert themselves into the field of study through reflexive analysis, involving a deep questioning of the researcher’s social bias based on location, identity, power, and how these factors influence research insights and analysis.
- anthropologists study their home communities through a critical questioning of the “personal as political”, attempting to “make the familiar strange and the strange familiar”.
Anthropology today is a positive reincarnation of its past, continuing to be shaped by the changing nature of the world (in the context of globalisation, digital media, mobility and migration). Anthropology’s reincarnation was, in a sense, a preemptive response to Chimamanda Adichie’s warning of “the danger of a single story”, which remains ever so relevant to cross-cultural representation and understanding.
A lesson in the history of anthropology matters to anyone involved in technology design, development or policy because of the huge impact that the tech sector has on society, knowledge and culture today. The internet is a critical architecture of public space, mediating access to information, to the economy, to social capital, and to public life. Although this year we’ve reached a point in history where 50% of the world is connected, huge digital divides persist, both across countries and within. The increasing digitisation of information and services means that those who are offline are oftentimes excluded from participating in economic, political and cultural life. Being excluded from digital spaces significantly limits life choices.
The Web Foundation’s Women’s Rights Online (WRO) study found that despite increasing access to mobile phones, women living in poor urban areas in 10 countries across Africa, Asia and Latin America are still 50% less likely to use the Internet than men in the same communities, and 30–50% less likely to use the Internet for economic and political empowerment. This study made important inroads into understanding barriers to Internet adoption and use, and provided policymakers with key recommendations to close gaps. Building from this study, we now need to delve into the more nuanced contextual factors which have profound effects not only on access, but on the spaces created around technology that mediate power and belonging in the digital age. For example, several villages in India have banned unmarried women from owning mobile phones on the grounds that they are “immoral”. In Uganda, women interviewed in the WRO study said that they are often frowned upon for using internet cafes, which are male-dominated spaces. In Indonesia, content about topics like sexual and reproductive health and LGBT issues is censored as allegedly “harmful” content in line with the country’s anti-pornography law.
Technology is a social tool that requires understanding of social and cultural factors for it to be a driver of equality. Failing to incorporate an anthropological perspective into tech design, development and policy risks increasing social inequalities driven by digital exclusion. It also makes it more likely that your product or service will fail.
Next, consider that internet and social media platforms are driven by algorithms, artificial intelligence and automated decision-making processes. These are created and managed by Silicon Valley corporations and a sector that is primarily male, white and elite. There is a widespread belief that algorithms make decision-making processes and the information you access online more objective, personalised and efficient, but this is not always the case. Internet systems, protocols and algorithms are designed by people who themselves hold social beliefs and biases, which often become unconsciously embedded into those systems. Algorithms are not inherently objective and often discriminate based on how they learn about the world through existing data (read: big data), which is based primarily on male perspectives. This led Microsoft’s Twitter chatbot Tay to turn into a misogynist within days; it led Apple’s Siri to go blank when asked questions about sexual assault and domestic abuse; and it led Google’s online advertising system to show ads for high-paying jobs to men more frequently than to women.
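The mechanism behind examples like the job-ads case can be sketched in a few lines: a system that “learns” only from historical frequencies will faithfully reproduce whatever skew exists in its training data. The data, group labels and ad names below are invented purely for illustration and are not drawn from any real advertising system.

```python
from collections import Counter

# Hypothetical historical log of which job ad was shown to whom.
# The skew (men mostly saw the high-paying ad) is baked into the data.
history = [
    ("male", "exec_job_ad"), ("male", "exec_job_ad"),
    ("male", "exec_job_ad"), ("male", "retail_job_ad"),
    ("female", "retail_job_ad"), ("female", "retail_job_ad"),
    ("female", "retail_job_ad"), ("female", "exec_job_ad"),
]

def naive_targeting_model(log):
    """For each group, 'learn' the ad shown most often in the past."""
    counts = {}
    for group, ad in log:
        counts.setdefault(group, Counter())[ad] += 1
    # Pick each group's most frequent ad — no notion of fairness at all.
    return {group: c.most_common(1)[0][0] for group, c in counts.items()}

model = naive_targeting_model(history)
print(model)  # {'male': 'exec_job_ad', 'female': 'retail_job_ad'}
```

No one wrote a rule saying “show women the lower-paying ad”; the discrimination emerges entirely from optimising against biased historical data, which is why auditing training data matters as much as auditing code.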
Large technology corporations play a tremendous role in facilitating people’s informational, communicative and associational lives, as well as driving the political economy. They are coming under increasing criticism for a lack of transparency in their decision-making processes, weak protection of online privacy, and their influence on the political economy (e.g. the rise of fake news, which played a major role in the 2016 US presidential election, and the rise of the platform economy driven by Uber and AirBnb).
Tech company business models and profitability are sustained by the use of data footprints, which are used to categorise us (people) into boxes of gender, race, income and interest groups. We are then targeted with ads telling us who we should be, how we should think and the choices we should make — fuelled by assumption and driven by bias. As a result of categorising humans into fixed boxes and ‘indicators of being’, technology companies have created your “second self” through your data. We understand very little about this system “despite the fact that we, as users, are providing most of the fuel — for free”, as warned by Share Lab.
This invites ethical concerns around several issues: personal data as currency in ‘mediascapes’, driven by the manipulation of consent; and the use of big data and artificial intelligence to fuel knowledge production based on limited contextualisation. As Giulio Quaggiotto notes in his blog on Big data meets thick data, “Without proper handling and contextualisation, big data risks becoming deep fried data.” Hence, “the era of big data needs even more qualitative, granular knowledge of local contexts.”
Information and communication technologies are driving the “modernisation” of governance systems (e-government), economies (platform economy) and of social norms (attention economy), and they are doing so with much authority. Some have termed this the tyranny of data. This builds from William Easterly’s view that global development is based on a “tyranny of experts”, whose technocratic approaches “reduce multidimensional social phenomena such as poverty or justice into a set of technical solutions that do not take into account either the political systems in which they operate or the rights of intended beneficiaries” (3).
Many have termed these new digital developments “the Fourth Industrial Revolution”, characteristic of modernity owing to their profound impact on the future of work, productivity, skills and education. However, the impact of this revolution will be equally profound — if not more so — on the language, beliefs, values and emotions of the future.
The situation is eerily similar to anthropology’s crisis of representation in the wake of independence and postcolonial movements. Perhaps we are at the start of another such revolutionary time, one driven by citizens’ demands for freedom of information, of association, of movement and of rights online; and for overcoming “the danger of a single story” characteristic of data-driven indicators of belonging.
Anthropology provides important approaches for the tech sector to better understand people, and the ethical and political implications of its work for people’s lives. As such, anthropologists should be part of the future of tech if companies are to become more representative of the world they seek to change. Here’s how:
- Government and technology policymakers should work with anthropologists to better understand the social and cultural norms influencing internet access, particularly for underrepresented groups, to come up with local, contextual, and relevant solutions to drive digital equality. This requires time and money invested into long term ethnographic studies, work with community leaders, and collaboration with people from all parts of society.
- Tech companies and other actors working on technology projects should hire anthropologists to undertake studies of the political, economic and social impacts of products, and the ethics of the political economic landscape and “social spaces” they produce. This goes beyond the framework of “user experience design”, though UX can be incorporated into methodological approaches and study aims.
- Anthropologists should also play a role in designing and facilitating ethics training and developing codes of conduct for the technology sector, with participation from communities of practice. Companies such as Facebook are attempting to remedy bias by providing unconscious-bias training to employees, which would benefit from an anthropological perspective.
Further reading
- Tricia Wang: Big Data Needs Thick Data
- Kate Crawford: The Hidden Biases in Big Data
- danah boyd and Kate Crawford: Six Provocations for Big Data
- Arturo Escobar et al: Welcome to Cyberia: Notes on the Anthropology of Cyberculture