The Clerics of Equilibrium (© Dimension Films, Blue Tulip Productions; fair use claimed, photo credit PopVerse)

Power and Control—The Rise of Covert Strategies

Alper Sarikaya
Oct 6, 2014

[this article is a response to readings on the class’ syllabus; week of 10/6]

Several very interesting philosophies are brought up this week, many of them in Michel de Certeau's work "Making Do: Uses and Tactics." The philosophies and the organization of societal actions that de Certeau introduces bring structure to the more applied pieces: how computers talk over the internet in general, the ethics of the Facebook research study, and how online companies experiment on the efficacy of their web services in their day-to-day operations.

It is important to emphasize some thoughts that de Certeau calls up from Aristotle's time. He notes that the procedures of the enemy (the powerful) tend to pervert the truth—"making the worse argument seem the better." To me, this is a key point, as feedback is an inherent part of discourse. Receiving negative feedback on a tactic that runs against the ideals of those who create the strategies (the powerful) will generally demoralize the weak (the not powerful), thereby perpetuating the status quo.

In our increasingly technocratic world, de Certeau notes the following:

Consumers are transformed into immigrants. The system in which they move about is too vast to be able to fix them in one place, but too constraining for them to ever be able to escape from it and go into exile elsewhere. There is no longer an elsewhere.
Michel de Certeau (“Making Do: Uses and Tactics”, 1984, pg. 40)

de Certeau notes that in a world where communication is plentiful, the populace will be bombarded by various strategies from corporations, governments, and educational institutions, all trying either to impose order or to organize individuals toward a particular action. It is up to the individual to navigate this complex environment and decide their own actions. This is precisely how de Certeau defines tactics: the ability to navigate, circumvent, and rebel against the strategies the powerful levy against them.

The studies done by Facebook, OkCupid, and other companies (noted by OkCupid's co-founder Christian Rudder on On the Media) very closely follow this same line of control, except that alternative strategies are tested on users in real time, without their explicit or informed consent.

Cartoon via Cagle.com (Arend van Dam)

Christian Rudder argues against the concept of 'informed consent' in the On the Media interview, saying that he himself never knew what he was being tested for, and often did not understand the experiments. He goes on to say that a 'laboratory experiment' does not capture the essence of (specifically) love, and that since laboratory experiments require the approval of the participant, they are inherently different from real-world experiences.

These sorts of comments are very alarming. With the widespread adoption of IRB protocols in the United States for research concerning human subjects, informed consent entails a very specific set of rights available to the participant, most notably an explanation of the purpose of the study (usually after the experiment) and access to the study results once analysis is complete. Implicit in informed consent is the voluntary nature of participation; simply taking part in an activity without explicit consent runs contrary to the ideals of IRB-designed protocols. A critical component of the IRB is a balancing of risks to subjects against the potential benefits to society, along with a fair distribution of those risks and benefits among eligible participants.

With the Facebook and OkCupid studies (which may still be ongoing), none of these guarantees are being made to users. Though the end-user license agreements implicitly accepted by users when signing up for the service remark that data will be used to improve the service, there is very little language stating that users will be part of A/B experiments that test alternative displays and organizations of information and measure (via telemetry) their effects on users. Part of this stealthy strategy is most likely due to the invisible nature of updates to online software; users are not aware that the software or algorithms have changed unless there is more than a just-noticeable difference in the interface.

Software companies have found (implicitly or explicitly) that the need for the site to work effectively for the company and its users outweighs the negative effects of validation experiments on a few users. It is simply not cost-effective to create a dedicated experimentation platform when unsuspecting users can be used as test subjects simply by segmenting the sampled population—half use this version of the software, the other half use the newer version. Through this mechanism, these companies are effectively imposing their strategy on their users, quietly making small changes to achieve their goals (generally higher adoption) that often stray from the users' goals (e.g., making meaningful connections).
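To make the mechanism concrete, here is a minimal sketch of how such silent population segmentation could work; the function name, experiment label, and variant names are hypothetical illustrations, not any company's actual implementation. Hashing a user ID together with an experiment name gives a stable, roughly even split with no opt-in and no visible signal to the user:

```python
import hashlib

def assign_variant(user_id: str, experiment: str,
                   variants=("control", "treatment")) -> str:
    """Deterministically bucket a user into an experiment arm.

    The same user always lands in the same arm for a given experiment,
    and the split is roughly uniform across the population. Note that
    nothing here asks the user for consent or tells them which version
    they are seeing.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# Half the users silently see the old interface, half the new one.
variant = assign_variant("user-12345", "feed-ranking-v2")
```

The deterministic hash is what makes the experiment invisible: there is no stored flag a user could discover, and the telemetry pipeline can recompute each user's arm after the fact when analyzing the results.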

It is only very recently that users have even had the chance to try their hands at any tactics. So far, the only recourse individuals have against these software services is to quit.
