Drones, Privacy and Ethics

Jazmyn Stokes
Computers and Society @ Bucknell
May 1, 2020

By: Jazmyn Stokes and Jiewen Wu

Figure 1: Drones have eyes

Introduction

Drones and privacy do not normally go hand in hand. With the increased use of drones, both governmental and private, society's privacy concerns have grown tremendously. Privacy in America is already strained, marked by data breaches and leaks, so what is the particular concern with drones? As one answer puts it: “people don’t like being recorded. This is the difference on why people would rather have a helicopter flying over their house than a drone, even though drones are a great deal quieter than helicopters” (Maass).

Throughout this article, we will look at how drone usage would affect privacy in America. We will propose that the Office of Technology Assessment (OTA) approve a drone program that follows and tracks American citizens’ habits, academic achievements, and social media. Through this tracking, the government would be able to hand-pick individuals who demonstrate the highest loyalty and reliability. Not only does this raise privacy concerns, but it also raises concerns about gender and racial bias.

Case Analysis

Purpose

The purpose of using computing here is to allow the government to choose individuals who will have the greatest impact on the safety of the United States. The system would track every citizen to ensure that people sworn to protect our country are genuine. Hand-picking individuals based on surveillance would decrease the likelihood of corruption and extortion.

With technology advancing and nations such as China surveilling their citizens around the clock, surveillance plays a growing role throughout the world and in our own nation. Through this case study we can examine the potential benefits and risks of using drones on a large scale (McCrisken).

Stakeholders

Figure 2: Stakeholder

When looking at widespread surveillance and tracking, everyone in the nation, whether a citizen, an undocumented resident, or a visitor, becomes a stakeholder. Everyone inside the United States falls victim to a loss of privacy (Gotterbarn).

The government, for its part, is trying to produce employees and soldiers who are loyal, capable, and trustworthy. These employees also become part of the population being observed, so their stake is two-fold: they want what is best for the safety of civilians, but they dislike the constant scrutiny.

The companies creating the drones and the recording software are accountable for building an ethical application, one that upholds the safety and privacy of individuals without introducing bias. If the algorithm is built on biased data, if the drones’ surveillance footage can be leaked, or if the surrounding laws take away the right to decline the offer, then the system violates the ACM Code of Ethics and does a disservice to American citizens.

Benefits and Risks

Benefits: a decrease in corruption and police brutality due to constant surveillance, an increased sense of safety among Americans, and a decrease in crime and wrongful convictions.

Figure 3: Risks Outweigh Benefits

Risks: a decreased sense of privacy among Americans, a loss of individuality as people try to conform under constant surveillance, and fewer people voicing their concerns and opinions.

Ethical Challenges

Figure 4: Privacy has a key that can be unlocked

Possibility of a breach in confidentiality

Disregard for privacy

Discrimination against those who are seen as more expendable than others

Inability for individuals to revoke consent (Gotterbarn; Santa Clara University)

Ethical Obligations

Since this project would collect a great deal of detailed material as personal lives are filmed, the ethical obligations clearly include respecting the privacy of the individuals being tracked and not using the recorded information for any other purpose.

Also, since the recorded materials would be treated as candidates’ performance, scored and eventually affecting who is selected for important government positions, another ethical obligation is to keep the materials intact. Altering or modifying them would undermine the fairness of the project and defeat the purpose of detecting disqualified candidates.

Potential Disparate Impacts

As powerful as the drone system is, it still carries risks. If personal information starts to leak, the individuals being filmed would clearly be harmed as their privacy is invaded.

If outside organizations are able to hack the system and alter candidates’ materials for profit, the credibility of government agencies will suffer and their original purpose will go unfulfilled. Because such organizations could profit from hacking the system, its makers and investors are also affected, since they must defend it (Ünver).

Best/Worst Case Scenarios

Figure 5: Drone being Hacked

The best case scenario consists of four key components.

First, only a few authorized people would have access to the materials and information recorded by the drone system. The system would be kept secret from the public, so no personal materials would leak to the public or to other organizations.

Second, the materials collected by the drone system would effectively flag disqualified candidates for important government positions. There would be a clearly defined standard of which behaviors are considered disqualifying, and the drone system could detect and identify those behaviors reliably.

Third, the algorithms by which candidates are chosen would not be biased, and the recorded materials and the system’s results would form only part of the selection process. The selection process would thus be comprehensive: not determined entirely by the drone system, but merely taking its results into consideration.

Lastly, individuals would be able to turn down the position if they are not interested in it or do not want to be filmed by the drone system.

The worst case scenario would be the opposite.

Figure 6: Algorithm is Biased

First, personal information collected by these drones leaks when unauthorized persons or organizations hack the system for profit. The individuals being filmed by the drone system are harmed.

Second, once the materials are not secured, people and organizations begin to fake and alter the recordings, which defeats the entire point of the project. Government agencies then lose their credibility. Social media sites also begin selling changes that alter candidates’ performance in the algorithm, exploiting the lower classes of society.

Third, the algorithm used to choose or filter candidates is built on biased historical data. The system then becomes racially or gender biased, and even if it is secure and leaks no personal information, it fails to fulfill the purpose of fairly selecting qualified candidates.

Lastly, individuals are not notified before being filmed by the drone system, or they have no interest in the position but are filmed anyway. They have no way to refuse being filmed.

Discussion Questions

Figure 7: Questions to Ask

How do we decide who may use and analyze the information tracked by the drones? For privacy reasons, the information collected should be kept secret, but then how can the public review the government’s decisions? How can the process be made transparent to the public?

Even if the individual being tracked acts perfectly, how do we decide that he or she is a good potential candidate? What if his or her family uses drugs, and what about close friends? Does the drone system track these people too?

Are the individuals being tracked informed, and do they consent, before this happens? How long are they tracked for? If there is a set period of time, what stops them from simply acting like decent people during that period?

Solutions

To carry out the project in the most ethical way possible, its procedure should follow these guidelines:

Figure 8: Consent

First, send out a survey to all citizens to obtain their permission to be filmed by the drone system and thus have a chance to be selected for these positions. At the same time, keep the actual filming period secret so that candidates cannot prepare for it.

Second, implement an algorithm that pixelates bystanders during filming and strips candidates’ personal identities before the materials are shown to government reviewers, then use a key-pairing system to match the resulting scores back to each candidate after the review. In this way, even the reviewers never see the candidates’ real identities, and the privacy of bystanders is protected. A minimal sketch of the key-pairing step appears below.
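
The sketch below is a hypothetical illustration in Python of how such a key-pairing step could work; the names (PseudonymRegistry, assign_pseudonym, resolve_scores) are our own invention and not part of any existing system. The idea is simply that reviewers score opaque pseudonyms, while a separately held key table lets an authorized administrator map the scores back to candidates afterwards.

```python
# Hypothetical sketch: reviewers only ever see random pseudonyms, while a
# separately stored key table lets an authorized administrator map the final
# scores back to real candidates.
import secrets


class PseudonymRegistry:
    """Holds the pseudonym-to-candidate key table (kept off-limits to reviewers)."""

    def __init__(self):
        self._key_table = {}  # pseudonym -> candidate identity

    def assign_pseudonym(self, candidate_id: str) -> str:
        """Create an unlinkable pseudonym for a candidate and record the pairing."""
        pseudonym = secrets.token_hex(8)
        self._key_table[pseudonym] = candidate_id
        return pseudonym

    def resolve_scores(self, scores_by_pseudonym: dict) -> dict:
        """After review, match scores back to real candidates using the key table."""
        return {self._key_table[p]: s for p, s in scores_by_pseudonym.items()}


# Pseudonyms are assigned before any footage reaches reviewers.
registry = PseudonymRegistry()
pseudonyms = [registry.assign_pseudonym(c) for c in ["candidate_a", "candidate_b"]]

# Reviewers see and score only the pseudonyms, never the identities.
reviewer_scores = {p: 0.8 for p in pseudonyms}

# An authorized administrator matches the scores back to candidates afterwards.
print(registry.resolve_scores(reviewer_scores))
```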

Third, propose laws covering the fairness obligations of the reviewers and engineers on this project, so that it is illegal for these authorized people to alter the materials.

Fourth, keep this project as only a rough first pass of the selection, used solely to identify disqualified candidates, so that it remains just one part of the selection process, and make the other parts more transparent to the public.

Finally, delete the materials after analysis and scoring; this also increases confidentiality.

Conclusion

After a short analysis, we have concluded that constant drone surveillance would have negative impacts on society. It creates such a lack of privacy that Americans would lose their freedom; it also takes away autonomy, which sits at the tip of Maslow’s hierarchy of needs. These are social problems that outweigh the benefits of the proposal.

References

Gotterbarn, D., & Brinkman, B. (2018). Using the Code: Case Studies to help guide computing professionals in how to apply the Code to various real-world situations. Retrieved from https://www.acm.org/code-of-ethics/case-studies

Ünver, H. (2018). (Rep.). Centre for Economics and Foreign Policy Studies. Retrieved April 7, 2020, from www.jstor.org/stable/resrep17009

McCrisken, T. (2018). Eyes and Ears in the Sky — Drones and Mass Surveillance. In Lidberg J. & Muller D. (Eds.), In the Name of Security — Secrecy, Surveillance and Journalism (pp. 139–158). London; New York, NY: Anthem Press. Retrieved April 7, 2020, from www.jstor.org/stable/j.ctt22rbjhf.11

Santa Clara University. (n.d.). Overview of Ethics in Tech Practice. Retrieved April 7, 2020, from https://www.scu.edu/ethics-in-technology-practice/overview-of-ethics-in-tech-practice/

Maass, D., & Eckersley, P. (n.d.). Surveillance Drones. Retrieved from https://www.eff.org/issues/surveillance-drones
