The Enlightenment has always aimed at liberating men from fear and establishing their sovereignty. Yet the fully enlightened earth radiates disaster triumphant.
— Horkheimer and Adorno, The Dialectic of Enlightenment Introduction
It’s 2019, and the conversation surrounding the ethics of policing has never been louder. At the same time, the line between private companies and the government keeps getting thinner. With that line closer to disappearing every day, I believe it is time we ask a question directly of the biggest technology companies of our time: “Is it ethical to supply the government with policing technologies?” Technology companies are the ones shaping our society at the moment, and thus the choices they make will have a major impact on our society for years to come.
Currently, technology companies that sell policing technology to the government face a conflict between the following moral principles (taken from Bernard Gert’s Universal Moral Values):
- Not depriving freedom
- Doing your duty
Before we go further, it must be noted that “policing” is not done only by law enforcement. To police is defined as “to regulate, control, or keep in order,” and by this definition the DHS, DOT, and IRS (just to name a few) are all government agencies that police. Being regulated, controlled, or kept in order directly conflicts with the moral principle “Not depriving freedom,” and that is why these private companies face this moral dilemma. The use of technology for policing purposes has made the deprivation of freedom far more pervasive in our lives.
Background Example
A recent Twitter thread between JetBlue and Twitter user @mackenzief is a perfect example, not just of how policing is done through technology, but of how private companies are aiding the government in depriving us of freedom.
In this thread, MacKenzie realizes her personal biometric data is being shared freely between JetBlue and the Department of Homeland Security. Biometric data should be treated sensitively because of the damage it can cause an individual if it is misused. The ethics of sharing sensitive data has been up for debate for some time now (see Facebook), but the consensus seems to be that private data should only be shared after consent is given. After receiving little help from JetBlue regarding information about the practice, MacKenzie found on her own that there is no opportunity for consent in JetBlue’s new boarding process.
MacKenzie’s story raises the question: how accountable are companies for the use of their products by “policing” agencies? However, the technology in this example is so new that we must use a hypothetical situation to examine its potential side effects. Imagine you are boarding a JetBlue flight using facial recognition instead of a boarding pass. But after being scanned, instead of being let onto your flight, you are detained by the TSA and then brought to your local police station because your face matches the description of someone who recently committed a crime. Don’t worry, you didn’t commit the crime, and after a quick interrogation and having your fingerprints taken, you’re free to go. Everything seems great until you remember that you missed your flight. I don’t need to tell you how big of an inconvenience this is… you were on your way to your sister’s wedding and the next flight isn’t until tomorrow. You are pissed, and understandably so… but who exactly is to blame for your current circumstances? In my mind, the police and JetBlue share the blame. Yes, the police put out a vague description of who they were looking for and they were the ones who questioned you, but none of that would have been possible without JetBlue taking your picture and then sharing it with the police. Just think about this; we’ll be coming back to it.
It is important to point out that in this instance JetBlue is not morally obligated to cooperate with the TSA. “Obeying the law” is on Bernard Gert’s list of moral values; however, there is no law stating that biometric scanners must be used at the airport. This is why we are examining the moral rule “Doing your duty.” It is our duty as members of a society to try to protect one another, and after 9/11 it is certainly the duty of airlines to do their due diligence when letting individuals onto planes. But how far do airlines have to go in order to be doing their duty? The question of what your duty is as a member of society is not new; in fact, it is one of the oldest questions society has been trying to answer. It was most prominently taken up in the Age of Enlightenment by thinkers such as Thomas Hobbes and Jean-Jacques Rousseau. Hobbes, Rousseau, and many others of that era worked on the idea of the Social Contract to answer the question of why and how citizens give up their rights in order to be governed (or “policed”). Social Contract Theory typically states that citizens give up some rights (such as privacy) to the state in order to gain other rights (such as protection) in return. However, there is no consensus as to how much privacy one must or should give up, or how much protection one gets in return. Taken to its extreme through Utilitarianism (which we see in many Western societies), the theoretical Social Contract involves individuals giving up any and all semblance of privacy in order to receive protection from the state (we will discuss this in more depth later on). When this happens, even though no choice is given, it can be considered citizens fully cooperating with the state in order to be policed.
Social contract theory was well known among the framers of the U.S. Constitution and even influenced the document itself. Amendments 4 and 5 of the Bill of Rights make it very clear that there are only a few specific circumstances in which Americans are obligated to aid the police. Thus, in this instance, JetBlue is not obligated by law to use this new boarding practice. When thinking about the ethics surrounding JetBlue’s use of this technology, we can look at the situation through a Contract Theory of Ethics paradigm. Under this paradigm, the right thing to do (ethically) is for the parties involved in the scenario to abide by the agreements they have made with each other. Again, this is a question we will discuss further.
JetBlue’s new boarding practice is a good example of private companies policing, or helping to police, for the government through technology because it brings to light how this happens right in front of Americans every day. However, when talking about this subject, there is a far more obvious and widespread example we all should know… the NSA’s PRISM program.
PRISM was the codename for a National Security Agency program that was leaked to The Guardian and The Washington Post by Edward Snowden in 2013 (I highly recommend watching the film Citizenfour to learn more about PRISM and Edward Snowden). In short, through PRISM the NSA “obtained direct access to the systems of Google, Facebook, Apple and other US internet giants” (The Guardian), through which it was able to “collect material including search history, the content of emails, file transfers and live chats.” The data obtained came from all of these services’ users, regardless of whether they were suspected of committing a crime. To decide whether this is an ethical form of policing, as in most cases involving the ethics of government interaction with citizens, we should apply a Social Contract Theory paradigm of ethical analysis.
Social Contract Theory proposes thinking about ethics in terms of agreements between individuals and society. Doing the right thing means abiding by the agreements that the members of a society have made. When considering the agreements between our society (America) and our government, we can always look at the American Constitution. Again, this is similar to the JetBlue situation. In the case of PRISM, many believe the U.S. government violated the Constitution and was therefore acting unethically under a Contract Theory of Ethics. On June 11th, 2013, the American Civil Liberties Union filed a lawsuit against the NSA, stating that it had violated Americans’ right to privacy. However, not everyone believes PRISM violated the Constitution. In 2013, when PRISM was revealed by Edward Snowden, Barack Obama was president of the United States. Obama was forced to defend PRISM on multiple occasions, but I think his general line of defense can be summarized by his statement:
“In the abstract, you can complain about Big Brother and how this is a potential program run amok, but when you actually look at the details, then I think we’ve struck the right balance,” — Barack Obama, 2013
PRISM and biometric boarding passes are both examples of how private technology is eliminating freedoms of the American people. PRISM is important to analyze because it is an example that has already happened. Biometric boarding passes are very new, and the discussion about the ethics of their implementation hasn’t even begun yet. Just as biometric boarding passes are doing now, PRISM pitted the moral rule to “not deprive freedom” against the moral rule to “do your duty.” Thus, we can use the discussion that surrounded the ethics of PRISM to extrapolate how the discussion surrounding biometric boarding passes will presumably go.
Subjectivism
Imagine a society in which half the population thought it was moral to kill and the other half did not… that could never work. Yet that is exactly what ethics are for: resolving moral conflicts when they arise (not only when societies turn murderous). Ethics typically need to be agreed upon for a society to function; it does our imaginary society no good if the murderers and non-murderers cannot agree on a common code of ethics. When two sides disagree on whether an action was or is ethical, the disagreement is often attributed to ethical subjectivism. The simplest form of ethical subjectivism holds that “when a person says that something is morally good or bad,” this means “that he or she approves of that thing, or disapproves of it, and nothing more” (Rachels). However, I do not believe the example we were previously discussing (differing views on whether PRISM was ethical) was a case of “Simple Subjectivism.”
President Obama believed the amount of privacy lost due to PRISM was allowed under the “Social Contract” the government has with its citizens, while the ACLU believed the amount of privacy lost was unethical under that same “Social Contract.” This seems to fall closer to Emotivism: it is evident that there is a “disagreement in attitude.” In a disagreement in attitude, two parties can share all the same beliefs regarding an action but have differing attitudes about it. Using the example of PRISM, Obama and the ACLU are both American, both believe the intentions of PRISM were to help American citizens, and both believe in the validity of the American Constitution; however, their attitudes regarding PRISM are different: Obama believes PRISM is ethical under the agreement the government has with the people, and the ACLU believes it is unethical. This is the epitome of how dangerous ethical subjectivism is: even under a contract theory of ethics, in which there is a contract stating how to be ethical, disagreements about what is ethical can still arise.
It is tempting to allow ethical subjectivism, and it is true that sometimes it can be wrong to tell someone that their decisions are unethical. However, for ethics to have any currency, they must be agreed upon, especially when actions are being committed by one person or party against another. As Penn State professor of philosophy David Agler points out, ethical subjectivism is particularly dangerous to weak or defenseless persons or parties. If moral right and wrong are based on individual beliefs, then a “purely ethical” individual would never have to respect or acknowledge any other person’s rights or ethical beliefs. Thus, the powerful could force their will onto the weak unopposed and still be ethical, as long as the powerful could justify to themselves that they were in fact being ethical. This can clearly be seen in our example of PRISM. The U.S. government is much more powerful than any individual or group, and thus it can force (and has forced) actions onto others with the justification that it found them ethical. You, as an individual, may find it unethical that all your calls and online data get collected by the NSA, but that doesn’t matter if we allow for subjectivism. Thus, we need to do our best to recognize subjectivism (as we just did) and try to work through it. Subjectivism is also evident in the discussion of the ethics of implementing biometric boarding passes.
Private Companies
Our focus to this point has mainly been on the government and its actions toward individuals; however, third parties were involved in both examples we have discussed so far. The third parties in these examples have all been private companies that provide tools to the government that aid in actions some have found unethical. When we previously examined the new boarding practice that JetBlue is implementing, we merely examined it through the paradigm of Contract Theory of Ethics. It is possible to disagree on whether these new implementations are ethical or not, as we found with subjectivism. But for the sake of this discussion, let’s say you did find the new practice unethical… which party would you say is acting unethically? As a quick reminder, we found that biometric data is being shared between the U.S. government and JetBlue without any possibility for an individual to consent to it. This could not have occurred without actions from both the airline and the government. We were able to use Contract Theory to analyze the individual’s ethical contract with the government and JetBlue’s ethical contract with the government; however, we left judgment of JetBlue’s actions up in the air. We found JetBlue is not ethically obligated to cooperate with government policing practices, but that does not eliminate the possibility for it to cooperate and still be ethical.
JetBlue has no ethical contract with the government or with individuals in this instance, so we must use a different paradigm of ethics to analyze its actions. Let’s analyze the situation through three of the most popular theories of ethics.
Virtue Ethics
Virtue Ethics states that what matters in being ethical is character. Thus, a moral action by the standards of Virtue Ethics is an action that would be taken by a virtuous person whose virtues lead to happiness and flourishing. In the example of the biometric boarding pass, JetBlue has a case to be made for being ethical under Virtue Ethics. The new boarding practice is intended to make flying easier for individuals: there is no longer a need to print a boarding pass and keep track of it. The practice also frees up gate agents, allowing them to walk about the boarding queue and assist passengers more easily.
However, the paradigm of Virtue Ethics often falls victim to ethical subjectivism. There is no single definition of a virtuous individual, and thus different practitioners of Virtue Ethics may have different standards of what it means to live virtuously. On the other hand, Virtue Ethics recognizes the important role that emotions play in living a moral life.
Utilitarianism
Utilitarianism states that the determinant of an ethical action is the amount of happiness and suffering it creates. Thus, a moral action maximizes happiness and minimizes suffering. This is often referred to as the principle of utility. However, there are two different ways to apply the principle of utility, and as a result there are two specific forms of Utilitarianism: Act and Rule. Act Utilitarianism applies the principle of utility to individual moral actions, while Rule Utilitarianism applies the principle of utility to universal moral rules.
If we apply Act Utilitarianism to biometric boarding passes, we only have to ask whether this one action produces more happiness or more suffering. It could be said that this one action provides more convenience for passengers than it causes suffering through the loss of privacy. If we apply Rule Utilitarianism to biometric boarding passes, we have to imagine a scenario where it becomes the rule to always give up personal privacy for personal convenience.
This rule would have the potential to eliminate any semblance of privacy an individual possessed. In a society under this rule, airlines would not be the only ones with biometric data. Every entity that currently issues tickets or needs to confirm that an individual is who they say they are would have biometric data, along with any other information it could acquire. Although this would create maximum convenience for individuals, it would also create a maximum loss of privacy. Individuals have more privacy to lose than convenience to gain, so it could be said that biometric boarding passes are unethical under Rule Utilitarianism.
Kantianism
Kantianism is a theory developed by the German philosopher Immanuel Kant (1724–1804). Kantianism emphasizes the principles behind an action rather than the results of the action. This is a very practical take on ethics, as it sets aside hypothetical results and focuses on actions and the circumstances surrounding them. One formulation of Kant’s Categorical Imperative is “that you always treat both yourself and other people as ends in themselves, and never only as a means to an end” (Michael Quinn, Ethics for the Information Age, 7th edition). So to understand whether biometric boarding passes are ethical under Kantianism, we must speculate as to why they were implemented. If the principle behind implementing biometric boarding passes is solely that JetBlue wants to make flying more convenient for its passengers, then its implementation of biometric boarding passes is ethical. However, if JetBlue saw this as a way to gain an advantage over other airlines and thus make more money, then the implementation would not be ethical, because passengers would be treated merely as a means to that end. It should be noted that under Kantianism, the same action can be ethical when done by one party and unethical when done by another, as long as the two parties had different principles behind their actions.
Do We Have To Decide If Actions Are Ethical?
There are many more ethical paradigms we could have used to analyze the implementation of biometric boarding passes, but that wouldn’t help us decide whether this situation is ethical. We can already see that this action can be either ethical or unethical depending on the theory of ethics applied (it is hard to find any situation that is always ethical or always unethical under every theory); however, as a society we must make a decision. By not making a decision, we are allowing this practice to continue, and thus we are making a decision to allow it. I picked this specific example because I believe it is the epitome of a current trend, a trend that could lead to the elimination of all privacy. The line between private companies and government agencies when it comes to who polices society is fading, and with it, the idea of individual privacy. This is due to how new technology can connect private companies and the government, and how this same technology can reveal so much more about individuals.

Horkheimer and Adorno wrote after WWII that “The Enlightenment has always aimed at liberating men from fear and establishing their sovereignty. Yet the fully enlightened earth radiates disaster triumphant.” (The Dialectic of Enlightenment, Introduction) They had watched technology progress to a point where it was possible for 62 million people to die in the six years it took for WWII to play out. Thus, it was clear to them that if society did not follow the correct ethical paradigm, things could get irreversibly bad in a relatively short amount of time. They saw that reality in the 1940s, but technology has been advancing at an exponential rate since then. Now a new technology can consume the world almost instantly, changing cultures and societal norms with it. For example, Forbes found that in 2014, Uber, which at the time was a relatively new technology just as biometric boarding passes are today, was launching in a new city every day. Thus, if we don’t decide now how willing we are to have technology police us as a society, soon it will be too late to have a choice.
Influential Circumstances
Before we decide whether these actions are ethical, we need to take into account some current influential circumstances. As mentioned earlier, practices like biometric boarding passes require a free flow of private information between the private sector and the government. Currently, private companies have a terrible track record of keeping private information private. It was revealed in March of 2018 that the political consulting firm Cambridge Analytica had improperly obtained data on as many as 87 million Facebook users, gaining access to private information from individual profiles. Also in 2018, KrebsOnSecurity, along with other researchers, found that
“Many Google Groups leak emails that should probably not be public but are nevertheless searchable on Google, including personal information such as passwords and financial data, and in many cases comprehensive lists of company employee names, addresses and emails.” — KrebsOnSecurity
Unfortunately, data breaches don’t just plague private companies. Earlier this year it was reported that 3 terabytes of unprotected data from the Oklahoma Securities Commission were exposed, revealing FBI investigations and actions dating back seven years.
This brings us to our next influential circumstance: based on past events, there is reason not to trust the government with your personal data. Similar to PRISM, there is a past government program that all American citizens should know about when it comes to how the government has used technology against its own citizens: COINTELPRO. COINTELPRO was an FBI program created under J. Edgar Hoover that used wiretaps, video surveillance, forged documents, and the dissemination of fake news to discredit, frame, and even kill U.S. citizens. The program may have ended in 1971, but the government’s attitude toward individual privacy when it comes to policing hasn’t changed. More than 35 years after COINTELPRO, the U.S. government was still initiating programs, such as PRISM, that have no regard for individual privacy.
One last circumstance we should take into consideration is that, at the moment, biometric identification, especially facial recognition, is not always that accurate. Facial recognition may work in scenarios such as the boarding queue, where passengers can stand in front of a camera until it recognizes them (and where there are human backup measures in place), but when tried in other scenarios it has fallen short. In April of 2019, “prominent artificial-intelligence researchers… signed a letter calling on Amazon to stop selling its facial-recognition technology to law enforcement agencies because it is biased against women and people of color.” More evidence of facial recognition technology’s inaccuracy comes from a trial run in New York City. According to the WSJ, a New York City MTA official wrote that the “initial period for the proof of concept testing at the RFK for facial recognition has been completed and failed with no faces (0%) being detected within acceptable parameters”.
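To make the idea of “acceptable parameters” concrete, here is a minimal, purely hypothetical sketch of how a confidence threshold might gate an automated face-match decision. The threshold value, the scoring stub, and every name below are my own illustrative assumptions; they do not describe JetBlue’s, Amazon’s, or the MTA’s actual systems.

```python
import random

# Hypothetical illustration only: an automated gate that boards a passenger
# solely when the matcher's confidence clears a threshold, and otherwise falls
# back to a human gate agent. The threshold and scoring logic are invented for
# this sketch and do not describe any real airline or vendor system.

ACCEPT_THRESHOLD = 0.99  # assumed "acceptable parameter" for auto-boarding


def match_confidence(live_scan_id: str, enrolled_id: str) -> float:
    """Stand-in for a vendor face-matching call; returns a score in [0, 1]."""
    # Simulated scoring: the same traveler scores high, different travelers score low.
    base = 0.995 if live_scan_id == enrolled_id else 0.40
    return min(1.0, max(0.0, base + random.uniform(-0.01, 0.01)))


def boarding_decision(live_scan_id: str, enrolled_id: str) -> str:
    """Auto-board only above the threshold; otherwise refer to a human agent."""
    score = match_confidence(live_scan_id, enrolled_id)
    return "board" if score >= ACCEPT_THRESHOLD else "refer to gate agent"


if __name__ == "__main__":
    print(boarding_decision("traveler_123", "traveler_123"))  # usually "board"
    print(boarding_decision("traveler_123", "traveler_456"))  # "refer to gate agent"
```

The point of the sketch is simply that an automated system lives or dies by where that threshold sits and what happens to the people who fall below it, which is exactly the kind of failure the MTA trial reported.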
What Do We Do If We Decide We Don’t Think This Is Ethical?
Given the influential circumstances we just discussed, I believe the implementation of biometric boarding passes is unethical. I base that judgment mainly on the paradigm of Rule Utilitarianism, because I think it captures how our society embraces technology better than any other ethical paradigm. However, just stating that a practice is unethical is never enough; we must decide how, as individuals, we can ensure the ethical choice is made. Although it doesn’t feel like it most of the time, there are things we can do as individuals to prevent private companies and the government from taking actions we deem unethical. At the smallest level, we can stop allowing ourselves to be victims of these unethical systems. By that I mean we can stop flying JetBlue or any other airline that uses biometric boarding passes. As a result, those companies will lose business and hopefully be convinced to change their practices. This method of resistance can be applied to many more scenarios: if you don’t want your private data being shared by Facebook or Google, stop using Facebook and Google. Unfortunately, that is just about the extent of what “regular” individuals can do to show private companies they think their business practices are unethical. However, individuals who are part of these large companies have a lot more power to create change.
Employees at some of the largest and most powerful technology companies are starting to fight against what they see as unethical business decisions by their employers. Last year, over 100 Microsoft employees posted a letter on their company’s internal message board protesting Microsoft’s sale of software services to ICE. In the letter, addressed to chief executive Satya Nadella, the employees stated that they “believe that Microsoft must take an ethical stand, and put children and families above profits”. Also in 2018, over 3,100 Google employees signed a letter “protesting the company’s involvement in a Pentagon program that uses artificial intelligence to interpret video imagery and could be used to improve the targeting of drone strikes” (New York Times).
This is a trend that needs to be maintained in order to prevent private technology companies from eliminating privacy from our lives entirely. It is also essential that individuals who are not a part of technology companies at least recognize when intrusive technology is being introduced into our society. As long as we all stay vigilant and are ready to voice our opinions, I have hope that privacy will have a place in future America.
Sources
I tried to hyperlink as much as possible as opposed to using parenthetical citations (I think that’s the most ethical way to cite a source online, since it should increase the source’s SEO). Citations are in order of appearance.
Starting quote: Horkheimer and Adorno, The Dialectic of Enlightenment Introduction
Bernard Gert’s Universal Moral Values: http://tarothermeneutics.com/ethics/gertsrules.html
JetBlue Twitter thread: https://twitter.com/mackenzief/status/1118614203051466762?ref_src=twsrc%5Etfw
More information on biometric boarding passes: https://www.extremetech.com/extreme/250214-jetblue-delta-biometric-scanners-may-replace-boarding-passes
Social Contract Theory: https://www.iep.utm.edu/soc-cont/
More on Social Contract Origins: Patrick Riley, The Social Contract and Its Critics, chapter 12 in The Cambridge History of Eighteenth-Century Political Thought, Eds. Mark Goldie and Robert Wokler, Vol 4 of The Cambridge History of Political Thought (Cambridge University Press, 2006), pp. 347–75.
PRISM — Washington Post: https://www.washingtonpost.com/wp-srv/special/politics/prism-collection-documents/?hpid=z1
PRISM — The Guardian: https://www.theguardian.com/world/2013/jun/06/us-tech-giants-nsa-data
Rachels on Subjectivism: James Rachels, “Subjectivism in Ethics,” Chapter 3, page 35
Prof. David Agler on Subjectivism: http://www.davidagler.com/teaching/ethics/Lecture_3_Ethics_Subjectivism.pdf
Tech growing at an exponential rate (Moore’s Law): https://www.intel.com/content/www/us/en/silicon-innovations/moores-law-technology.html
Uber growth: https://www.forbes.com/sites/ellenhuet/2014/12/11/ubers-global-expansion/#531bd122550a
Cambridge Analytica: https://www.nytimes.com/2018/03/17/us/politics/cambridge-analytica-trump-campaign.html?module=inline
Google Groups leaks: https://krebsonsecurity.com/2018/06/is-your-google-groups-leaking-data/
FBI leaks: https://www.upguard.com/breaches/rsync-oklahoma-securities-commission
COINTELPRO: https://www.archives.gov/research/african-americans/individuals/fred-hampton
AI researchers letter to Amazon: https://www.nytimes.com/2019/04/03/technology/amazon-facial-recognition-technology.html
NYC MTA failed facial recognition: https://www.wsj.com/articles/mtas-initial-foray-into-facial-recognition-at-high-speed-is-a-bust-11554642000
Microsoft protest ICE: https://www.nytimes.com/2018/06/19/technology/tech-companies-immigration-border.html
Google employees protest letter: https://www.nytimes.com/2018/04/04/technology/google-letter-ceo-pentagon-project.html