Black Box: Self-Surveillance and Societal Control

Article by Emphour

“The road to hell is paved with good intentions”

Merely seeing depictions of watching eyes is enough to influence humans to modify their behavior to conform to expected norms. One experiment conducted in 2012 found that stylized eye images placed on donation buckets in supermarkets increased donations by a staggering 48% compared to controls without eye images.[1] This watching-eyes effect illustrates how easily people can be manipulated and controlled, specifically through the feeling of being monitored.[2] As of July 2024, newly registered cars in the EU are required to have a "black box" installed. The Event Data Recorder ("EDR") is, as a small technical device, not a big deal in and of itself, but it is part of a wider trend in technological society that increasingly restricts and controls people's freedom, encroaches on their privacy, and tries to mold human behavior to fit the confines of the technological system.

Let's first establish the bare facts. The EDR constantly receives information from various sensors in the vehicle but does not store it permanently, instead constantly "forgetting" everything. Only in the event of an accident is a short window of time saved. That information then remains in a memory permanently installed in the vehicle; it can be read out only by experts via a physical interface and is not sent wirelessly to any servers. So nobody sits in front of screens all day, randomly looking at who is driving where and how fast. In contrast to the "black boxes" in airplanes (specifically the cockpit voice recorder, or "CVR"), no sound is recorded. The EDR therefore promises to resolve disputes in the event of an accident. If you haven't done anything wrong, you have nothing to fear. In the future, the EDR will be used after accidents to identify the guilty party.

Nevertheless, this makes the EDR a form of behavioral control, because human behavior can be steered without direct coercion. Whether people are actually being observed is not so important; as hinted above, what counts is that they feel observed. This is known, and may well be deliberate. German courts have ruled that even a dummy camera aimed at a neighbor's property constitutes an infringement of personal rights, as does a camera that is not aimed at the neighbor's property but could be electronically panned toward it. Both create surveillance pressure.[3] The EDR belongs to a series of techniques that make up the modern democratic system as a whole, techniques that ideally work without the recorded data ever needing to be evaluated. This is very similar to the "chilling effect":

“[W]hen people are aware of the digital traces they leave and start thinking about their potential consequences, they become wary of what they say and do. This is called the ‘chilling effect’ and can result in self-censorship, self-restraint, and silence. People anticipate what the consequences of their behaviours may be and behave more carefully when there is no certain way to assess what will be considered controversial or disruptive, and what the legal and extra-legal consequences may be.”[4] 

Of course, the average citizen once again has no way of checking for themselves what the new device in their vehicle is actually doing. So they have to trust their government. The current governments in Europe may still be relatively humane, but they are also under attack from several directions at once. The future of democracy as we currently know it is very uncertain. In China, for example, surveillance is on a completely different level: one surveillance camera for every two citizens, widespread facial recognition technology,[5] dehumanizing experiments like having students wear brainwave-measuring headbands.[6] And on top of all that, a widespread defeatist mindset about the whole situation.[7] Even if it were possible now to prevent the introduction of new surveillance technology, the technology is already out there, waiting. With a change of circumstances, it could be reintroduced at any time, as long as the infrastructure that enables it continues to exist.[8]

It is well known that constant surveillance has negative psychological effects. But even if surveillance could be calibrated to just the right level so that no psychological problems arise, it would still have to be rejected for its offense to human dignity. It is demeaning to live in a world where one could be spied on at any time for any reason, spared only by the beliefs of people outside one's direct control. And it is a form of benign servitude: you do not have real freedom if anyone else, especially large impersonal organizations outside your control, has power over you, no matter how tolerantly, benevolently, or humanely that power is exercised.

Regardless of this technology's specific use, its very existence will influence and control human behavior: if not directly and continuously, then indirectly, out of fear of what the technology will mean once its stated use case arrives, namely a traffic accident. Stalkers and nosy neighbors are rightly put in their place by the rule of law, but when large, anonymous organizations do the same thing, it is suddenly pushed as being "for the benefit of humanity"?

Whether the EDR can bring about more pro-social behavior in particularly reckless drivers remains to be seen. What is almost assured is that people who are by nature particularly susceptible to this form of psychological control will be influenced even more intensively to conform, with all the negative consequences that brings.

So what is to be done? Anyone who has ever tried to bring about political change through a protest or something similar knows how hopeless the chances are when working against the current trend. The issue of surveillance is nothing new. After 9/11, the Patriot Act gave the go-ahead for extensive wiretapping. Edward Snowden's leaks brought the surveillance measures into the public eye, and they were widely discussed. Yet nothing has changed to this day, except that it is now an accepted fact that we no longer have any privacy. The real problem is a system that creates the need for surveillance and control. Within such a system, driving behavior must be strictly regulated. In the US alone, 42,939 people died in traffic in 2021, enough to fill an average professional baseball stadium. And that's just the deaths. This is not only a terrible tragedy for those affected and their families but also a major disruption to the smooth functioning of the system. Because of the technology, then, the monitoring and control of driving behavior aimed at reducing or eliminating traffic accidents becomes not only a moral imperative but a purely technical necessity.[9]

Once again, we see that any technology brings with it unavoidable consequences and requires people to adapt to it, not the other way around. Cars and other technologies are generally seen as enabling greater freedom, while in reality they demand ever more restriction. In the case of the EDR, democracy steps in and enforces the necessary high level of social discipline, relying on sophisticated means of psychological control instead of old-fashioned physical coercion.

___________

NOTES:

[1] Powell, K.L., Roberts, G. and Nettle, D. (2012), Eye Images Increase Charitable Donations: Evidence From an Opportunistic Field Experiment in a Supermarket. Ethology, 118: 1096-1101. https://doi.org/10.1111/eth.12011

[2] “In our meta-analysis of 15 experiments from 13 research papers we report a reduction in the risk of antisocial behaviour of 35% when eye cues are present.” Keith Dear, Kevin Dutton, Elaine Fox, Do ‘watching eyes’ influence antisocial behavior? A systematic review & meta-analysis, Evolution and Human Behavior, Volume 40, Issue 3, 2019, ISSN 1090-5138, https://doi.org/10.1016/j.evolhumbehav.2019.01.006. (https://www.sciencedirect.com/science/article/pii/S1090513817303264)

[3] See "Kameraattrappe verletzt Persönlichkeitsrecht" ("Dummy camera violates personality rights"), Legal Tribune Online, 18.09.2019; "Schwenkbare Kameras sind unzulässig" ("Pannable cameras are impermissible"), Legal Tribune Online, 03.06.2024; LG Koblenz, Beschl. v. 05.09.2019, Az. 13 S 17/19; and AG Gelnhausen, Urt. v. 04.03.2024, Az. 52 C 76/24.

[4] Ariane Ollier-Malaterre, Living with Digital Surveillance in China, Routledge, 2024, page 29.

[5] https://techwireasia.com/2023/08/facial-recognition-tech-in-china-will-soon-be-governed/

[6] https://thechinaproject.com/2019/04/05/chinese-parents-want-students-to-wear-dystopian-brainwave-detecting-headbands/

[7] Ariane Ollier-Malaterre, Living with Digital Surveillance in China, Routledge, 2024, part II-IV.

[8] Until now, data retention without cause was only possible if it served to combat serious crime. The European Court of Justice relaxed this restriction: IP addresses can now be stored to combat any crime, for example copyright infringement. See European Court of Justice, 30.04.2024, Case C-470/21.

[9] The reader may be thinking here that a "chaotic" system, as seen in India for example, also works. Indian traffic may look to Westerners as if it regulates itself, but in fact the number of road fatalities (per 100,000 inhabitants) is many times higher than in Europe. See: Global Status Report on Road Safety 2023. Geneva: World Health Organization; 2023. Licence: CC BY-NC-SA 3.0 IGO, page 9.

 

Copyright © 2024 by Wilderness Front LLC. All Rights Reserved.
