Deepfakes, Surveillance Capitalism and The Cyber Republic

by Collin Wynter

In this paper I shall focus on novel developments in social media, artificial intelligence and that ethereal realm called the internet. The topics are deepfakes, surveillance capitalism and the Cyber Republic. Deepfakes are artificially constructed audio and/or video that mimic real life. They are dangerous because they can spread misinformation, create mistrust and invade privacy, which in turn may call into question the validity of audio/visual communication. Next, surveillance capitalism will be explained. Five tech companies are intrinsically involved in gathering private information and profiting from its use: Facebook, Apple, Google, Microsoft and Amazon (two others not discussed here, Twitter and TikTok, should also be considered). They use artificial intelligence to help predict consumer behaviour. They wish to create smart cities and to determine people’s actions. Will social media be played out in real time on the streets? Perhaps smartphones will be used in face-to-face contact to determine whether the person we are talking to is exactly who they say they are. To friend or unfriend? That is the question. I will endeavour once again to conclude with some hope, drawing on ideas from the Cyber Republic: a society that integrates social media, artificial intelligence and the internet safely and securely, through blockchain and data property rights. As the artificial world of virtual reality expands, it is reflected in the actual economy. How should we, the users, maintain independence from the wires and gadgets that we have become so dependent on, while still benefiting from this virtual illusion of reality?

In Deepfakes (2020), Nina Schick calls this concept “synthetic media” (Schick 8). The technology is developed using artificial intelligence and deployed via social media, among other avenues. The danger of this technology progressing unabated is obvious: you can recreate anyone, alive, dead or never to have existed, and make them appear to voice statements or perform actions. These “deepfakes” could be used to incite violence, commit criminal acts, or manipulate scenarios to nefarious ends. Schick begins her argument with a story about a deepfake of President Obama talking like a street thug (Schick 7). The ramifications of anyone believing that Obama would voice racially based hate on the internet would send shudders throughout the free world. This technology may bring about an “infocalypse” (Schick 13): putting doubt into the minds of internet users and breaking down trust between participants may lead to an existential crisis. There is a special danger for law enforcement, since audio and video are purported to be among the best tools of the police, and some of the most reliable evidence. In the “Age of information” (Schick 9) we cannot expect society to deny the use of technology because it presents some risk. It is much better to make oneself aware of it (Schick 190); to learn how to defend against it by checking multiple trusted sources and using software to detect deepfakes (Schick 192); and to fight back by joining organizations such as the Deep Trust Alliance (Schick 201). Deepfakes call into question what is to be trusted online. The public must learn to trust itself, lest it fall into a recursive algorithm of ones and zeroes.
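Schick’s advice to check content against multiple trusted sources can be made concrete with a simple cryptographic check. The sketch below is mine, not Schick’s: it assumes a publisher distributes a SHA-256 hash alongside an authentic clip, so a viewer can confirm that their copy is byte-identical to the original. The file name and published hash are hypothetical.

```python
import hashlib

def sha256_of_file(path: str) -> str:
    """Compute the SHA-256 digest of a media file in streaming chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical: the hash a trusted publisher lists alongside the original clip.
PUBLISHED_HASH = "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"

if __name__ == "__main__":
    local_hash = sha256_of_file("speech_clip.mp4")  # hypothetical file name
    if local_hash == PUBLISHED_HASH:
        print("Hashes match: the clip is byte-identical to the published original.")
    else:
        print("Hashes differ: the clip has been altered or is not the published original.")
```

A check like this only proves a file matches what a trusted source published; it cannot judge a clip that was never published with a reference hash, which is why Schick also recommends detection software and human scepticism.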

The Age of Surveillance Capitalism (2019) is a New York Times “Notable Book of the Year”. In it, Shoshana Zuboff presents the development of social media and artificial intelligence technologies, their dangers, and ways of living with them. Surveillance capitalism is defined as the set of tools used to gather data on the individual in order to influence future consumer choices (Zuboff i). The method is to use “machine intelligence” to create “prediction products” for future markets (Zuboff 8). The public appears to benefit from algorithms assisting them with advertising suggestions. However, Zuboff presents the case of Pokémon Go, the interactive social media game played out in public. Player data is mined and then used to physically direct you towards businesses the company believes you may wish to patronize. For example, it may extrapolate from your preferences that you would enjoy going to McDonald’s, and so the game map will steer you towards that establishment (Zuboff 311). Applying this technology to games presents itself as harmless fun, but intentionally using your private preferences to direct your future behaviour is insidious. She also discusses the “smart home”, an artificial intelligence system used not only for music and temperature but also tied into video surveillance and security features (Zuboff 5). Hayley Peterson, in a Business Insider article, reports that these systems can be hacked: inhabitants have been subjected to audio messages over their intercoms, had their thermostats tampered with, and had doors locked or unlocked, none of it by the homeowner. Zuboff highlights that all of your information, whether social media or personal security, is being represented online. As to protecting oneself from this intrusive reach, she points to a case in Spain where a family fought for the “right to be forgotten”; the courts agreed that personal information was not the property of the company but of the individual (Zuboff 60). Thus there appears to be some hope that the laws still favour the people rather than the tech oligarchs. But can we trust these behemoth corporations to remain in their place, especially when one such giant, Google, has this as a mantra: “ ‘[T]o organize the world’s information and make it universally accessible and useful’ ” (Zuboff 59)?
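To make the idea of a “prediction product” more concrete, here is a purely illustrative sketch of my own, not a description of how Pokémon Go or any real advertising system is actually built: mined preferences plus a sponsorship bonus decide which venue the software nudges the player towards. All names and numbers are invented.

```python
# Illustrative only: a toy "prediction product" that ranks nearby venues by how well
# they match mined user preferences, with a bonus for paying sponsors.

user_preferences = {"fast_food": 0.9, "coffee": 0.4, "bookstore": 0.1}  # invented data

venues = [
    {"name": "McDonald's", "category": "fast_food", "sponsored": True},
    {"name": "Local Cafe", "category": "coffee", "sponsored": False},
    {"name": "Indie Books", "category": "bookstore", "sponsored": False},
]

def score(venue: dict) -> float:
    """Higher score = more likely the player is routed past this venue."""
    affinity = user_preferences.get(venue["category"], 0.0)
    sponsorship_bonus = 0.5 if venue["sponsored"] else 0.0
    return affinity + sponsorship_bonus

# The "game map" would place objectives at the top-scoring venue.
target = max(venues, key=score)
print(f"Route player towards: {target['name']}")
```

Even in this toy form, the asymmetry Zuboff describes is visible: the player sees only a game objective, never the preference data or the sponsorship weighting that produced it.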

User knowledge must be at the forefront of the battle against the tech oligarchs and their overreach into the personal and private data of the public. Agreements must be clear and transparent as to what is being traded for ‘free’ use of their platforms and of any online or internet-connected goods and services. George Zarkadakis tries to tackle these momentous considerations in his Cyber Republic. The formation of a “Cyber Republic” is a response to the tech oligarchs’ online bullying and corporatism. No longer is the worry just that your bank account might be broken into; it now appears that access to your physical house is at risk. To protect your personal data, he suggests using blockchain technology. User anonymity is protected via security keys: think of a personalized number that only you have access to. Blockchain also makes hacking far more difficult: “Because of their decentralized nature, blockchains are generally more secure than centralized system because they do not have a central point of attack or failure.” (Zarkadakis 105) So blockchain is a viable system for protecting user data, but there are still legal matters to attend to. Human rights are developed and maintained through the use of law. To protect against the tech oligarchs’ future encroachments, ownership of digital property rights needs to be developed, and users would have to be compensated for any agreed-upon use of their data (Zarkadakis 121).
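A toy example may help make the blockchain claim less abstract. The sketch below is my illustration, not Zarkadakis’s design: it builds a minimal hash-linked chain of consent records, where every block stores the hash of its predecessor, so altering any earlier record breaks the links that follow and the tampering is immediately detectable. The record contents are hypothetical.

```python
import hashlib
import json
from dataclasses import dataclass

@dataclass
class Block:
    index: int
    data: str          # e.g., a record of consented data use (hypothetical)
    prev_hash: str     # hash of the previous block, linking the chain

    def hash(self) -> str:
        """Hash the block's contents, including the link to its predecessor."""
        payload = json.dumps(
            {"index": self.index, "data": self.data, "prev_hash": self.prev_hash},
            sort_keys=True,
        )
        return hashlib.sha256(payload.encode()).hexdigest()

def build_chain(records: list[str]) -> list[Block]:
    chain = [Block(0, "genesis", "0" * 64)]
    for i, record in enumerate(records, start=1):
        chain.append(Block(i, record, chain[-1].hash()))
    return chain

def is_valid(chain: list[Block]) -> bool:
    """Valid only if every block still points at its predecessor's current hash."""
    return all(chain[i].prev_hash == chain[i - 1].hash() for i in range(1, len(chain)))

if __name__ == "__main__":
    chain = build_chain(["user consents to ad-preference sharing",
                         "user revokes consent"])
    print(is_valid(chain))   # True: the chain is intact
    chain[1].data = "user consents to everything, forever"  # tamper with a record
    print(is_valid(chain))   # False: the altered block no longer matches its link
```

In a real decentralized ledger, many independent nodes would each hold a copy of such a chain, which is the sense in which Zarkadakis says there is no central point of attack or failure.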

Social media, the internet and artificial intelligence are not going away anytime soon, if ever. These tools, applications and relationships are developing at an ever more rapid pace. It is important to learn how to ride this wave of technology rather than to deny its existence. Learning the pros and cons of its usage is essential to protect oneself from being used by it. Deepfakes are a dangerous new tool that may seem like harmless fun to some, but they can seriously erode our trust in using the internet for direct audio/visual contact. Starting with pre-recorded video, this will surely turn into live streams and then into Turing-test dialogues. While we may end up watching deepfakes, they may in turn be watching and nudging us towards future behaviours we did not even know we wished to pursue. The predetermined future that artificial intelligence is planning for us through social media may not necessarily be in our best interests. To combat these Machiavellian mechanisms, we must gather together as a Cyber Republic. The individualized use of blockchains to protect our personal data from non-consensual use and from hackers, along with lawful data-sharing agreements with tech companies that guarantee financial shares, is vital if our society is to maintain a sense of meaning in reality. Giving our lives over to virtual reality is, in essence, giving our identity over to a machine. And only machines are capable of knowing what they want.

Bibliography

Peterson, Hayley. “Wisconsin couple describe the chilling moment that a hacker cranked up their heat and started talking to them through a Google Nest camera in their kitchen.” Business Insider, September 25, 2019. https://www.businessinsider.com/hacker-breaks-into-smart-home-google-nest-devices-terrorizes-couple-2019-9

Schick, Nina. Deepfakes. New York, NY: Tamang Ventures Ltd., 2020.

Zarkadakis, George. Cyber Republic. Cambridge, MA: The MIT Press, 2019.

Zuboff, Shoshana. The Age of Surveillance Capitalism. New York, NY: Hachette Book Group, 2019.
