KNOWLEDGE

Our communications can be monitored

I tried to write a long message to the Gao Games fans to explain why I was leaving, but I couldn’t. It was too tiring, too complicated, and to be honest, I didn’t have the courage… I know it sucks. I left my draft notes, though. They’re not necessarily very clear, but I hope someone can do something with them. I sorted them by topic, to try to talk about everything. This one is about spying on discussions…

By default, our digital means of communication (email, SMS, social networks…) are not very secure and do not guarantee the confidentiality of our exchanges. This is especially true of cell phones, because the way they work is deliberately kept opaque and they are difficult to modify.

All our conversations can be monitored by companies that make money from this activity (like Gao Games), by states and their administrations, or by malicious people, whether they are close to us or complete strangers. One of the only ways to protect ourselves from these prying eyes is to encrypt our communications.
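To give an idea of what encryption looks like in practice, here is a minimal sketch in Python using the cryptography library (my own choice for illustration; real messaging apps rely on end-to-end protocols that also handle key exchange and authentication):

```python
# Minimal sketch of symmetric encryption with the Python "cryptography" library.
# This only illustrates the idea: real messaging tools use end-to-end protocols
# that also handle key exchange, authentication and forward secrecy.
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # a secret shared only by the correspondents
cipher = Fernet(key)

token = cipher.encrypt(b"Meet me at the library at 5pm")
print(token)                     # gibberish for anyone intercepting the message

print(cipher.decrypt(token))     # only someone holding the key can read it back
```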

Discovering the impact on Nikki’s and others’ lives

Associated Know-How

Our phone makes it possible to know where we are at any time

As soon as it connects to a mobile network, our cell phone can be located relatively precisely by our telephone operator (e.g., Free, Orange, SFR, Bouygues…). At short range, it is also possible to locate cell phones that have Wi-Fi or Bluetooth enabled.

But the most problematic part comes from the applications already installed on it. Many of them use the built-in GPS to transmit our location to companies that exploit or sell this information, including many of the “Net Giants.” These companies and their customers—as well as police and intelligence services—can know where we are, but also where we have been, how many times, how long, with whom… If you want to know more about this topic, you can read this article from the Surveillance Self-Defense site.
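To see why a trail of GPS points is so revealing, here is a small hypothetical sketch in Python (the coordinates and timestamps are invented): a few lines of counting are enough to guess someone’s most frequented place.

```python
# Hypothetical sketch: how a handful of timestamped GPS points already
# reveals someone's habits. All the data below is invented.
from collections import Counter

gps_trace = [
    ("2024-03-01 08:10", 48.857, 2.352),   # morning
    ("2024-03-01 12:30", 48.860, 2.340),   # lunch break
    ("2024-03-01 19:00", 48.857, 2.352),   # evening
    ("2024-03-02 08:05", 48.857, 2.352),
    ("2024-03-02 19:10", 48.857, 2.352),
]

# Round coordinates to ~100 m cells and count the visits per cell.
cells = Counter((round(lat, 3), round(lon, 3)) for _, lat, lon in gps_trace)
place, visits = cells.most_common(1)[0]
print(f"Most visited place: {place} ({visits} visits) -> probably home")
```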

Discovering the impact on Nikki’s and others’ lives

Companies make a lot of money by monitoring us!

How do some companies make money from our data?

The business model of many companies offering online services relies on the collection and use of our personal data. The best known of these companies are some of the GAFAMs—an acronym for Google, Amazon, Facebook, Apple and Microsoft—but they are far from the only ones doing so.

By knowing our behaviors, these companies can push us to do something (buy a product, see a movie, enter a store nearby…). The more these companies know about us, the more money they make!

Most of the devices and services we use can be used to collect more and more information about us: web browsing, cell phones, watches, payment and loyalty cards, televisions, vacuum cleaners, cars, doorbells, refrigerators… So many opportunities for these companies to access small pieces of information which, when put together, allow them to know a lot about us, our ideas and our daily lives. Although these practices are often illegal, they are nevertheless widespread. Fortunately, there are ways to protect ourselves.

Discovering the impact on Nikki’s and others’ lives

Mass surveillance undermines democracy

By monitoring our actions, companies are able to deduce our political opinions and the arguments most likely to convince us to do or not to do something. And even to vote or not to vote for a candidate in a democratic election!

For example, the company Cambridge Analytica is accused of having used data collected in part via Facebook to influence the election of Donald Trump in the United States and the referendum on Britain’s exit from the European Union. In all, Cambridge Analytica is said to have distorted more than 200 elections around the world, on behalf of people able to pay for these services. This type of influence goes against the very functioning of democracy, where each person is supposed to make their choices freely, not through manipulation.

Knowing that we may be watched changes the way we behave. For example, people will refrain from doing or saying something (such as expressing a minority opinion) for fear of being penalized.

Discovering the impact on Nikki’s and others’ lives

Associated Know-How

Digital tools also pollute

Contrary to what we might think, digital uses and tools pollute the planet. Manufacturing devices and equipment, browsing websites accessible 24 hours a day, sending emails, printing at home… all these activities consume energy, produce waste and require large volumes of rare metals!

Digital tools and services (e.g., a video call instead of a plane trip) often add to existing polluting uses instead of replacing them. Studies show that digital technology already emits more greenhouse gases than air travel, and that these emissions will soon reach the level of road transport. Storing huge quantities of data (“Big Data”) and deep learning are also very polluting.

The conditions in which this equipment and its components are manufactured are also often scandalous (slavery, child labor…). Fortunately, solutions exist and we can act to limit the social and environmental consequences of our digital practices!

Discovering the impact on Nikki’s and others’ lives

Mass surveillance facilitates state repression

Mass surveillance in itself is already a violation of our fundamental rights. Unfortunately, it also enables even more serious violations. Non-democratic governments can use it, for example, to silence the media or whistleblowers, whose vigilance acts as a counter-power that strengthens our democracies.
As whistleblower Edward Snowden revealed, the surveillance carried out by the Internet giants directly feeds the surveillance carried out by certain states. Protecting ourselves from one allows us to protect ourselves from the other, but also to protect our loved ones. Indeed, our practices have consequences for the people around us. For some of them, keeping their information secret can be extremely important, even vital.

Nowadays, the Pegasus Program is the most recent example of how some governments monitor other people (heads of state, citizens, journalists) through their cell phones.

Discovering the impact on Ajay’s and others’ lives

Robots can make mistakes too!

Digital tools are becoming more and more influential in our daily lives and assist us in an ever-increasing number of activities. Without us necessarily being aware of it, decisions affecting our lives are made by machines. Or rather… by the people who designed them!

Of course, these machines are far from infallible: their computer programs can malfunction, or the data used to make these choices can be incorrect. In other cases, the people who designed these programs imagined that our bodies and our uses would resemble theirs, and… they were wrong.

All of these potential problems can lead to the same result: people cannot use these tools normally or benefit from the services they provide. Unfortunately, the people who experience these difficulties the most are often those who are already victims of other forms of inequality. For example, female voices and dark skin are less well recognized by many digital tools than male voices and light skin. Annoying, when you want to use your GPS’s voice commands or a facial recognition application!

If you are interested in this topic, you can listen to this podcast.

Discovering the impact on Ally’s and others’ lives

Interfaces designed to mislead and manipulate us

On the Internet and on our phones, many companies use deceptive interfaces, carefully designed to manipulate us. They are known as dark patterns. Through deliberately misleading notifications, layouts and color choices, preselected options, hard-to-find information… it is possible to make us accept things that we would probably have refused if they had been presented clearly.

These companies know how to exploit our brain’s pleasure and reward mechanisms to make us want to use their applications and services, and to spend more and more time on them, sometimes to the point of causing real addictions.

All these efforts are aimed at collecting more and more data about us, to know us better and to push us to consume. For example, they try to determine the moments when we are tired or lacking confidence, and take advantage of this to make us buy things we would have refused the rest of the time. I contributed to this manipulation with the Gao Games; it was a mistake on my part. When I understood it, I left.

To learn more about this, you can watch this video series from ARTE, read this article from the InternetActu.net site, or consult this publication from the Commission nationale de l’informatique et des libertés (CNIL).

Discovering the impact on Amin’s and others’ lives

Companies and administrations are also monitored

The tools of mass surveillance facilitate industrial and economic espionage. Whether to outdo competing companies, steal their discoveries, hold them to ransom, sabotage them, or find out about their customers, many actors have an interest in obtaining companies’ confidential information.

This issue is just as important for administrations and public services. More often than not, it is the people who work there and have legitimate access to this confidential information who unwittingly allow malicious people to get in. And the situation is getting worse as these organizations increasingly rely on telecommuting and open up access to their digital tools from outside their premises.

The massive nature and global scale of these practices, as well as the participation of state intelligence services in them, were demonstrated by the documents revealed by whistleblower Edward Snowden in 2013 and, more recently, in 2021, by the Pegasus Program, through which some governments monitor other people (heads of state, citizens, journalists) via their cell phones.

Thus, protecting ourselves from surveillance allows us to protect the data of the organizations in which we work and to avoid situations that can be catastrophic. You can find many tips to limit these risks on the cybermalveillance.gouv.fr website.

Discovering the impact on Lucas’s and others’ lives

Personal data or collective data?

Although we generally speak of “personal data,” a large part of what the “Net Giants” exploit commercially is collective data. This is what we call social graphs (or social networks): the connections and relationships between people.

By collecting information about someone’s opinions and tastes, these companies can easily deduce those of the people around them. And since communications by nature involve several people, their confidentiality depends on the choices of all participants. If only one recipient of our messages cares less about privacy than the others, even the most protective tools become much less effective.
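To make this concrete, here is a tiny hypothetical sketch in Python (the names and tastes are invented): knowing one person’s interests and who they are connected to is often enough to guess the interests of people who never shared anything themselves.

```python
# Hypothetical social graph: who is connected to whom, plus a few declared tastes.
# Everything here is invented, for illustration only.
friends = {
    "Alice": ["Bob", "Chloe"],
    "Bob":   ["Alice", "Chloe", "David"],
    "Chloe": ["Alice", "Bob"],
    "David": ["Bob"],
}
declared_interests = {"Alice": {"climbing"}, "Bob": {"climbing", "chess"}}

def guess_interests(person):
    """Guess the tastes of someone who shared nothing, from their friends' declared tastes."""
    guesses = set()
    for friend in friends.get(person, []):
        guesses |= declared_interests.get(friend, set())
    return guesses

print(guess_interests("Chloe"))   # {'climbing', 'chess'} although Chloe declared nothing
print(guess_interests("David"))   # {'climbing', 'chess'}
```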

Taking care of our personal data also means taking care of that of our loved ones. For example, we can ask ourselves whether the people who appear in our photos really agree to their being put on the Internet or shared. And this does not only concern human beings: poachers use, for example, the metadata of images posted on the Internet by tourists to locate and hunt animals. To learn more about the collective dimension of “personal” data, you can read this article.
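As a concrete illustration of that last example, many phones embed GPS coordinates in the EXIF metadata of each photo. A rough sketch of reading them with the Pillow library (assuming a recent version and a hypothetical file name) could look like this:

```python
# Sketch: reading the GPS coordinates hidden in a photo's EXIF metadata with Pillow.
# "holiday.jpg" is a hypothetical file name; many phone cameras embed location by default.
from PIL import Image, ExifTags

img = Image.open("holiday.jpg")
gps = img.getexif().get_ifd(ExifTags.IFD.GPSInfo)   # GPS block, if the photo has one

if gps:
    # Tags 1-4 hold latitude/longitude as degrees, minutes, seconds plus N/S and E/W.
    print("Latitude :", gps.get(2), gps.get(1))
    print("Longitude:", gps.get(4), gps.get(3))
else:
    print("No GPS metadata in this photo.")
```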

Discovering the impact on Masako’s and others’ lives

Tools at our service? Are they really?

Using the services of the “Net Giants” has its advantages: they are easy to use, very often free, and popular. But these services serve first and foremost the interests of those companies: the needs of the users come second.

Depending on their wishes and objectives, they can delete our account or some of our posts, restrict access, add or remove features, hide certain posts or, on the contrary, only highlight certain opinions… or even sell the service to another company, or shut it down altogether. These are considerable powers, especially when many people use these services.

Fortunately, there are alternatives that work just as well and give us back the ability to decide! The non-profit Framasoft, for example, offers many alternatives to the Net Giants’ tools on the site degooglisons-internet.org.

Discovering the impact on Rokaya’s and others’ lives

Our data may be used to make decisions that affect us

Throughout the world, companies are building software to make decisions about us based on our personal data. They would like to know who is likely to pay back a bank loan, who is the best fit for a job, who will get sick in the next few years, who might commit a crime…
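To show how little it takes, here is a purely hypothetical sketch in Python of such an automated decision (every rule, threshold and postal code is invented): a few lines written by someone, with their own assumptions baked in, end up deciding for the applicant.

```python
# Purely hypothetical scoring rule for a loan application. All criteria,
# thresholds and postal codes are invented; the point is that the "decision"
# is just code reflecting its authors' assumptions.
def loan_decision(applicant):
    score = 0
    if applicant["income"] > 2000:
        score += 2
    if applicant["years_at_job"] > 3:
        score += 1
    # A "neutral"-looking criterion such as the postal code can act as a proxy
    # for wealth or origin, quietly building discrimination into the rule.
    if applicant["postal_code"] in {"75016", "69006"}:
        score += 1
    return "approved" if score >= 3 else "refused"

print(loan_decision({"income": 2500, "years_at_job": 5, "postal_code": "93200"}))  # approved
print(loan_decision({"income": 2100, "years_at_job": 2, "postal_code": "93200"}))  # refused
```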

These may sound like interesting goals, but in practice they raise many problems. The predictions often turn out to be inaccurate and reveal more about the stereotypes of the people who created these methods than anything else.

For example, people with dark skin are more likely to die from skin cancer: doctors are less trained to detect it on dark skin, and so the advanced artificial intelligence technologies developed to find these types of cancer reproduce this bias, and even amplify it, as shown in this article.

Fortunately, in Europe, fully automated decisions are prohibited if they could have significant consequences for the individuals concerned. To learn more about this, we recommend reading this article on how automated decision systems reinforce inequalities, or this page from the website of the Commission Nationale de l’Informatique et des Libertés (CNIL).

Discovering the impact on Sol’s and others’ lives

Avatar of Gao