Data protection campaigners like Germany's "Digitalcourage" association see inherent risks in big data analysis. They warn that data can be misused, personal rights violated, and people discriminated against as soon as they fall into certain algorithmic categories. After all, there is no guarantee that an algorithm's results match reality: data is always quantitative and, without context, relatively meaningless.
At the same time, the data can contain anything, including sensitive information, and that gives power to those analyzing it. Facebook programs its algorithms so that users automatically see only content that confirms their worldview. Google's search results are likewise shaped by the search engine's algorithms and the user's previous queries. Experts refer to this as the "filter bubble".
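A toy model makes that feedback loop concrete. The Python sketch below is purely illustrative (neither Facebook's nor Google's actual ranking code is public, and all names in it are made up): it ranks items by how well they match topics the user has already engaged with, and each round of clicks reinforces those same topics.

```python
def rank_items(items, user_profile):
    """Score each item by overlap with topics the user engaged with before."""
    return sorted(
        items,
        key=lambda item: -sum(user_profile.get(t, 0) for t in item["topics"]),
    )

def simulate(items, user_profile, rounds=3, feed_size=2):
    """Show the top of the feed each round; clicks reinforce the same topics."""
    for round_no in range(1, rounds + 1):
        feed = rank_items(items, user_profile)[:feed_size]
        for item in feed:                    # the user clicks what is shown,
            for topic in item["topics"]:     # strengthening those topics ...
                user_profile[topic] = user_profile.get(topic, 0) + 1
        print(round_no, [item["id"] for item in feed])

items = [
    {"id": "a", "topics": ["politics-left"]},
    {"id": "b", "topics": ["politics-right"]},
    {"id": "c", "topics": ["sports"]},
    {"id": "d", "topics": ["politics-left", "sports"]},
]
# A single initial click is enough to lock the feed onto one topic cluster.
simulate(items, user_profile={"politics-left": 1})
```

In this toy run, one lopsided click is enough to keep the opposing item ("b") out of the feed in every subsequent round.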
European data protection law does at least specify that data may only be used for the purpose, and within the service, for which it was collected. This is the principle of purpose limitation. For example, a pizzeria's order data may not be linked to a connected car's data in order to push offers to customers as the car drives past.
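In software, purpose limitation can be approximated by tagging each record with the purposes it was collected for and checking that tag on every access. The sketch below is a minimal illustration, not a reference to any real compliance library; the names PurposeBoundRecord and PurposeViolation are hypothetical.

```python
class PurposeViolation(Exception):
    """Raised when data is accessed for a purpose it was not collected for."""

class PurposeBoundRecord:
    """A record tagged with the purposes it was collected for."""

    def __init__(self, data, allowed_purposes):
        self._data = data
        self._allowed = frozenset(allowed_purposes)

    def access(self, purpose):
        # Every read must name its purpose, which is checked against the tag.
        if purpose not in self._allowed:
            raise PurposeViolation(
                f"{purpose!r} not among collected purposes {sorted(self._allowed)}"
            )
        return self._data

# The pizzeria collected the order for fulfilment only ...
order = PurposeBoundRecord(
    {"customer": "Anna", "order": "Margherita"},
    allowed_purposes={"order_fulfilment"},
)

print(order.access("order_fulfilment"))   # permitted purpose: data released

try:
    # ... so linking it to a connected car for drive-by ads is rejected.
    order.access("connected_car_advertising")
except PurposeViolation as err:
    print("blocked:", err)
```

A real system would enforce this at the database or API layer rather than per object, but the core idea is the same: the permitted purpose travels with the data and is checked at every use.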
Many consumers still lack awareness of what it means to divulge information. Everyone should be aware that “only non-collected data is secure data,” says the Digitalcourage association.
Consumer protection organizations do, however, also warn against playing big data and data protection off against each other. The analyses can be hugely beneficial for consumers, for example when connected cars automatically report accidents or route around traffic jams. On the other hand, anyone in possession of such data could use it to manipulate and steer consumers. The Federation of German Consumer Organizations (vzbv) therefore advocates that each individual should have the right to decide which data they divulge and how it may be used.