Anonymized Data is Not Subject to Privacy Regulations

February 19, 2019

Since the EU’s General Data Protection Regulation (GDPR) took effect in May 2018, the erasure requirement, also known as the ‘right to be forgotten’, has been one of its most debated principles. It means that individuals (‘data subjects’) can demand that companies (‘data controllers’) delete all personal data held about them. In an official statement released in December 2018, the Austrian Data Protection Authority ruled that the erasure requirement is satisfied by anonymizing personal information. According to the statement, anonymization makes it impossible to link any information “with reasonable effort” to the data subject.

Anonymized Data Is Not Personal

The judgment from the Austrian Data Protection Authority highlights that anonymized data is by nature not personal and therefore not subject to data privacy regulations. This finding is well grounded in GDPR’s Recital 26, which states that “the principles of data protection should […] not apply to anonymous information”. And that’s where things get really interesting. Beyond the right to erasure, the GDPR also requires companies to obtain consent from data subjects when collecting personal data. In many cases, such as collecting camera data for self-driving cars, this is extremely difficult, if not impossible. With up to twelve cameras on board, an autonomous vehicle records video of its surroundings without asking pedestrians for consent. Yet rich camera data is essential for developing safe autonomous vehicles. The only viable solution is to build a system around the concept of ‘privacy by design’: implementing anonymization directly at the point of recording, especially when collecting public camera data.


Brighter Anonymization for Better Data

In the past, it was not possible to anonymize camera data and at the same time preserve its value for analytics and AI. Removing personal identifiers such as faces and license plates with traditional methods like blurring, pixelation, or masking significantly impairs, if not completely destroys, the value of the data for developing AI models. To resolve this dilemma and enable companies to harness the power of camera data, brighter AI has developed a solution called Deep Natural Anonymization. Using deep learning, the software generates artificial faces and license plates that protect the identity of individuals while keeping the major attributes needed for a full understanding of the visual data. The GIF at the top of this article shows how brighter AI’s technology anonymizes license plates in a way that looks completely natural and allows for both advanced analytics and AI development. By incorporating privacy by design, companies can focus on data collection, processing, analytics, and the development of AI models instead of worrying about data privacy infringements.
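To see why traditional methods destroy value, consider pixelation: it replaces each block of pixels with the block's average, which erases exactly the fine detail (edges, textures, characters) that AI models learn from. A minimal sketch in plain Python illustrates the idea; the function name and the tiny 4x4 grayscale "image" are illustrative examples, not part of brighter AI's software:

```python
def pixelate(image, block=2):
    """Pixelate a grayscale image (list of rows of ints 0-255) by
    replacing each block x block tile with the average of its pixels."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]  # copy so the input stays intact
    for by in range(0, h, block):
        for bx in range(0, w, block):
            # collect the pixels of this tile (clipped at the borders)
            tile = [image[y][x]
                    for y in range(by, min(by + block, h))
                    for x in range(bx, min(bx + block, w))]
            avg = sum(tile) // len(tile)
            for y in range(by, min(by + block, h)):
                for x in range(bx, min(bx + block, w)):
                    out[y][x] = avg
    return out

# A tiny 4x4 "image": the high-contrast detail in the top-left
# corner (a license-plate character, say) is averaged away.
img = [[255,   0, 10, 10],
       [  0, 255, 10, 10],
       [ 20,  20, 30, 30],
       [ 20,  20, 30, 30]]
print(pixelate(img, block=2))
# → [[127, 127, 10, 10], [127, 127, 10, 10], [20, 20, 30, 30], [20, 20, 30, 30]]
```

The checkerboard pattern in the top-left tile collapses to a uniform gray: the region is no longer identifiable, but it also carries no usable signal for a downstream model. Deep Natural Anonymization avoids this trade-off by replacing identifiers with generated stand-ins instead of destroying them.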

If you want to learn more about how brighter AI’s Deep Natural Anonymization empowers companies in automotive, retail, and smart cities to use public camera data while following privacy by design, please get in touch. If you are interested in joining our team, check out our open positions!

Caspar Miller
Head of Regulatory