Posting privacy away

Facial recognition... a new potential data privacy issue.

Published Oct 28, 2023

Durban - The right to remain anonymous in the digital age is becoming harder to exercise as tech firms beef up their ability to mine personal data from the internet.

Experts say that while facial recognition and other technologies become increasingly unavoidable, it is often selfie culture and the urge to share personal information online that threaten our privacy, making us easy targets.

Brett van Niekerk, associate professor of IT at the Durban University of Technology, says all technology, including facial recognition, has limitations, and that how it is used can be the problem.

“Sometimes human rights and ethical problems come up with facial recognition if you use it for excessive surveillance, or for criminal action against someone,” he said.

Questions have been raised globally about the use of artificial intelligence for racial profiling, but Van Niekerk said there could be a reason for this perception.

“There is obviously a need to train the algorithms to recognise people correctly. This is where sometimes the ethics come in. We have a challenge if you train it on a specific demographic, or there is an excessive number of a specific demographic. And it can be age, gender, race, or whatever and if overtrained on that demographic it struggles to detect others. It’s not that the system is racist, it’s just been trained on a data set that is potentially skewed.”
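
His point about skewed training data can be illustrated with a rough sketch. The example below is entirely synthetic and assumed for illustration (random numbers stand in for face data, and a simple scikit-learn classifier stands in for a real recognition system); it is not any vendor's model, but it shows how a training set dominated by one group tends to perform worse on another.

```python
# A rough, illustrative sketch with synthetic numbers, not real face data:
# a classifier trained mostly on "group A" recognises "group B" far less accurately.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n_per_class, shift):
    # Two classes of 5-dimensional points; `shift` makes the group look different.
    X = np.vstack([rng.normal(0.0 + shift, 1.0, (n_per_class, 5)),
                   rng.normal(1.0 + shift, 1.0, (n_per_class, 5))])
    y = np.array([0] * n_per_class + [1] * n_per_class)
    return X, y

# Skewed training data: 950 samples from group A, only 50 from group B.
X_a, y_a = make_group(475, shift=0.0)
X_b, y_b = make_group(25, shift=2.0)
model = LogisticRegression(max_iter=1000).fit(
    np.vstack([X_a, X_b]), np.concatenate([y_a, y_b]))

# Balanced held-out sets reveal the gap between the two groups.
X_a_test, y_a_test = make_group(500, shift=0.0)
X_b_test, y_b_test = make_group(500, shift=2.0)
print("accuracy on group A:", model.score(X_a_test, y_a_test))
print("accuracy on group B:", model.score(X_b_test, y_b_test))
```

In this toy setup the accuracy reported for group A comes out well above that for group B: the system is not “racist”, it has simply seen too little of one group, which is the training-set problem Van Niekerk describes.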

He said in terms of privacy there were various issues to consider under the Protection of Personal Information (Popi) Act.

“If an organisation is using facial recognition, or even fingerprint scanners and so on, those systems need to be secured. In theory, that gives a unique identifier for a person. So if we can get hold of the biometric data, we can potentially use that in other systems.”

He said advanced hackers had the skill to circumvent these systems. Another concern was if you were walking down the street and your face was registered somewhere, then those systems could track “pretty much everything you’re doing”.

Lesiba Seshoka from the Community Schemes Ombud Service said facial recognition technology was used in some gated communities, and in most cases it provided quick, seamless access to residents, who just needed to face a camera before driving through the gates. He said when outsiders tried to enter, security officials would contact the people they were visiting to clear them for entry, and did not necessarily record their biometric data.

Seshoka said there was nothing wrong with collecting this data as long as it was protected in line with the Popi Act. But he warned there was “monetary value” in such information and once the various technologies took off, “we are going to see a revolution” as companies fight to get their hands on your information.

He said the sale of personal data to companies in South Africa was prevalent, often leading to people being harassed with various marketing ploys.

In the US, a little-known tech firm, Clearview AI, has amassed a database of 30 billion images of people around the world, including South Africa, without permission. The UK's data protection regulator has already fined the company for breaching data protection law by collecting the data without consent.

Sadia Rizvi and Lucien Pierce from the law firm PPM Attorneys said in the context of South African data protection law, the use and collection of data in this manner most likely breached the provisions of the Popi Act.

“Popia outlines a number of data protection principles that juristic and government entities must abide by, one of them being that you must have a lawful basis for the collection and use of the data. So, technically, because Clearview AI may have scraped your image off a website and now processes it on its database, if they were operating in South Africa, you could contact them and ask them to remove your image from their database,” they said.

Rizvi and Pierce stressed the importance of questioning why your data was needed and where it would be used. They said that just because you worked for a company, or because the government wanted your data, did not mean those parties were entitled to collect or use it.

“South African data privacy law provides that organisations must have a lawful basis for processing individuals’ personal information.

“If they do not have a lawful reason to do so, then they cannot process that information. Specifically, biometric data (ie, fingerprints, facial identity, retina scanning) is regarded as a special category of information that requires extra protections and can only be processed in limited circumstances.

“As an example, certain banks in South Africa have adopted biometric technology, such as fingerprints, to verify the identity of their customers. The information is used in compliance with certain laws. Their reason for processing this information is then regarded as valid.”

Lance Faranoff is the chief strategy officer and co-founder of iiDENTIFii, the company that handles biometrics for South Africa's largest tier-one banks.

He said as technology advanced and criminals used sophisticated techniques to fake a person’s identity, basic biometrics such as touch ID or fingerprints were no longer sufficient proof that a person was real and alive. He said this was where liveness detection was needed to prove that a human was “genuinely authenticating themselves” and not a bot or deep fake.

He said South Africa had much stricter data and privacy laws than the US and elsewhere, and that banks did not use facial biometrics and proof of liveness only for the initial registration process, but for subsequent authentications as well.

He said once the verification was done your information was deleted and all that remained was a “biometric hash”, a series of noughts and ones, and if anyone managed to get hold of the database they would not be able to “reverse engineer” the process.
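
The “one-way” property he refers to can be sketched with a generic example. The snippet below only illustrates why a stored hash cannot be turned back into the original data; it is not iiDENTIFii's actual scheme (real biometric template protection is considerably more involved than a plain cryptographic hash, because live scans never match bit for bit).

```python
# A generic illustration of one-way hashing; NOT the vendor's actual scheme.
import hashlib

def store_biometric(template: bytes) -> str:
    # Keep only a fixed-length hash, never the raw template itself.
    return hashlib.sha256(template).hexdigest()

raw_template = b"example facial feature vector"  # assumed placeholder data
stored = store_biometric(raw_template)

print(stored)  # 64 hex characters, i.e. 256 bits of "noughts and ones"
# Someone who steals `stored` cannot feasibly reconstruct `raw_template`:
# cryptographic hashes are designed to be practically irreversible.
```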

Faranoff said iiDENTIFii's facial liveness detection used facial features to confirm a person's identity and to verify that they were physically present and alive at the time of the remote identity verification.

“There are actually three things they say you need to do to try to be safe when transacting online. That’s something you are, which is your biometric; something you know, which is like a pin, and something you have ‒ your device. So it’s those three things you always need to be as secure as possible.”
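
Those three factors can be combined in a short sketch. Everything below is an assumed, simplified illustration (made-up function and parameter names, a plain hash for the PIN); a real bank's checks are far stricter, but the logic of requiring “something you are, something you know and something you have” together is the same.

```python
# Illustrative three-factor check; all names and values here are assumptions.
import hashlib
import hmac

def verify_login(face_match: bool, pin_entered: str, stored_pin_hash: str,
                 device_id: str, registered_device: str) -> bool:
    # Grant access only when all three factors check out.
    something_you_are = face_match                                    # biometric
    something_you_know = hmac.compare_digest(                         # PIN
        hashlib.sha256(pin_entered.encode()).hexdigest(), stored_pin_hash)
    something_you_have = hmac.compare_digest(device_id, registered_device)  # device
    return something_you_are and something_you_know and something_you_have

# Example with made-up values: all three factors match, so access is granted.
pin_hash = hashlib.sha256(b"4321").hexdigest()
print(verify_login(True, "4321", pin_hash, "device-abc", "device-abc"))  # True
```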

Rizvi and Pierce warned that information relating to children under 18 requires extra protection, giving the example of a school publishing pupils' photos: “To publish the photos of children on its website, the school must obtain express consent to use the photos. If they do not have this consent, they cannot publish the photos.”

The Independent on Saturday
