The government of South Korea is supplying personal data to developers of face recognition algorithms.
What’s new: The South Korean Ministry of Justice has given data profiles of more than 170 million international and domestic air travelers to unspecified tech companies, the newspaper Hankyoreh reported. Distributing this personal data without consent may violate the country’s privacy laws.
How it works: The government collects data on travelers at Incheon International Airport, the country’s largest airport. It gives facial portraits along with the subjects’ nationality, gender, and age to contractors building a system that would screen people passing through Incheon’s customs and immigration facility. The project began in 2019 and is scheduled for completion in 2022.
- Last year, South Korea passed along data describing 57.6 million Korean citizens and 120 million foreign nationals.
- Another system in development is intended to flag unusual behavior, drawing on videos of travelers in motion and images of atypical conduct.
- The Ministry of Justice argues that South Korea’s Personal Information Protection Act, which bans the collection, use, and disclosure of personal data without prior informed consent, doesn’t require consent if personal data is used for purposes related to the reason it was collected.
- A coalition of civic groups pledged to file a lawsuit on behalf of foreign and domestic individuals whose images were used.
Why it matters: Face recognition is an attractive tool for making travel safer and more efficient. But data is prone to leaking, and face recognition infrastructure can be pressed into service for other, more corruptible purposes. In the South Korean city of Bucheon, some 10,000 cameras originally installed in public places to fight crime are feeding a “smart epidemiological investigation system” that will track individuals who have tested positive for infectious diseases; it is scheduled to begin operation in January 2022, Hankyoreh reported. The city of Ansen is building a system that will alert police when it recognizes emotional expressions that might signal child abuse; it is scheduled to roll out nationwide in 2023. Given what is known about the efficacy of AI systems that recognize emotional expressions, to say nothing of those that identify faces, such projects demand the highest scrutiny.
We’re thinking: Face recognition is a valuable tool in criminal justice, national security, and reunifying trafficked children with their families. Nonetheless, the public has legitimate concerns that such technology invites overreach by governments and commercial interests. In any case, disseminating personal data without consent — and possibly illegally — can only erode the public’s trust in AI systems.