CCTV networks are ‘driving an AI-powered apartheid SA’

Camera systems have so far been installed in predominantly white areas and AI will pick out black people as 'the other'.



In SA’s crime-ridden society, technology – and particularly artificial intelligence (AI) – is helping people get the upper hand over criminals. But it comes with drawbacks.

One of the best ways to combat crime is through visuals. Being able to see and recognise criminals speeds up their apprehension, and this is where closed-circuit television (CCTV) shines.

CCTV is used on highways and in residential areas worldwide by police and security companies, and in South Africa a number of areas already have coverage.

One of the problems with a CCTV system is the human aspect – screens need to be monitored and security officers might miss something by looking away for a moment.

That is where AI comes in, like that being marketed by Avigilon in partnership with Motorola Solutions. Its new ACC7 software uses AI technology to improve surveillance efficiency.

AI is nothing new and is already integrated into many people’s smart devices. Features such as voice or facial recognition make it easier to access your device and your information over the internet.

Avigilon’s new software uses the “learning” capacity of AI. Linked to cameras, it will take about three weeks to learn the normal patterns of a particular neighbourhood. Once it has that baseline knowledge, it can quickly detect anomalies and alert security officers, as well as the police and emergency services.

Even in cases where fugitive criminals change clothing, they can still be identified, as long as the system has captured their facial features. Once CCTV networks are linked to each other, fugitives can be tracked across wide areas through shared data.

Avigilon’s systems are already in use in the UK, which has one of the highest numbers of surveillance cameras in the world.

However, there is a dark side to the technology and a trade-off, because more security often means less privacy.

China uses CCTV equipped with facial recognition software and AI to track and monitor its citizens, particularly the minority Muslim Uighur community.

Michael Kwet is a visiting fellow of the Information Society Project at Yale Law School and has written extensively on personal privacy and technology issues.

In a story in Vice magazine last month, Kwet claimed that smart CCTV networks were “driving an AI-powered apartheid South Africa”.

This is because camera systems have so far been installed in predominantly white areas and AI would pick out black people as “the other” and mark them for further action.

Kwet wrote that “the technology’s use in South Africa recalls a harrowing past”.

He said that during the apartheid era, the government forced black Africans to carry a pass indicating where they lived – the domicile pass, more commonly known as the dompas. Police could instantly tell whether someone “belonged” in an area.

“The system was a staple feature of apartheid, designed to monitor and control Africans from a centralised location,” said Kwet.

He concluded that “by offering communities a system that allows all human activity to be automatically tracked, recorded and algorithmically scrutinised”, the companies involved were “creating a public-private surveillance state”.

When asked about the risks, David Robinson of Motorola said that footage from the surveillance cameras would be stored either in Motorola’s cloud or in one the purchaser chooses, and that it would not be kept or used by Motorola, but would remain with the company or public service using the product.

So the AI would be under the control of the user, rather than the company producing it.

– costam@citizen.co.za
