ATG Digital, an access control and visitor management software company, is preparing a consolidated industry submission to the office of the Information Regulator.
It warns that several provisions in the draft Code of Conduct on the Processing of Personal Information at Gated Accesses require significant refinement before implementation.
The concerns were unpacked during a webinar hosted by ATG Digital in partnership with Civitas and led by legal and privacy specialist Carine Marais.
The 65-page draft privacy rules, published by the Information Regulator on 30 April for public comment, seek to regulate how personal information is processed at gated environments, including estates, office parks, logistics facilities, hospitals and mining operations.
The draft seeks to ensure owners and managers process personal information collected through gated access systems lawfully, responsibly and in line with the Protection of Personal Information Act (POPIA).
The public have until 14 May to provide feedback.
According to the Information Regulator, the draft privacy rules follow mounting public complaints about the excessive collection and storage of personal information in gated environments.
These concerns relate to the use of CCTV systems, biometric systems (including facial recognition technology) and visitor registers, often without clear communication about how the information would be stored, shared or retained.
The proposed privacy rules could soon limit what security guards and estates are allowed to demand from visitors.
During the webinar, Marais urged calm among members of the public distressed by the gazetting of the code, saying there is no need to panic.
“The regulator is not against the industry. The regulator lives in exactly the same South Africa and wants safe and peaceful environments just like the rest of us. The reality is that they may not fully understand how the industry operates, which is why they have asked the industry to guide them,” she said.
Marais explained that the code mandates a specific compliance framework tailored for access control environments, detailing strategy, governance, policies, roles and risk appetite.
“This framework is a regulatory requirement, not an optional exercise. Information officers must be registered with the regulator and lead framework development. The framework breaks down into strategy/governance, tailored privacy policies, daily operational processes, and ongoing reporting and review.”
Vague risk framework
Marais noted there are several “sticky issues” in the current draft code, ranging from automated decision-making and profiling, to technology neutrality and CCTV definitions. One of the most pressing concerns is the draft code’s risk management framework, which she described as “extremely vague”.
“It’s conceptually correct, but practically it’s hollow,” she said. “While the framework correctly allows organisations to assess their own operational environments and risk profiles, it lacks practical implementation guidance.”
Different sectors, including logistics facilities, mining environments and residential estates, face varying operational and security risks, which should inform the type of personal information collected and processed.
Another major issue relates to how the draft code interprets automated decision-making in biometric access systems, she pointed out.
Marais argued that the regulator incorrectly classifies basic biometric template matching as automated decision-making driven by artificial intelligence (AI).
“What they say is if your fingerprints are already registered on the system... and if I get to the gate one morning and I put in my biometric templates, and the system rejects it, that is defined by the Information Regulator as an automated decision, which I fundamentally disagree with. That's not an automated decision by AI. That is a standard operating procedure put into place.”
She added that the draft code appears to conflate ordinary biometric verification with advanced AI-driven behavioural analysis systems.
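Her distinction can be illustrated with a minimal, hypothetical sketch (the function names, feature-vector representation and similarity threshold are illustrative assumptions, not any vendor's actual access-control API): ordinary biometric verification is a fixed, deterministic comparison against an enrolled template, with no model training or behavioural prediction involved.

```python
# Hypothetical sketch of ordinary biometric template matching, assuming
# templates are represented as numeric feature vectors. Names, the cosine
# similarity measure and the 0.9 threshold are illustrative choices only.

def match_score(enrolled: list[float], presented: list[float]) -> float:
    """Cosine similarity between an enrolled and a presented template."""
    dot = sum(a * b for a, b in zip(enrolled, presented))
    norm_e = sum(a * a for a in enrolled) ** 0.5
    norm_p = sum(b * b for b in presented) ** 0.5
    return dot / (norm_e * norm_p)

def grant_access(enrolled: list[float],
                 presented: list[float],
                 threshold: float = 0.9) -> bool:
    # A deterministic rule: the gate opens only if the presented template
    # matches the enrolled one above a fixed threshold. Nothing is learned,
    # inferred or predicted about the person's behaviour.
    return match_score(enrolled, presented) >= threshold
```

On Marais's reading, a rejection at the gate by such a rule is the outcome of a standard operating procedure, which is a different thing from AI-driven systems that evaluate or predict behaviour.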
Marais warned that incorrect interpretation and enforcement of these provisions could have significant implications for the deployment of biometric access technologies across South African businesses and residential estates.
“The code also references unusual behaviour detection, which is genuinely linked to AI and machine learning technologies. However, those systems are generally used in broader operational environments and not typically at the point of access control. If these concepts are interpreted incorrectly, it could have significant implications for how biometric systems are deployed.”
Definitions questioned
Marais also expressed concern over the draft code’s framing of “access control” primarily as a security mechanism.
“The code harps on security, which we know is not the only reason we use access control.
“Access control systems are also used for operational efficiency, visitor management, safety procedures and compliance processes. In some industries, this includes operational requirements, such as breathalyser testing at logistics facilities.
“Safety and privacy are not enemies. They need to work together. And when safety can also include operational efficiency, they are not mutually exclusive. They have to coexist.”
The webinar also highlighted concerns around the draft code’s definition of CCTV, which Marais described as “completely incorrect and draconian”.
While she said the definition itself may not be the central issue, she argued it reflected broader concerns around the level of detail and technical understanding applied during the drafting process.
“The way the definition is framed tells you a lot about the approach, the focus and the detail that this code was written with.”
Marais further argued that the code should remain neutral regarding the use of specific technologies.
“If you need a set of technology, then you need it because your specific environment requires it... It's my view and probably most technology providers’ view that the technology and the code need to be neutral towards the technology. It’s how you implement it that needs to be regulated.”
She also raised concerns about the draft code’s treatment of profiling, arguing the concept was being “completely used and referenced incorrectly”.
Profiling refers to the automated processing of personal information to evaluate, analyse or predict aspects of a person’s behaviour, movements or characteristics.
According to Marais, under the code, profiling is associated with automated decision-making and could have “detrimental effects for businesses” if enforced incorrectly.
Despite these concerns, Marais stressed that the regulator was engaging constructively with industry stakeholders and remained open to refinement.
“Guys, we don’t know. We need you to guide us,” Marais quoted the Information Regulator as saying.
According to Marais, the industry submission to the regulator, being prepared by ATG Digital, will aim to provide practical solutions rather than fragmented criticism.
“Rather than overwhelming the regulator with fragmented comments, we’re providing coherent feedback that gives them solutions as opposed to more problems.”
As part of the process, Marais is preparing a proposed redraft of problematic sections of the code to assist the regulator during the remaining stakeholder engagement period.
“The code has improved significantly from the first draft. That shows the regulator is listening and willing to engage with stakeholders. This next phase is about refining the remaining problem areas before the code is finalised,” she concluded.

