

Programmers, as human beings, have inherent cognitive biases that can ripple into the code and technologies they develop, says Terri Burns, associate product manager at Twitter in San Francisco. Burns delivered the keynote at DevConf, the second annual gathering of local software developers, which took place in Midrand yesterday.
Her talk, titled 'Bad People, Bad Computers', explored how the experiences and assumptions of developers can manifest as bias in the software they build. She explained how computers can be just as racist as human beings.
"Any algorithm can - and often does - simply reproduce the biases inherent in its creator, in the data it's using, or in society at large. For example, Google is more likely to advertise executive-level salaried positions to search engine users if it thinks the user is male... while Harvard researchers found that ads about arrest records were much more likely to appear alongside searches for names thought to belong to a black person versus a white person," she said.
She explained that these incidents don't stem from malicious intent: "It's not that Google is staffed by sexists, for example, but rather that the algorithm is just mirroring the existing gender pay gap."
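Burns's point can be illustrated with a toy sketch. The data, names and function below are entirely hypothetical, but they show how a predictor that simply averages historical records inherits whatever gap those records contain, with no explicit prejudice anywhere in the code:

```python
# Hypothetical historical salary records: the pay gap exists in the data itself.
historical_salaries = {
    # (role, gender) -> observed salaries
    ("executive", "male"): [120_000, 130_000, 125_000],
    ("executive", "female"): [100_000, 105_000, 98_000],
}

def predict_salary(role: str, gender: str) -> float:
    """Predict by averaging past salaries for the same (role, gender) group."""
    samples = historical_salaries[(role, gender)]
    return sum(samples) / len(samples)

# The "algorithm" contains no sexist rule, yet it mirrors the pay gap:
print(predict_salary("executive", "male"))    # 125000.0
print(predict_salary("executive", "female"))  # 101000.0
```

The model is neutral in form but biased in effect, which is precisely the distinction Burns draws: the prejudice lives in the training data, not in any line of the logic.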
She warned that treating code as an unbiased machine risks entrenching a flawed status quo. "As long as we continue to believe that an algorithm is unbiased, we risk reinforcing the status quo in harmful ways. This is really a very difficult and philosophical issue when it comes to being software engineers and it is thinking about how we as people building technology and software products reinforce some of the pre-existing parts of society that maybe we don't necessarily think are the best parts."
"I believe that some of the best computer programmers are continuously thinking about this question and they are building for communities with this consideration in mind. One needs to be cognisant of implicit bias and how it manifests in code."
She called on developers to stay aware of their privilege, and said this awareness encourages diversity in teams, companies, thought, backgrounds and tastes.
Burns says the parameters governing access to software can also be exclusionary, and developers need to think about the communities that are, by default, left out of the process.
"One thing that can really help to foresee some of the undesired cases is building diverse teams. Diverse interests and backgrounds pull together much more knowledge and they can build more inclusive and thoughtful software that is useful."
According to Burns, prejudice can arise through various channels: developers with malicious intent, users who exploit software, or oversight within code that communicates a very different message from what was originally intended.
"For the first time in history we are learning what it means to design and engineer products for everyone in the world," said Burns.
The conference was attended by over 700 developers and members of the media.