In her book Biased, Professor Jennifer Eberhardt describes how a Black police officer pulled his gun on a man in New York, only to realise he was looking at his own reflection. It is a striking anecdote, and one that demonstrates how insidious and powerful unconscious bias can be.
Eberhardt is one of the world’s leading experts on unconscious racial bias, and shows in her book how this affects every sector of society, leading to enormous disparities.
Bias (despite what some like to claim) is not always, or even often, deliberate. It is an embedded trait of human cognition, and one that plays an important role in synthesising the huge amount of data we have to process every day. The truth is that often we just don’t appreciate how vulnerable we are to certain narratives, how influenced we are by the things we’re told and the things we see. And we don’t realise how this is reflected in the way we build and relate to the world around us.
This aspect of human nature has bled into the world of technology. Indeed, few people now would contest that bias exists in tech. Predominantly white male developers have created technology that responds principally to people like them, and so unfairly discriminates against women and people of colour. A 2018 survey showed that around three quarters of all technical jobs were held by men, and the needle hasn’t moved much since.
This is reflected in social media algorithms, automated facial recognition (AFR) and voice assistants, all of them increasingly common features of modern life. And these technological biases exist in the built world, too, where they can be exacerbated in all sorts of ways. The pandemic has prompted building designers to think of ways to give spaces full functionality without requiring their inhabitants to touch anything, and this is likely to mean prioritising the kind of technology that has shown itself to be biased elsewhere, including voice assistants and AFR. It might mean, for example, that gaining entry to lifts, or even to buildings, suddenly becomes a problem.
Aggravating this further is the way we think about data. Automated technology assumes data to be objectively “true”, but bias exists here as well. Hannah Rozenberg’s award-winning graduation project, Building Without Bias, showed that machines associated words including “architect”, “steel” and “cement” with men, and “tearoom”, “kitchen” and “nursery” with women. We will need to overcome this if the buildings of the future are to serve everyone, as they must.
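The kind of word association that Rozenberg’s project surfaced can be sketched in a few lines of code. The vectors below are invented purely for illustration (real embedding models such as word2vec learn hundreds of dimensions from large text corpora), but the mechanism is the same: words are represented as points in space, and a word like “architect” can end up measurably closer to “man” than to “woman” simply because that is the pattern in the training text.

```python
import math

# Toy 3-dimensional "embeddings" (hypothetical values, for illustration only;
# real models learn these numbers from millions of sentences).
vectors = {
    "man":       [0.9, 0.1, 0.3],
    "woman":     [0.1, 0.9, 0.3],
    "architect": [0.8, 0.2, 0.5],
    "kitchen":   [0.2, 0.8, 0.4],
}

def cosine(a, b):
    """Cosine similarity: 1.0 means same direction, 0 means unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def gender_lean(word):
    """Positive means closer to 'man'; negative means closer to 'woman'."""
    return cosine(vectors[word], vectors["man"]) - cosine(vectors[word], vectors["woman"])

for word in ("architect", "kitchen"):
    print(word, round(gender_lean(word), 3))
```

In this toy setup “architect” leans male and “kitchen” leans female, not because anyone programmed that rule, but because it falls out of the geometry of the data. That is precisely why data cannot simply be assumed to be objectively “true”.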
But it is naive to think that bias is always unconscious in the built world. Some developers have been known to install pay-per-minute benches, “pig ears”, and even “anti-homeless” spikes outside luxury properties to keep out “undesirables”, from skateboarders to teenagers to rough sleepers. These unfortunate innovations even come together under a name: “defensive architecture”.
We can’t do much about cynicism in the long run, but we can do something about unconscious bias. Clearly, we need to include as diverse a range of people as possible in the tech-development process, and from the very start, to make sure that the final product is not unfairly prejudiced against one or more groups. This is a lengthy project, since it goes well beyond the world of tech and into wider society and culture. But we can nonetheless, within our own organisations, strive to be inclusive when we create new technology.
And for this to last, we will need to go beyond inclusion to education. The “why” is as important as the “how”. We need to understand the flaws in the way we think—the kind of flaws that Professor Eberhardt illuminates—so that we do not keep making the same mistakes we’ve made in the past, and so that we approach our work in full knowledge that we are prone to bias and that the flaws in our psychology can creep into the technology we fashion for ourselves.
In these unpredictable times, we can predict that the buildings of the near-future will rely even more heavily on technology than they do now. And that means that the clock is ticking, and we need to iron out those deficiencies and vulnerabilities in our tech. The first step is to have no illusions. Then we need to work hard on technological inclusion and education—and show this space to be as forward-thinking as it can be.
By Tom Harmsworth, managing director UK for WeMaintain