In 2015, Nextdoor, the location-based social networking app, gained a reputation as a locus of racial profiling. Users were sending alerts for merely spotting hoodie-wearing Black men walking in their neighborhoods. One damning report described Nextdoor as “a forum for paranoid racialism—the equivalent of the nosy Neighborhood Watch appointee in a gated community.”
In studying the problem, Nextdoor realized that its easy-to-use app was partly to blame.
Creating intuitive, easy-to-use products and interfaces is the domain of user experience designers. Don’t Make Me Think is both the title of a core textbook in the field and many designers’ professional mantra. An approach called “anticipatory design” even takes the virtue of speed to the extreme, promising a system so seamless that consumers never have to make a single choice.
But the very features that are meant to make technology easier and faster to navigate—like shortcuts and auto-complete functions—can also encourage users to make snap judgements and put marginalized populations at risk. This is just one way racial bias is bundled into everyday systems, albeit sometimes unintentionally.
Speaking at a TED conference earlier this month, Jennifer Eberhardt, a social psychologist who helped Nextdoor address its racial profiling problem, explained how designing for speed can sometimes foster injustice. “Categorization and the bias that it seeds allow our brains to make judgments more quickly and efficiently,” she said. “But just as the categories allow us to create quick decisions, they also reinforce bias. The very things that help us see the world also can blind us to it. They render our choices effortless—friction free.”
This concern is partly why, over the past few years, a cadre of UX designers has been championing the utility of interruptions, or “friction,” in design processes. Awakened to the ethical quagmires of big data during the 2016 US elections and the seedy “dark patterns” of social media platforms, many are questioning the norms and metrics of their profession. Now, as the Black Lives Matter movement gains momentum, the urgency of addressing pre-programmed biases in tech products and instruments is even more pronounced. To check these biases, some designers, like Eberhardt, believe that we need more purposeful interruptions to slow the user down.
Working closely with Nextdoor’s in-house teams, Eberhardt recommended adding a checklist for users to go through before they report an incident. The company also tweaked the language of its prompts—changing the oft-cited security slogan, “if you see something, say something,” to “see something suspicious, say something specific”—to get people to reflect on whether they’re actually witnessing criminal behavior. Within a few months, the San Francisco-based start-up was able to curb racial profiling incidents by 75%.
Read the whole story: Quartz