Can Software Help Eliminate Workplace Bias?

Anonymity in the office could work wonders

(Getty Images/Ringer illustration)

A little over three years ago, Amanda Greenberg began to notice subtle patterns in her colleagues’ behavior. It was her job to collect information from the different branches of the Washington, D.C., environmental consulting firm where she worked as a researcher, and she often observed that the only people who spoke up in conference calls and email chains were “louder, more senior voices.” After group conversations, there was always a second wave of feedback: individual emails and phone calls from people who participated but kept quiet.

“The ideas they were sharing with me ended up leading to a lot of project success or proposal success,” Greenberg said. “So I started asking them why are they coming to me versus all of these pools. Their answers just really shocked me, which is that they were intimidated. They said things like, ‘I didn’t feel like there was an opportunity. Everyone’s ideas were better.’”

After enough one-on-one conversations like this, Greenberg decided to address the input problem herself. She left her job in 2013 to launch Baloonr, a company whose internal decision-making software aims to reduce the factors that lead to bias in workplace collaboration. A little over two years later, the startup has been used by companies such as Microsoft and Disney, and recently graduated from the Silicon Valley–based accelerator 500 Startups with a rotation of between 10 and 30 clients.

The software allows managers to launch a message board titled with a question or topic (what Greenberg refers to as a “balloon”). Typically these conversations revolve around product ideas, prioritizing product road maps, or reflecting on projects. Invited participants post anonymous feedback and signal support for suggestions they like by clicking a “pump” button — like a corporate version of a Reddit upvote. Once the conversation has played out, employees can remove their anonymity to reveal whose ideas prevailed. The ideal result is that a manager gathers a more diverse set of ideas and the group evaluates those ideas based on their content, rather than the gender, race, age, or seniority of the person who offered them.
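The workflow described above — anonymous posts, upvote-style “pumps,” and a later identity reveal — can be sketched as a simple data model. This is purely an illustrative assumption of how such a board might be structured; the class and method names (`Balloon`, `post`, `pump`) are hypothetical and do not reflect Baloonr’s actual implementation or API.

```python
from dataclasses import dataclass, field

@dataclass
class Idea:
    author: str          # hidden until the reveal phase
    text: str
    pumps: int = 0       # upvote-style endorsements

@dataclass
class Balloon:
    topic: str
    ideas: list = field(default_factory=list)
    revealed: bool = False

    def post(self, author, text):
        self.ideas.append(Idea(author, text))

    def pump(self, index):
        self.ideas[index].pumps += 1

    def display(self):
        # While anonymous, only content and pump counts are visible,
        # sorted so the most-supported ideas rise to the top.
        return [
            (idea.author if self.revealed else "anonymous", idea.text, idea.pumps)
            for idea in sorted(self.ideas, key=lambda i: -i.pumps)
        ]

b = Balloon("How should we prioritize the road map?")
b.post("junior_analyst", "Ship the reporting fix first")
b.post("vp_product", "Focus on the new dashboard")
b.pump(0)
b.pump(0)
print(b.display())   # ideas shown without authors
b.revealed = True
print(b.display())   # identities attached only after discussion ends
```

The key design point is that authorship is stored from the start but withheld from the view layer until the group opts in to the reveal, so ideas compete on content alone during the discussion.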

The startup is part of a growing number of companies trying to tackle the issue of workplace bias. The technology industry is notoriously white and male. And despite pledging year after year to diversify their workforces, major tech companies like Google, Facebook, and Apple have made little significant change. So the entrepreneurs of Silicon Valley did what they do best: They launched companies to address the problem. Startups like GapJumpers and Triplebyte began experimenting with methods like voice modulation and anonymous skill tests to anonymize recruiting processes and emphasize applicants’ technical skills. There’s also Textio, which targets the hiring process by scanning job listings for language that could discourage certain people from applying to a position (like Magic Leap’s “Wizards Wanted,” to borrow a recent example).

Collectively, these companies’ products have been used by the likes of Evernote, Dolby, Google, Microsoft, Starbucks, Uber, and Yelp to diversify their workforces. And according to data that GapJumpers provided to The New York Times last year, when employers used anonymous skill-based screening processes, a much higher percentage of applicants “who were not white, male, able-bodied people from elite schools” made it to first-round interviews than when companies used traditional résumé-based systems.

Though these companies are paving the way for a more equal hiring process in tech, their impact largely ends once the hire is made. None of them addresses the challenges that underrepresented applicants face after they get the job: the day-to-day discriminatory interactions that have been illustrated by horror story after horror story over the years (the most recent high-profile case being the bizarre experience of former Uber engineer Susan Fowler). Greenberg sees her software as a first step toward addressing the subconscious biases that people encounter in corporate environments. Not only does she hope it addresses prominent issues in workplace culture — like discrediting someone’s contributions based on race, gender, or age — but she says the cloud-based productivity software is also designed to address other collaborative problems, like excessive groupthink, seniority bias, or anchoring onto whatever idea is blurted out first.

“Humans are profoundly biased,” she said. “It’s been studied for over 20 years in detail: All different types of bias and how those plague the workplace. They stall innovation; they waste time.”

Though her company is still growing — she’s in the process of hiring her fourth employee — Greenberg finds the feedback she’s received from clients promising. More often than not, she said, management is taken aback by the results of a conversation once identity is removed from the equation.

“They feel stalled, they know there still might be disagreements, so they use Baloonr,” she said. “They’re sometimes shocked at where that group actually is and … the direction they should actually be moving in.”