Algorithms are like invisible judges that decide our fate

Companies now use ‘voice analysis’ software to determine whether to hire us. And, once we’re employed, to predict if we’ll stay

April 27, 2015 11:28 pm | Updated April 28, 2015 01:02 am IST

Imagine that you’re a contestant in an audition round of The Voice, where you belt out your best I Will Always Love You. A minute passes. No reaction from the celebrity judges. You keep singing. Another minute, still no encouraging smile or nod.

You strain to hit your highest note, pleading with your performance: “Please, please accept me! I am doing my best!” The song ends. No one wants you. Your family bow their heads in shame. Your mom cries. You stand on the stage, alone in the spotlight, heartbroken. A trap door opens beneath your feet and you slide screaming into Adam Levine’s basement torture maze.

Dehumanising

Think that’s bad? In the real world, science has come up with something worse. A company called Jobaline offers “voice profiling” to predict job success based on how candidates sound; its algorithm identifies and analyses over one thousand vocal characteristics by which it categorises job applicants on suitability.

It’s horrible and dehumanising, like all our other profiling (the racial kind is always a big hit!) and reliant on born-in, luck-of-the-genetic-draw factors that we can neither avoid nor control.

This is not the only creepy algorithm system HR departments have been employing to help the company bottom line. Companies like Wal-Mart and Credit Suisse have been crunching data to predict which employees are “flight risks” who are likely to quit (easily remedied with a simple anklet attaching the worker to his or her cash register or cubicle) versus those deemed “sticky,” meaning in-it-for-the-long-haul. The information lets bosses either improve morale or get a head-start on a search for a replacement.

The inventors of such programs often enjoy the unimpeachable, amoral cloak of scientific legitimacy. When it comes to voice profiling, computers are not judging the speakers themselves, only the reactions the speaker’s voice provokes in other (presumably human) listeners.

‘Mechanical judge’

“The algorithm functions as a mechanical judge in a voice-based beauty contest”, wrote Chamorro-Premuzic and Adler in The Harvard Business Review. “Desirable voices are invited to the next round, where they are judged by humans, while undesirable voices are eliminated from the contest.”

The makers of voice profiling programs tout this as a moral achievement. Human beings bring loads of biases into any evaluation; computers are blissfully unaware of differences in race, gender, sexual preference or age. “That’s the beauty of math!” Jobaline CEO Luis Salazar told NPR. “It’s blind.”

The problem is, when applied in a capitalist system already plagued by unfairness and inhumanity, this blindness sounds really, really dangerous. An impersonal computer program gets first say as to who gets to earn money to buy food and who doesn’t, based on calculations too subtle and complex for us to understand. — © Guardian Newspapers Limited, 2015

