60 Minutes | Sci-Tech

I saw a future where you could frown at your digital device: Rana el Kaliouby

Illustration: R. Rajesh  

In September 2001, only a week after the horrific events of 9/11, Rana el Kaliouby, a newly married Egyptian woman, went to the University of Cambridge in the U.K. to pursue her Ph.D. in computer science. With this degree, she would qualify for a tenure-track position at Cairo’s top university. She wanted to be an academic and raise her family in Egypt. That was the plan.

Her doctoral research was in artificial emotional intelligence (Emotion AI), dedicated to training computers to recognise and respond to human emotion. To realise the full potential of her research, she became a tech entrepreneur in the U.S.

In 2018, Fortune magazine named el Kaliouby, co-founder and CEO of Affectiva, Inc., one of the most influential young people in the world of business. In her recently published memoir, Girl Decoded, she writes candidly about her journey from “nice Egyptian girl” to a woman who pursues her own path and career.

Tell us about your mother, a trailblazer in her own right.

My mother was one of the first women computer programmers in the Middle East. At a time when most Egyptian mothers did not work outside the home, she held an important job with the Bank of Kuwait and raised three children, all girls. “Education is the best investment,” she used to say. When I was about eight years old, an uncle told my father, “Ayman, your girls will just get married, so why waste all that money on those fancy schools?”

My father, a progressive man when it came to women’s education, paid no heed to such “advice.” But like most Middle Eastern men, he expected my mother to put her duties as wife and mother first. When school let out, she would be home taking care of her daughters. So, the trailblazer was also a traditional wife.

How did you get interested in computer science?

Both my parents were technologists. They met in a class where my dad taught programming 101. My dad was an early adopter of technology. He bought us an Atari console and made us figure out how to set it up. I was less interested in the gadget and more intrigued by how video games brought us together as a family.

As I went on to study computer science, my interest was once again in the human side of technology. I read the book Affective Computing by Rosalind Picard. She said that if we want smarter computers, we must design machines that can recognise, understand, even express emotions. This idea would change the entire trajectory of my career and my life.

Tell us about your research in Emotion AI.

When we communicate in person, only 10% of our meaning is conveyed through words, while the rest comes from our facial expressions, tone of voice, gestures, and other subtle cues. Our computers are oblivious to such non-verbal cues. But I saw a future where you could frown at your digital device, it could recognise your frustration, and use this input to create a better user experience.

The research goal for my Ph.D. was to teach computers to read facial expressions, to infer the person’s mental state or emotions. For my thesis, I developed a facial expression-reading algorithm, which I called “The Mind Reader.” In 2004, I got a chance to demonstrate this technology to Rosalind (Roz) Picard — the author of Affective Computing — who was visiting our lab in Cambridge. She was impressed with my work and offered me a postdoc position at the Massachusetts Institute of Technology (MIT) in Cambridge, Massachusetts.

How did you become a tech entrepreneur?

I accepted Roz’s offer, moved from the U.K. to the U.S., and went to work in the Affective Computing Group at the MIT Media Lab. I found it exhilarating to interact with the lab’s industrial sponsors and hear about the varied uses they envisioned for “Mind Reader.” I realised we had a unique opportunity to take this technology and scale it in the real world. So, in 2009, Roz and I incorporated a company called Affectiva. To raise money for our startup, we went to Silicon Valley. Some of the men at the venture capital firms recoiled when they heard the word “emotion” in our presentations. But eventually, we found investors who shared our vision.

Today, many industries use Affectiva’s Emotion AI in myriad ways, from market research to mental health and even in cars that check for signs of distraction or drowsiness in drivers.

Your memoir, a book about the coming age of emotive machines, also gives us unexpected glimpses into your culture.

In a way, my education in the science of emotions began on my visits to Cairo for summer vacation, sitting around my grandmother’s dining table. I watched, fascinated, as members of my large extended family talked, gestured with their hands, laughed out loud, interrupted one another, and engaged in lively conversation and debate.

People think of a smile as being all about the mouth, but without those crinkly smile lines around the eyes, a smile is not really a smile. My mother’s older sister wears a niqab — she is covered from head to toe, with a small slit for her eyes — but I can tell if she has had a good day or not, simply by looking at her eyes. With Emotion AI, a computer could identify the same thing, just as accurately as a perceptive human can.

Looking back, I see that it was at my grandmother’s that I began to notice the differences in how emotion is expressed, a fact that I took into account later, when I was designing software that would read and interpret our emotional cues accurately across cultures, not just the one I come from.

The interviewer is a Boston-based science journalist.


Printable version | Oct 1, 2020 2:21:02 PM | https://www.thehindu.com/sci-tech/i-saw-a-future-where-you-could-frown-at-your-digital-device-rana-el-kaliouby/article32112380.ece
