Everlasting Love? Let AI Predict

When a couple is having relationship problems, artificial intelligence can predict from their speech, more accurately than therapists can, whether the partners' love will last.

To figure out whether things will get better or worse, it might be worth just pausing for a second to listen to your partner. Listen closely. When you speak to each other, your voices hold all sorts of information that could reveal the answer. Subtle inflections in tone, the pauses between phrases, the volume at which you speak – it all conveys hidden signals about how you really feel.

Such shifts in tone and emphasis are among the more obvious ways we layer our speech with meaning. But there are many more layers that we add without realizing it.

But there is a way to extract this hidden information from our speech. Researchers have developed artificial intelligence that uses it to predict the future of couples' relationships. The AI is already more accurate at this than professionally trained therapists.

In one study, researchers monitored 134 married couples who had been having difficulties in their relationships. Over two years, the couples each recorded two 10-minute problem-solving sessions: each partner chose a topic about the relationship that was important to them, and the couple discussed it together. The researchers also knew whether each couple's relationship had improved or worsened, and whether the partners were still together, two years later.

Trained therapists watched videos of the recordings. By assessing the way the couples spoke to each other, what they said, and how they looked while they were talking, the therapists made a psychological assessment of each relationship's likely fate.

The researchers also trained an algorithm to analyze the couples' speech. Its job was to work out exactly how the acoustic features of that speech were linked to the strength of the relationship.
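To make the idea concrete, here is a minimal sketch of that kind of mapping: a simple classifier trained on one feature vector per couple and a binary outcome. The feature values, the logistic-regression model and the randomly generated labels are illustrative assumptions, not the study's actual pipeline.

```python
# Minimal sketch: learn a mapping from per-couple acoustic feature vectors
# to a binary outcome (relationship improved or not). All data here is
# synthetic and stands in for real measurements.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# One row per couple: e.g. mean pitch, pitch variability, talk-time balance...
features = rng.standard_normal((134, 6))
# 1 = relationship improved / couple stayed together, 0 = it did not.
outcomes = rng.integers(0, 2, size=134)

model = LogisticRegression(max_iter=1000)

# Cross-validated accuracy gives a rough estimate of predictive performance.
scores = cross_val_score(model, features, outcomes, cv=5, scoring="accuracy")
print("mean accuracy:", scores.mean())
```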

The algorithm was based purely on the sound recordings, without taking into account visual information from the videos. It also ignored the content of the conversations – the words themselves. Instead, it picked up on features like cadence, pitch and how long each partner talked for.
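As a rough illustration of how such features can be pulled out of a recording, the sketch below uses the open-source librosa library to estimate pitch and a crude measure of talking time. The file name, sampling rate and summary statistics are assumptions made for the example, not details from the study.

```python
# Sketch of extracting simple vocal features (pitch and talk time) from one
# partner's audio channel. File name and parameters are hypothetical.
import numpy as np
import librosa

# Load a recorded conversation channel (hypothetical file) at 16 kHz.
audio, sr = librosa.load("session_partner_a.wav", sr=16000)

# Fundamental frequency (pitch) track via librosa's pYIN implementation.
f0, voiced_flag, voiced_prob = librosa.pyin(
    audio,
    fmin=librosa.note_to_hz("C2"),
    fmax=librosa.note_to_hz("C6"),
    sr=sr,
)

# Summary statistics over voiced frames: average pitch and its variability.
mean_pitch = np.nanmean(f0)
pitch_variability = np.nanstd(f0)

# Crude "talk time": fraction of frames in which the voice is detected.
talk_fraction = np.mean(voiced_flag)

print(mean_pitch, pitch_variability, talk_fraction)
```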

Surprisingly, the algorithm also picked up on speech cues beyond human perception. These features are almost impossible to describe because we are not normally aware of them – one example is spectral tilt, a measure of how a voice's energy is distributed across low and high frequencies.
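Spectral tilt is commonly described as the slope of the speech spectrum – how quickly a voice's energy falls off toward higher frequencies. The sketch below shows one simplified way to compute such a slope; the study's exact formulation is not given here, so treat this only as an illustration.

```python
# Simplified spectral tilt: slope of a straight line fitted to the log power
# spectrum of a short audio frame (dB per Hz).
import numpy as np

def spectral_tilt(frame: np.ndarray, sample_rate: int) -> float:
    """Return the slope of a linear fit to the frame's log power spectrum."""
    spectrum = np.abs(np.fft.rfft(frame)) ** 2            # power spectrum
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)
    log_power = 10 * np.log10(spectrum + 1e-12)           # avoid log of zero
    slope, _intercept = np.polyfit(freqs, log_power, deg=1)
    return slope

# Example on a synthetic 30 ms frame of noise (a stand-in for real speech).
rng = np.random.default_rng(0)
frame = rng.standard_normal(480)                           # 30 ms at 16 kHz
print(spectral_tilt(frame, sample_rate=16000))
```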

After being trained on the couples' recordings, the algorithm became marginally better than the therapists at predicting whether couples would stay together: it was 79.3 percent accurate.
