I thought AI was emotionally constipated until it beat me at feelings. Now deal with it.

Look, I have been peddling the same tired narrative for a few years now. You know the one, where I say (sometimes literally) that AI is just a parrot and a huge word-prediction machine, nothing more than a supercharged calculator that couldn’t recognize real human emotion if it got smacked in the face with a wet fish.

“Oh sure, it can solve calculus problems beyond fast, but it will never understand why I cried for the first time in my life while watching a movie (Straw)” – well, that’s what I kept telling myself anyway.

Turns out I was about as wrong as Noel Skum doing his Adolf salute at a Democratic party convention.

Much to my surprise – yes, actual shock even, you know, of the eyebrow-raising kind – some eggheads with nothing better to do with their lives than researching AI (again) did some research and wrote a scientific paper that is making me question everything I thought I knew about my little Oompa Loompa friends, with whom I have a kind-of love-hate relationship.

Why you saying that, Marco?

Well, they were supposed to stick to, like, manipulating spreadsheets and not emotional manipulation, you know.

So, probably with the help of their own Oompa, their study got published in Communications Psychology, and it had the catchy and totally descriptive title “Large language models are proficient in solving and creating emotional intelligence tests” – which is their way of saying what we would say: “Holy shitskebab, them bots are getting feelings now too”.

In the next couple of paragraphs I’m not gonna tear down what these researchers found, because as you know, it is “Happiness Week” at TechTonic Shifts, which means that I have buried my cynical thoughts (and my self-esteem) under a huge pile of Fluoxetine, mixed with good ol’ Jack, which is – as you know – the unofficial king-o-cocktails for people pretending they’re “just fine”.


More rants after the commercial break:

  1. Comment, or share the article; that will really help spread the word 🙌
  2. Connect with me on Linkedin 🙏
  3. Subscribe to TechTonic Shifts to get your daily dose of tech 📰
  4. Visit the TechTonic Shifts blog, full of slop I know you will like!

The one thing we had left

So what is the one and only skill that is supposed to separate us carbohydrate-based hoomans from the silicon-based competition? Most people would concur with me when I say that it’s Emotional Intelligence – or E.I. if you’re too lazy to pronounce the whole GD thing.

Now, E.I. is not about being touchy-feely and offering soggy hugs when someone is having a bad day. It is actually four distinct skills that most people suck at.

For one, you gotta recognize emotions in yourself and others without needing a manual (unlike some of my semi-savant readers, including myself). You need to understand why those feelings happen and what kind of wreckage they will cause next, and managing those emotional rollercoasters comes third. And finally, you need to be able to respond to other people’s emotions without making everything worse than it already was.

Ahem.

This invisible software that we have developed runs every wedding that didn’t end in tears and a fistfight, every team project that doesn’t result in a brawl on Slack, and every friendship that doesn’t end in restraining orders.

It is what separates the leaders from the dictators, friends from energy vampires, and therapists from that ‘friend’ on Facebook who always gives terrible advice, even when you don’t need it.

So last month a team of researchers set out to put AI through the emotional intelligence wringer using actual scientific tests (aren’t there any other kind?). They grabbed five gold-standard EI assessments – the real deal, not some BuzzFeed quiz – and fed them to six of the world’s top Large Language Models, including all the ones you use and the ones you don’t.

The results were surprising – to say the least – actually, they were more shocking than finding out your wife has a secret OnlyFans account, and you apparently are her best paying customer.


AI took the test and used our answer sheets as toilet paper

When regular humans take an emotional intelligence test, we average around 56% accuracy, based on the original validation studies. Not exactly stellar performance, but hey, emotions are complicated and most of us are emotional wrecks anyway.

But strangely enough, the AI models scored an average of 81% accuracy.

Let that marinate in your brain for a hot minute.

🕐

🕑

🕒

🕓

🕔

🕕

🕖

🕗

🕘

🕙

🕚

🕛

Since my brain is somewhat slower than y’all’s, I am now fully marinated and totally enlightened. Accuracy in an EI test is not about guessing what makes you cry at work on Monday morning. Nah, then it would be too easy to pass. It is about picking the most emotionally effective (or the social equivalent) response to real-world(ish) scenarios.

Apparently what these tests measure (especially the good ones) are things like “can you recognize emotions in faecal expressions” (pun intended), or “can you predict what emotions someone will have in the following situation” – you know, that kind of wishy-washy stuff.

And the thing is that the AIs performed more than one standard deviation above the human average (just Google ‘statistics’ and ‘normal distribution’ and have fun), and some models like ChatGPT-o1 and DeepSeek V3 were over two standard deviations higher than our pathetic human scores.

Ok, in statistical terms, this means the AI isn’t bringing a knife to a gunfight – it’s bringing a rocket launcher to a water balloon fight.
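For the stats-curious, here’s a back-of-the-napkin sketch of what “two standard deviations above the human average” cashes out to. The 56% and 81% figures are the ones quoted above; the 12-point standard deviation is purely my illustrative assumption, not a number from the paper:

```python
from statistics import NormalDist

# Figures from the article: human mean ~56%, AI average ~81%.
# The 12-point standard deviation is an ASSUMED number, for illustration only.
human_mean = 56.0
human_sd = 12.0
ai_mean = 81.0

# z-score: how many standard deviations above the human average the AI sits
z = (ai_mean - human_mean) / human_sd

# If scores were roughly normal, this is the fraction of humans the AI outscores
percentile = NormalDist(mu=human_mean, sigma=human_sd).cdf(ai_mean)

print(f"z-score: {z:.2f}")
print(f"outscores ~{percentile:.0%} of humans")
```

With those (made-up) spread numbers, the z-score lands around 2, i.e. the AI average would sit above roughly 98% of human test-takers – rocket-launcher-at-a-water-balloon-fight territory indeed.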

The AI was apparently showing a consistent, accurate understanding of human emotions that put me to shame in ways I didn’t think were possible.


What does this emotional torture actually look like

The tests they ran weren’t your typical multiple-choice BuzzFeed-type questions about feelings and rainbows. The tests they chose to run throw you into real-life emotional chaos and expect you to navigate it without freakin’ everything up.

To give you a taste of the emotional torture they put the AI through, here’s a sample from the Situational Test of Emotion Management that will make you realize just how emotionally incompetent most of your male friends really are:

“Surbhi starts a new job where she doesn’t know anyone and finds that no one is particularly friendly. What action would be the most effective for Surbhi?”

My gut feeling would be to scream out loud “Why is everyone here dead inside and allergic to eye contact?” or “Do I look like the ghost of someone’s ex or what?”, but hey, that’s me, and I know for a fact that I would not pass this test. My scale tips to the right on the IQ curve and to the left on the EQ.

For real this time, ok?

Ok, ok. . . The real test options ranged from “have fun with friends outside work” to “concentrate on doing her work well” to “make an effort to talk to people and be friendly herself” to “leave the job and find one with a better environment”.

🤢 Barrrrf 🤮

Now that is why I only feel those pesky emotions when watching series like ‘Band of Brothers’ or ‘The Pacific’ (when you know, you know!).

Anyways.

The ‘correct’ answer requires understanding of things (nobody cares about) like social anxiety (there’s a pill for that), proactive problem-solving (there’s an AI for that), and long-term strategy making without turning into a complete social disaster. That’s where awkward lil’ me would personally fail. It is a judgment call that most humans would probably screw up, but strangely enough the AI is now making these calls better than we are – and that is impressive and deeply disturbing.


From test-taker to test-maker

Hold on! Before you leave and call it a day – I need you to read this.

(cue infomercial voice)

They did not stop at letting the AI solve our human tests.

Oh no, no, they had to go and ask ChatGPT-4 to create entirely new emotional intelligence tests from scratch. They told it to write new scenarios, new characters, and new emotionally intelligent response options. And the bastard actually did it, without whining or crying – you know, real hooman emotion stuff.

Then these masochistic researchers gave the AI-generated tests to hundreds of human participants to see if the robot could match the quality of tests that expert psychologists spent years carefully crafting with their university degrees and decades of experience.

And you know what?

The AI-generated tests were statistically equivalent to the original human-made ones in every way that mattered.

Shock-and-awe!

Same level of difficulty. Humans found them just as clear and realistic. They measured the same underlying emotional skills.

This is what I call a “holy shit, we’re all obsolete” moment for an entire field of science.

Yesterday it was the surgeons, and today it’s the psychometricians.

Psychometric test development usually takes forever, costs more than my yearly spend on gadgetry, and requires teams of experts droning on and on about methodology until they’re blue in the face. And ChatGPT-4 did it on command with just a few prompts, prolly while helping someone else with their homework and spitting out Yeti shorts for Facebook and TikTok.


Does AI have feelings now or are we just simply fcuked

This is the million-dollar question that kept me up the night I read the paper, staring at the ceiling and wondering if my laptop was secretly monitoring my strange behavior.

The guys who did the study say that AI doesn’t actually feel emotions (ok bruv, we knew that) – it just understands them better than we do, and to me this is somehow even more unsettling. The researchers make a distinction between two types of empathy that sounds like something my shrink would explain while slowly destroying my will to live…

Cognitive Empathy is understanding what someone else is feeling through intellectual analysis – basically emotional detective work. C’est simple mon ami.

Affective Empathy is actually feeling what someone else feels – the shared emotional experience that makes you cry during dog movies. Ah, so that’s what I’m lacking – c’est not so simple.

This research shows that AI has mastered cognitive empathy so thoroughly that it makes the rest of us look like emotionally stunted cavemen. These things generate responses that show they have accurate knowledge about human emotions and how to manage them, without actually experiencing a single feeling themselves.

Now that’s what I call a true feat of engineering.

A mental health chatbot doesn’t need to feel your angst to offer scientifically-backed coping strategies. A project management tool doesn’t need to experience your team’s frustration to suggest better communication methods. It just needs to understand the patterns and spit out the right response and that is exactly what it’s doing.

But before you decide to cut out your limbic system altogether, there’s one little thing. The AI’s performance wasn’t perfect – the study found that its scenarios were slightly less diverse than human-created ones. But hey, those are just minor flaws, man, in an otherwise mind-blowing demonstration of artificial emotional intelligence (truth be told), and it is making me question everything I thought I knew about consciousness and feelings.


This ain’t academic masturbation you know

This scientific experiment (no quotes this time) isn’t some nerdy Dexter’s Lab experiment that won’t affect your daily life. If you think that, you’re more delusional than someone who thinks they’ll actually use that gym membership they bought in January (remember?).

This has massive real-world implications. Just let me go through them one by one (as mentioned in their research paper)…

Mental health got a glow-up

In the not-so-distant future you will have a shrink sittin’ in your pocket all day, errday, and this AI homie actually knows what it’s talkin’ about. Moreover, it is anonymous, and that, my dear friends, is where the difference lies. Well, that and the fact that us plebs ain’t gonna be able to afford a ‘real’ shrink no more. That’s reserved for the 1%. And this new study says this ain’t just some bougie sci-fi movie.

Bosses ’bout to lose their schmucky powas

Oy vey, no more dealing with bosses who learned ‘management’ from – what’s his name – oh yeah, Simon Cynic and Daniel Overthink, and that one TED Talk they watched halfway through. AI is gonna slide right into their inbox, and there it is going to whisper some real ‘how to handle humans’ tips on things like tough talks and not makin’ every team meetup feel like forced group therapy. We can all kiss those awkward-ass team-building trust exercises goodbye, thank the Lord.

Psychology ’bout to get wrecked

Traditional psych offices are gonna get disrupted harder than Blockbuster – you know, the busted video-rental place that your uncle Saul still complains about closing down.

And if you think it’s going to take a while before this gets real – think again.

Just about a month ago, I was positively surprised by a presentation by a lady from Finland who built a DSM-V chatbot for youngsters, based on ChatGPT.

Lemme mansplain. The DSM is like the manual for all crazies out there, and if you would’ve axed me half a year ago if this was even possible, I would have laughed in your face. But a few months later, it’s for real. Chatbot + some guardrails + GDPR = free Medicare, people, and fewer suicides amongst teens (that’s the goal of it all).

And yes, I am also part of a team that is building a chatbot to support victims 24/7. It’s not easy-peasy, take that from me, but the technology is there and it is possible.

You just need to trust it.

And trust happens over time.

I hope.

Just three years ago, AI bots were about as emotionally intelligent as me, and now they are outperforming average humans on one of the most uniquely human skills I thought we had left in our arsenal to compete against the bots.

Bots like Hume.ai, for instance, are specifically built to recognize and show emotions.

When you let it tell you a story, or ask it questions, it is sometimes hard to distinguish it from a human being. If that doesn’t feel like a revolution that’s already happening while you’re still figuring out how to use ChatGPT properly, I don’t know what reality you are living in. But one thing is for sure – you are already so far behind that it’s going to be hard to catch up…

Oh sorry, I forgot it was supposed to be happiness week.

Bye now.

Signing off with a smirk

Marco

I build AI by day and warn about it by night. I call it job security. Let’s keep smashing delusions with truth. We are the chaos. We are the firewall. We are Big Tech’s PR nightmare.


Think a friend would enjoy this too? Share the newsletter and let them join the conversation. Google and LinkedIn appreciate your likes by making my articles available to more readers.

To keep you doomscrolling 👇

  1. The AI kill switch. A PR stunt or a real solution? | LinkedIn
  2. ‘Doomsday clock’: it is 89 seconds to midnight | LinkedIn
  3. AIs dirty little secret. The human cost of ‘automated’ systems | LinkedIn
  4. Open-Source AI. How ‘open’ became a four-letter word | LinkedIn
  5. One project Stargate please. That’ll be $500 Billion, sir. Would you like a bag with that? | LinkedIn
  6. The Paris AI Action summit. 500 billion just for “ethical AI” | LinkedIn
  7. People are building Tarpits to trap and trick AI scrapers | LinkedIn
  8. The first written warning about AI doom dates back to 1863 | LinkedIn
  9. How I quit chasing every AI trend (and finally got my sh** together) | LinkedIn
  10. The dark visitors lurking in your digital shadows | LinkedIn
  11. Understanding AI hallucinations | LinkedIn
  12. Sam’s glow-in-the-dark ambition | LinkedIn
  13. The $95 million apology for Siri’s secret recordings | LinkedIn
  14. Prediction: OpenAI will go public, and here comes the greedy shitshow | LinkedIn
  15. Devin the first “AI software engineer” is useless. | LinkedIn
  16. Self-replicating AI signals a dangerous new era | LinkedIn
  17. Bill says: only three jobs will survive | LinkedIn
  18. The AI forged in darkness | LinkedIn

