AI is not there yet: Case #1023

I like Joanna Stern, so I’m not here to bust on her or the WSJ (in this particular context). But this is the type of thing that angers me when we’re talking about AI and Apple being “behind”. It’s something John Siracusa has talked about often on ATP, so I’m not the first to make this complaint.

Stern recently sat down with Apple’s Craig Federighi to talk about Apple Intelligence and its “slow” rollout. Let’s look at the example that got to me. Go to the 5:33 mark of this video when Stern begins talking about OpenAI’s Advanced Voice Mode:

She asks it a simple question: “How does Craig Federighi do his hair?” The answer she received does trail off in the video, but it begins like this:

Craig Federighi has a full silver mane, typically brushed back…

She then contrasts this with what Siri came back with when asked the same question:

I found this on the web.

Ok, I see what she’s trying to prove here, but it’s not as “gotcha” as she thinks it is. Yes, the OpenAI model gives her back a conversational answer, and not a list of choices that were found (on the web) that could be the answer. People are frustrated with Siri for this type of thing. I get it.

But read the OpenAI answer again. It doesn’t answer the actual question! It even uses the word “typically”. The question was “How does Craig Federighi do his hair?” Not “How does a typical older man with voluminous gray hair do his hair?”

It’s not answering the damn question!!

As viewers, we are supposed to marvel: “How does it know?” “That’s creepily accurate!” No, it’s not! It’s generalizing an answer based on probabilities about “most men”. It’s not answering the question in regard to Craig himself.

It’s bullshit. And yet, everyone gets to hit on Apple for “missing the boat” or “being behind”. I’d rather they be behind than make shit up or generalize an answer to the point of being useless. We have enough of that crap in the media nowadays. I don’t need it coming from my phone, too.

Lee Feagin @leefeagin