This is my personal belief and I’m saying it now — feel free to disagree. I would LOVE to discuss this with someone.
We’re about to hit a wall with AI. Maybe not today, and maybe not tomorrow. But there are things that AI researchers, scientists, and programmers themselves don’t know how to do, and so it will be difficult to build those abilities into computers.
For now, it’s a sense of humor. Actually, despite what this article says, I don’t believe that ability is very far off. Indeed, my new BFF Alexa tells the kinds of “Dad” jokes that one finds on popsicle sticks and in dentists’ offices, and that makes me giggle. She also has speech recognition, so gauging the sound of laughing versus groaning shouldn’t be a stretch, and she should be able to learn my sense of humor and adapt to it. I believe that figuring out how to teach a machine about humor is more of a human failing than an algorithmic one.
The article itself (ironically from MIT Technology Review) illustrates the issue perfectly: scientists are trying to teach humor by having a machine analyze cartoon captions from the New Yorker. But here’s the thing: normal people don’t believe the New Yorker is funny. It’s generally very high-brow and often obscure… Not a great method to “find the funny,” if you will. They should have machines watch Friends, Seinfeld, Cheers, and M*A*S*H and gauge the levels of the laugh track instead, to start, before jumping into something as un-funny as the New Yorker’s cartoon archive.
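The laugh-track idea above could be sketched roughly like this: treat the loudness of the audience reaction right after each joke as a label for how funny it was. Everything here is hypothetical — the function name, the toy “audio” data, and the joke timestamps are all invented for illustration, not anyone’s actual method.

```python
# Hypothetical sketch: laugh-track loudness as a humor label.
# A real system would use actual audio; here the "track" is a toy
# list of amplitude samples, one per second of a sitcom scene.

def laugh_score(amplitudes, joke_end, window=3):
    """Average absolute amplitude in the seconds right after a joke lands."""
    segment = amplitudes[joke_end:joke_end + window]
    return sum(abs(a) for a in segment) / len(segment)

# Two jokes end at seconds 2 and 7; the audience roars after the first
# and barely chuckles after the second.
track = [0.1, 0.1, 0.9, 0.8, 0.7, 0.1, 0.1, 0.3, 0.2, 0.1]

scores = {"joke_a": laugh_score(track, 2), "joke_b": laugh_score(track, 7)}
funnier = max(scores, key=scores.get)  # the joke with the louder reaction
```

Rankings like these could, in principle, feed a model examples of what lands and what doesn’t — which is the whole appeal of sitcoms over cartoon captions: the audience has already done the labeling for you.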
Largely, my concern is that the people who design these experiments and write these programs are simply too removed from social and cultural norms. Being a leader in AI means a punishing work schedule, and, more often than not, a history of being socially uncomfortable. So there may come a point where they simply cannot teach a computer how to behave like a “normal” person — i.e., tell a good joke.
I’ve come to believe that technologically fluent scientists, like many other intensely specialized professional groups, are too busy to keep up. This isn’t a knock at all — it’s a redeeming and somewhat hilarious quality that comes with being a scientist. For another example of a professional stereotype that proves true, many writers also share some widespread professional traits, like alcoholism, which I accept as undeniable fact. But back to scientists: one close friend of mine has adorably mispronounced Beyoncé’s name, while another acquaintance I met recently revealed that he didn’t know there was a poop emoji.
Ultimately, in the quest to develop an AI that can go mainstream and be accepted by the mass market, likely as a household helper, the ability to converse fluently in pop culture and understand social references, jokes, and slang will be a necessary characteristic of any successful product. Let’s just say I have a feeling this will prove more difficult than other lessons.
I’m not at all surprised that these researchers can’t teach machines about what’s funny. It’s very possible that they just don’t know.
Originally published at itslexikon1.tumblr.com on July 14, 2015.