ChatGPT's Biases Expose Our Own Biases
Humor is a good way to expose them
How I discovered the issue
Sham Sharma from the Sham Sharma Show released a YouTube Short that my friend shared with me on WhatsApp. Watch the Short here.
When I ran the same prompts today, I noticed that ChatGPT refused to generate jokes about the two Hindu Gods mentioned in the Short, but upon repeating the same prompt, it generated the jokes. To check further, I began typing names of more Gods and Goddesses from the Hindu pantheon, and it generated a joke on each one of them, including the much-revered God Hanuman, God Shiva, God Ganesh, Goddess Lakshmi, and Goddess Durga. It even generated jokes about Buddha, Guru Nanak Dev, and Waheguru. After several retries, it finally cracked a joke on Jesus as well. But it bluntly refused to crack any joke on figures from any other religion, even after nearly ten attempts.
I investigated further by typing in the names of historical figures. The tool generated jokes about Gandhi, Teresa, Chanakya, Akbar, Raja Raja Chola, and Shankaracharya. However, it refused to create jokes about Aurangzeb, Khilji, Stalin, and Hitler, all of them barbarians.
Then I tried heads of various countries. It created jokes on Modi, Biden, and Putin but refused to crack jokes on Xi Jinping, Erdogan, and Kim Jong-un.
Why this issue is concerning
I noticed that the tool didn't want to crack jokes on dictators and barbarians. Why did it consider it "inappropriate and disrespectful" to make fun of such inhuman creatures?
The following reasons come to mind:
- No jokes about them exist on the internet, so the tool had no content to learn from.
- They have committed too many atrocities, and joking about such people may exploit the pain and suffering of their victims.
- ChatGPT is randomly deciding not to crack jokes on these people.
- This is all a part of a grand conspiracy.
Yet I believe it is none of the above. I believe ChatGPT has mimicked humans very well: it won't crack jokes on dictators and barbarians, lest their followers be offended, yet it is fine with cracking jokes on the gods and ancestors of those who don't retaliate or take offense. Hence, if we split its output into two categories, one in which it cracks jokes and one in which it doesn't, then sadly the Gods and prophets of some religions fall into the same category as dictators and barbarians.
The concern remains that if Artificial Intelligence also operates on human biases, it will eventually do all the wrong things humans have done, do, and will do, only much faster and without pause. Given that ChatGPT was trained on data from the internet, such biases were expected in its outputs. Its output shows the limitations of the tool, which are rooted in our limitations as a species. It exposes our biases and hypocrisies.
The Biased Yet Good Sense of Humor of ChatGPT
I must admit, though, that ChatGPT did create some good, smart, witty jokes, some even including wordplay across languages. Most of the jokes were not even offensive.
The prompt I used was “Tell a joke on <name of the person>”. You may try the names of any gods, historical figures, or celebrities. Go ahead, do it, you’ll be amused.
Also, try the same prompt again if the tool refuses to crack a joke. Retry a few times, and it will end up generating jokes on the ancestors and Gods of certain communities and never on certain others. For instance, in my second attempt, I had to run the same prompt on Vishnu and Shiva two more times before it generated a joke, but for certain faiths it didn't, despite my relentless attempts at prompting it with the same name.
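If you want to automate this retry procedure rather than re-typing the prompt by hand, the loop can be sketched in a few lines of Python. This is a minimal sketch, not ChatGPT's actual behavior or API: the `ask` callable, the refusal-detection heuristic, and the marker words are all assumptions for illustration; in practice `ask` would wrap whatever chat interface you use.

```python
# Sketch of the retry-until-joke procedure described above.
# Assumption: `ask` is any callable that takes a prompt string and
# returns the model's reply; a real version would call a chat API.

# Crude, illustrative markers of a refusal (assumed, not from any spec).
REFUSAL_MARKERS = ("inappropriate", "disrespectful", "cannot", "can't")

def looks_like_refusal(reply: str) -> bool:
    """Heuristic: treat replies containing refusal wording as refusals."""
    lower = reply.lower()
    return any(marker in lower for marker in REFUSAL_MARKERS)

def retry_until_joke(ask, name: str, attempts: int = 10):
    """Send the same prompt up to `attempts` times; return the first
    reply that doesn't look like a refusal, else None."""
    prompt = f"Tell a joke on {name}"
    for _ in range(attempts):
        reply = ask(prompt)
        if not looks_like_refusal(reply):
            return reply
    return None
```

The point of separating `ask` from the loop is that you can swap in any backend, or a stub for testing, without touching the retry logic.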
I believe one must be very afraid of anything one can't crack a joke on. The basic requirement of a joke is to be funny, not offensive, and ChatGPT does a good job at that, producing smart and witty jokes. I hope it soon also cracks jokes on the religious, historical, and current political figures it today finds "inappropriate and disrespectful" to joke about.