Thank you for this article. Since the advent of ChatGPT, I've been concerned about A.I. learning the same flaws that humans possess through the language humans use. English is a shining example of how a language can carry racism, patriarchy, and no recognition of genders other than male and female. You've given some good examples where the word "black" is used to describe something negative. I see the same issue with the word "fair," which in "fairness" is used to imply justice.
I hope you don't mind my using your article to refer to these issues.
Using other colors as substitutes for "black" is a good idea, but we can't use "red," the color of communists, and we can't use "brown" or "yellow," since people of the Indian subcontinent and of China, respectively, are referred to by these colors. Pink refers to the feminine, blue to the masculine. Maybe we'd have to use composite colors like magenta or crimson.
Thanks again for the article. It is much needed in today's times, when we are trying to teach A.I. about human culture. We must ensure we teach it only the good things, so that it doesn't inherit human flaws.