Banned: The 1,170 words you can't use with GitHub Copilot
Hash cracking reveals verboten slurs, terms like 'liberal,' 'Palestine,' and 'socialist' ... and Quake's famous Fast InvSqrt
GitHub's Copilot comes with a coded list of 1,170 words to prevent the AI programming assistant from responding to input, or generating output, with offensive terms, while also keeping users safe from words like "Israel," "Palestine," "communist," "liberal," and "socialist," according to new research.…
from The Register
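For context on the method the subhead alludes to: the word list ships inside Copilot not as plain text but as hash values, and the research recovered the words by hashing candidate strings until the values matched. Below is a minimal sketch of that kind of dictionary attack. FNV-1a stands in for whatever hash scheme Copilot actually uses (the excerpt doesn't specify it), and the target values are derived in place just to keep the demo self-contained; in a real crack they would be constants extracted from the client code.

```c
#include <stdint.h>
#include <stdio.h>

/* FNV-1a: a placeholder 32-bit hash, not necessarily Copilot's scheme. */
static uint32_t fnv1a(const char *s)
{
    uint32_t h = 2166136261u;
    while (*s) {
        h ^= (uint8_t)*s++;
        h *= 16777619u;
    }
    return h;
}

int main(void)
{
    /* Hypothetical banned-word hashes. In practice these would be
       opaque values lifted from the shipped word list. */
    const uint32_t targets[] = { fnv1a("socialist"), fnv1a("palestine") };

    /* Candidate dictionary: hash each word and compare against targets. */
    const char *dictionary[] = { "quake", "socialist", "liberal", "palestine" };

    for (size_t i = 0; i < sizeof dictionary / sizeof *dictionary; i++)
        for (size_t j = 0; j < sizeof targets / sizeof *targets; j++)
            if (fnv1a(dictionary[i]) == targets[j])
                printf("cracked: %s (0x%08x)\n",
                       dictionary[i], (unsigned)targets[j]);
    return 0;
}
```

With a fast non-cryptographic hash like this, exhausting a large wordlist takes fractions of a second, which is presumably why the encoded list offered little real obfuscation.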