ARTIFICIAL intelligence is “shockingly” racist and sexist, a study has revealed.
In one example, the team from the Massachusetts Institute of Technology looked at an income-prediction system and discovered it was twice as likely to misclassify female employees as low-income and male employees as high-income.
However, the team was able to adjust the system to make sure it was less biased.
When researchers increased the size of the dataset by a factor of 10, they found the number of such mistakes fell by 40 per cent.
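To make that kind of disparity concrete, here is a minimal sketch, not the MIT team's actual code, of how an engineer might audit a classifier for this sort of bias: compute the misclassification rate separately for each group and compare them. The data below is invented purely for illustration.

```python
# Minimal bias audit sketch (illustrative only, not the MIT researchers' method):
# compare misclassification rates across groups to spot disparities.
from collections import defaultdict

def error_rates_by_group(groups, y_true, y_pred):
    """Return the misclassification rate for each group label."""
    errors = defaultdict(int)
    totals = defaultdict(int)
    for g, true_label, pred_label in zip(groups, y_true, y_pred):
        totals[g] += 1
        if true_label != pred_label:
            errors[g] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Toy predictions from a hypothetical income classifier.
groups = ["F", "F", "F", "F", "M", "M", "M", "M"]
y_true = ["high", "high", "low", "low", "high", "high", "low", "low"]
y_pred = ["low", "high", "high", "low", "high", "high", "low", "high"]

print(error_rates_by_group(groups, y_true, y_pred))
# → {'F': 0.5, 'M': 0.25}  (group F is misclassified twice as often)
```

In this toy run the classifier gets half of the "F" examples wrong but only a quarter of the "M" examples, the sort of two-to-one gap the study describes; a real audit would use far more data and also break errors down by direction (low predicted as high versus high predicted as low).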
Irene Chen, a PhD student who wrote the paper with MIT professor David Sontag and postdoctoral associate Fredrik D. Johansson, said it comes down to using better data.
She said: “Computer scientists are often quick to say that the way to make these systems less biased is to simply design better algorithms.
“But algorithms are only as good as the data they’re using, and our research shows that you can often make a bigger difference with better data.”
In another example, researchers found that an AI system’s predictions of intensive care unit (ICU) mortality were inaccurate for Asian patients.
They warned that using existing methods to fix the system would make the predictions less accurate for non-Asian patients.
Typically, researchers would simply add more data to the system, but Chen said the quality of that data matters just as much.
Instead, researchers should gather more data from under-represented groups.
Sontag said: “We view this as a toolbox for helping machine learning engineers figure out what questions to ask of their data in order to diagnose why their systems may be making unfair predictions.”
The research team will present their paper in December at the Neural Information Processing Systems (NIPS) conference in Montreal.