Fake Data Could Help Solve Machine Learning’s Bias Problem — if We Let It
By now, we should know that when we let artificial intelligence solve problems, it will reproduce (and often amplify) whatever biases were in the data we trained it on. That's why A.I. thinks women are less qualified for certain jobs and Black people are likely criminals. So what could possibly go wrong if, instead of (biased) real data, we trained A.I. on "synthetic data" — created by A.I., for A.I.?
From Weekly Filet #318, in September 2020.
💔 Some older links might be broken — that's the state of the web, sadly. If you find one, ping me.