Fake Data Could Help Solve Machine Learning’s Bias Problem — if We Let It

By now, we should know that when we let artificial intelligence solve problems, it reproduces (and often amplifies) whatever biases were in the data we trained it with. That's why A.I. thinks women are less qualified for certain jobs and Black people are likely criminals. So what could possibly go wrong if we replaced (biased) real data and instead trained A.I. with "synthetic data", created by A.I., for A.I.?


From Weekly Filet #318, in September 2020.
