Model collapse. AI is eating the data, then deciding for us
We are filling the web with synthetic text, yet we expect AI models to stay close to reality. Stay with me until the end, because this hits jobs and money. “Model collapse” is what happens when a model is trained again and again on AI-generated data, or on data polluted with it. It loses rare details, edge cases, nuance. What remains looks clean and confident, but it is averaged and flat. A 2024 Nature paper shows this decay can compound over generations of training.
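To make the mechanism concrete, here is a minimal toy sketch of my own (not the setup from the Nature paper): treat a “model” as nothing more than the frequency table of its training data, and let each generation train only on samples generated by the previous one. The vocabulary, token names, and sample size below are all made up for illustration.

```python
import random
from collections import Counter

random.seed(42)

# Toy illustration of model collapse (a sketch, not the paper's LLM setup):
# the "model" is just the empirical frequency table of its training data.
# Each generation trains on a finite sample drawn from the previous model,
# so any token that happens to draw zero counts vanishes forever.
# Rare tokens go first: that is the tail loss described above.

# Generation 0: "real" data with a long tail of rare tokens (Zipf-like).
vocab = [f"tok{i}" for i in range(50)]          # hypothetical vocabulary
weights = [1.0 / (i + 1) for i in range(50)]    # many rare tokens
SAMPLE_SIZE = 500

data = random.choices(vocab, weights=weights, k=SAMPLE_SIZE)

for gen in range(20):
    freqs = Counter(data)
    print(f"gen {gen:2d}: distinct tokens = {len(freqs)}")
    # "Train" = estimate frequencies; "generate" = sample from the estimate.
    tokens = list(freqs)
    counts = [freqs[t] for t in tokens]
    data = random.choices(tokens, weights=counts, k=SAMPLE_SIZE)
```

Run it and the count of distinct tokens only falls: once a rare token misses a single sample, no later generation can ever bring it back. That one-way loss of the tail is the compounding decay the paper measures.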
We already pay for it in real decisions: hiring filters, shortlists, screening. Here in the United States, Reuters reported that Amazon scrapped an internal recruiting tool because it systematically disadvantaged women; biased historical data had hardened into automatic rules.
Now the loop is closing. AI-written resumes, AI-written job posts, AI screening. People write to please an algorithm, and companies select with another algorithm. Reality drops out of the process.
Security is no longer only firewalls. It is decision integrity: logs, traceability, real human oversight. A Thomson Reuters Institute report says 91% of C-suite leaders already use GenAI or plan to adopt it within 18 months. If we ignore model collapse, we accept decisions that are more automatic and harder to audit.
#ArtificialDecisions #MCC #AD
✅ This video is brought to you by: https://www.ethicsprofile.ai
👉 Important note: We’re planning our schedule for the coming months.
If you’d like to request my presence as a speaker at your event, please contact my team at: management@camisanicalzolari.com
