How can bias be detected in machine learning?

Many machine learning algorithms learn the desired results from data provided by data scientists. Whether it’s flagging a supply chain anomaly, detecting a deer crossing the road in an image or generating the correct response to a ChatGPT question, understanding the dataset used to train the model is essential to understanding what biases it might have. (Also read: Fairness in Machine Learning: Eliminating Data Bias.)
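As a minimal illustration of auditing a training set before any model is built, the sketch below counts how each group is represented along a sensitive attribute. The row format, the `gender` column and the 10% threshold are placeholders invented for this example, not part of any particular tool.

```python
from collections import Counter

def audit_representation(rows, attribute, threshold=0.10):
    """Flag groups that are underrepresented in a training set.

    rows: list of dicts, one per training example (hypothetical format)
    attribute: the sensitive attribute to audit, e.g. "gender"
    threshold: minimum share of examples a group should hold
    """
    counts = Counter(row[attribute] for row in rows)
    total = sum(counts.values())
    for group, count in counts.items():
        share = count / total
        status = "UNDERREPRESENTED" if share < threshold else "ok"
        print(f"{attribute}={group}: {count}/{total} ({share:.1%}) {status}")

# Toy data: a pilots dataset heavily skewed toward one group
training_rows = [{"gender": "male"}] * 95 + [{"gender": "female"}] * 5
audit_representation(training_rows, "gender")
```

A model trained on data like this toy set has little chance of behaving fairly on the underrepresented group, no matter how the training itself is done.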

If the machine learning model was trained on a supervised dataset, it means a human validated the labels the algorithm learned from. If that human was biased, the results will be skewed accordingly.
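One practical way to check for this kind of annotator bias is to have two people label the same examples and measure their agreement, corrected for chance. The sketch below uses Cohen’s kappa; the label names and toy data are made up for illustration.

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Agreement between two annotators, corrected for chance.

    A low kappa on the same examples signals that at least one
    annotator is labeling inconsistently or with a skew worth auditing.
    """
    assert len(labels_a) == len(labels_b)
    n = len(labels_a)
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Chance agreement, from each annotator's own label distribution.
    freq_a, freq_b = Counter(labels_a), Counter(labels_b)
    expected = sum(
        (freq_a[k] / n) * (freq_b[k] / n)
        for k in set(labels_a) | set(labels_b)
    )
    return (observed - expected) / (1 - expected)

# Two annotators labeling the same 10 examples (toy data)
a = ["spam", "spam", "ham", "ham", "spam", "ham", "ham", "spam", "ham", "ham"]
b = ["spam", "ham",  "ham", "ham", "spam", "ham", "spam", "spam", "ham", "ham"]
print(f"kappa = {cohens_kappa(a, b):.2f}")  # 1.0 = perfect, 0 = chance
```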

There are many different types of AI bias, and several can exist within the same model, which makes detection that much harder. For example, latent bias arises when a model learns correlations in a dataset that mirror a stereotype in human society, such as assuming all fighter jet pilots are male.

Regardless, there is a way to detect biases in machine learning systems. The method involves feeding the system a wide range of inputs, including extreme edge cases, to push the model into a corner. If the model starts answering repetitively or simply heading in the wrong direction, push harder to see if you can provoke extreme negative results. Luckily, Google has created a tool to do just that. It is called the What-If Tool, and it promises to “test performance in hypothetical situations, analyze the importance of different data features, and visualize model behavior across models.”
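To make the probing idea concrete, here is a minimal sketch of a counterfactual test: hold every feature fixed, sweep only a sensitive attribute, and flag large swings in the model’s output. The `predict_proba` interface follows scikit-learn conventions, and the feature encoding and tolerance are stand-ins chosen for the example; this illustrates the technique, not the What-If Tool’s actual API.

```python
import numpy as np

def counterfactual_sweep(model, example, attr_index, attr_values, tolerance=0.05):
    """Probe a binary classifier by varying one sensitive feature.

    model: any object with a scikit-learn style predict_proba(X)
    example: 1-D feature vector for a single individual
    attr_index: position of the sensitive attribute in the vector
    attr_values: encoded values to sweep (e.g. 0 = male, 1 = female)
    tolerance: maximum acceptable swing in predicted probability
    """
    scores = []
    for value in attr_values:
        probe = np.array(example, dtype=float)
        probe[attr_index] = value  # change only the sensitive attribute
        score = model.predict_proba(probe.reshape(1, -1))[0, 1]
        scores.append(score)
        print(f"attribute={value}: P(positive)={score:.3f}")
    swing = max(scores) - min(scores)
    if swing > tolerance:
        print(f"WARNING: output swings by {swing:.3f} when only the "
              f"sensitive attribute changes; investigate for bias.")
    return swing
```

Running this sweep across many individuals, including edge cases far from the training distribution, approximates the push-the-model-into-a-corner strategy described above.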

Removing bias is similar to teaching a child never to touch the stove: let the mistake happen once, teach the model through experience that the behavior was bad, and then teach the machine never to complete that type of task again. If you only scold the end result, both the child and the machine might approach the stove sometime in the future just to see what happens. (Also read: Prompt Learning: A New Way to Train Foundation Models in AI.)
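One way to read “teach the model through experience” in code is to penalize the model during training whenever its behavior differs across groups. The sketch below is a generic illustration of that idea, not a method from the article: the demographic-parity penalty and the `lambda_fair` weight are assumptions chosen for the example.

```python
import torch

def fairness_penalized_loss(logits, labels, groups, lambda_fair=1.0):
    """Cross-entropy plus a demographic-parity penalty (illustrative).

    logits: (N, 2) raw model outputs for a batch
    labels: (N,) ground-truth class ids
    groups: (N,) bool tensor, True for the protected group
            (each batch should contain members of both groups)
    lambda_fair: weight on the fairness penalty (assumed hyperparameter)
    """
    base = torch.nn.functional.cross_entropy(logits, labels)
    probs = torch.softmax(logits, dim=1)[:, 1]  # P(positive) per example
    # Penalize any gap in average positive rate between the two groups.
    gap = (probs[groups].mean() - probs[~groups].mean()).abs()
    return base + lambda_fair * gap
```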
