Some experts say the data sets used to train AI are drawn from information that is biased against women.

They worry that artificial intelligence (AI) could develop a gender gap of its own if more women don’t help build the technology and vet the data sets behind it.
Dr. Georgianna Shea, chief technologist at the Foundation for Defense of Democracies’ Center on Cyber and Technology Innovation (CCTI), told Fox News Digital, “It’s not just AI, but engineering as a whole. You don’t want to end up with biased engineers in any kind of engineering process.”
Melinda French Gates, co-chair of the Bill & Melinda Gates Foundation, said in a recent interview that she worried about potential biases in AI platforms because too few women work in the field.
Shea said the problem is twofold: the field needs more women to help guide the development of AI platforms, and the data sets used to train AI systems are already skewed.
“There’s another thing about the data you put into those AIs,” Shea said. “You want to make sure you get the data about women, and that women know how much data represents them.”
Shea pointed to nursing, a field in which about 86% of workers are women. An AI trained on that data could weight its conclusions about the nursing field and industry toward women, putting male workers at a disadvantage when they turn to an AI platform for relevant information.
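To make that mechanism concrete, here is a minimal sketch in Python of how an 86/14 imbalance like the one in nursing can skew a model’s usefulness. Every number and feature below is synthetic and purely illustrative; nothing here comes from Shea or from any real workforce data.

```python
"""Toy illustration: class imbalance makes a model serve the minority
group worse, even when overall accuracy looks healthy."""
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic "worker profiles": 860 women, 140 men (the 86/14 split).
n_f, n_m = 860, 140
X = np.vstack([
    rng.normal(0.0, 1.0, size=(n_f, 3)),  # profiles labeled female
    rng.normal(0.5, 1.0, size=(n_m, 3)),  # profiles labeled male
])
y = np.array([1] * n_f + [0] * n_m)  # 1 = female, 0 = male

clf = LogisticRegression().fit(X, y)
pred = clf.predict(X)

# Overall accuracy is propped up by the majority group; the
# underrepresented group typically fares much worse.
print("overall accuracy:", (pred == y).mean())
print("accuracy for women:", (pred[y == 1] == 1).mean())
print("accuracy for men:  ", (pred[y == 0] == 0).mean())
```

The headline accuracy number looks fine, but it is carried almost entirely by the majority group, which is one way a system trained on lopsided data shortchanges the smaller group without anyone noticing.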
“Men and women have different bodies, so if you try a drug on a group of men, the body mass index might be higher or lower than it would be for a woman… There are just fundamental differences, so the data will show that this is how it turned out for that test group,” she said.
Concerns about how gender bias could affect AI have circulated for years. In 2019, the Stanford Social Innovation Review examined some of the potential problems: its writers argued that gender bias is “pervasive” in institutions that use AI and machine learning to make decisions, with serious consequences for women’s short-term and long-term safety and well-being.
Part of the problem is that all of the data is pooled together without being disaggregated by gender or sex, which, the report says, can “hide important differences” and “hide potential overrepresentation and underrepresentation.”
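A small, entirely hypothetical example shows what disaggregation reveals. The numbers below are invented for illustration; the point is only that a pooled average can look unremarkable while masking both a real group difference and an underrepresented group.

```python
"""Toy drug-trial outcomes: a pooled mean hides a group difference
and the underrepresentation that produced it."""
import statistics

# Hypothetical response scores for an imbalanced test group.
men = [72, 75, 78, 74, 76, 73, 77, 75]  # 8 male participants
women = [61, 63, 60]                     # only 3 female participants

pooled = men + women

print("pooled mean:", statistics.mean(pooled))    # ~71 — looks fine
print("mean (men):", statistics.mean(men))        # 75
print("mean (women):", statistics.mean(women))    # ~61 — very different
print("share of women:", len(women) / len(pooled))  # ~27% — underrepresented
```

Reported only in aggregate, the trial looks like a single result of about 71; split by sex, it is two quite different results, one of them resting on just three participants.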
According to data compiled by Zippia.com, women made up about 28% of the tech industry workforce as of 2022, and roughly 34.4% of employees at the largest U.S. tech companies are women.
DataProt says that only 15% of engineering jobs are held by women and that women leave the tech field at a rate 45% higher than men.
Shea compared skewed data sets to military equipment, such as tanks and similar vehicles, that was designed around male bodies because only men could serve in combat roles until the military changed its rules about a decade ago. When those positions were fully opened to women in 2015, the military and its engineers had to begin modifying the vehicles to meet minimum height and weight safety standards.
Shea said the best way to keep an AI tool from running into this problem is to think through the context in which it will be used.
“You have to know why we’re making this system. Why are we doing this? Who will be affected? What kind of advice do we need that won’t include those societal biases and data biases that might be there?” she asked.
“So you have to figure out what that is and include it in the process while taking gender out of the equation,” she said.