Published 15:48 IST, July 15th 2019

Artificial Intelligence Could Be Used To Find Out If There Is Discrimination At Various Levels

Researchers have developed a new artificial intelligence (AI) tool for detecting unfair discrimination on the basis of attributes such as race or gender.


Researchers have developed a new artificial intelligence (AI) tool for detecting unfair discrimination on the basis of attributes such as race or gender. Preventing unfair treatment of individuals on the basis of race, gender or ethnicity, for example, has been a long-standing concern of civilized societies. However, detecting such discrimination resulting from decisions, whether by human decision-makers or automated AI systems, can be extremely challenging.

This challenge is further exacerbated by the wide adoption of AI systems to automate decisions in many domains, including policing, consumer finance, higher education, and business.


"Artificial intelligence systems such as those involved in selecting candidates for a job or for mission to a university are trained on large amounts of data. But if se data are biased, y can affect recommendations of AI systems," said Vasant Honavar, one of researchers of study presented at meeting of Web Conference.

For example, he said, if a company historically has never hired a woman for a particular type of job, then an AI system trained on this historical data will not recommend a woman for a new job. "There's nothing wrong with the machine learning algorithm itself. It's doing what it's supposed to do, which is to identify good job candidates based on certain desirable characteristics. But since it was trained on historical, biased data it has the potential to make unfair recommendations," said Honavar.
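As a rough illustration of Honavar's point, and not a description of the team's system, the short sketch below trains an ordinary classifier on a deliberately skewed, invented hiring history; the model then carries that bias over to new candidates who differ only in gender.

```python
# Minimal sketch of how a model trained on historically biased hiring data
# reproduces that bias. All data here is hypothetical and deliberately skewed;
# this is an illustration, not the researchers' system.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Features: [qualification_score, is_woman]; label: 1 = hired.
# In this toy history, equally qualified women were never hired.
X = np.array([[8, 0], [7, 0], [9, 0], [8, 1], [7, 1], [9, 1]])
y = np.array([1, 1, 1, 0, 0, 0])

model = LogisticRegression().fit(X, y)

# Two new candidates with identical qualifications, differing only by gender:
print(model.predict([[8, 0], [8, 1]]))  # typically [1, 0] -- the historical bias carries over
```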


The team created an AI tool for detecting discrimination with respect to a protected attribute, such as race or gender, by human decision-makers or AI systems. The tool is based on the concept of causality, in which one thing (a cause) causes another thing (an effect). "For example, the question, 'Is there gender-based discrimination in salaries?' can be reframed as, 'Does gender have a causal effect on salary?' or, in other words, 'Would a woman be paid more if she were a man?'" said Aria Khademi, one of the researchers of the study.

Since it is not possible to directly know the answer to such a hypothetical question, the team's tool uses sophisticated counterfactual inference algorithms to arrive at a best guess. "For instance, one intuitive way of arriving at a best guess as to what a fair salary would be for a female employee is to find a male employee who is similar to the woman with respect to qualifications, productivity, and experience. We can minimize gender-based discrimination in salary if we ensure that similar men and women receive similar salaries," said Khademi.
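The matching intuition Khademi describes can be sketched in a few lines, assuming a pandas/scikit-learn workflow; the employee records and column names below are hypothetical, and simple nearest-neighbour matching stands in for the team's more sophisticated counterfactual-inference algorithms.

```python
# A minimal sketch of counterfactual matching: for each woman, find the most
# similar man on job-relevant covariates and compare salaries. Records and
# columns are hypothetical; this is not the team's actual algorithm.
import pandas as pd
from sklearn.neighbors import NearestNeighbors
from sklearn.preprocessing import StandardScaler

df = pd.DataFrame({
    "qualification": [3, 4, 4, 2, 5, 3],
    "productivity":  [70, 85, 80, 60, 90, 75],
    "experience":    [5, 8, 7, 3, 10, 6],
    "gender":        ["F", "M", "M", "F", "M", "F"],
    "salary":        [52000, 68000, 64000, 45000, 80000, 55000],
})

covariates = ["qualification", "productivity", "experience"]
scaler = StandardScaler().fit(df[covariates])
women = df[df["gender"] == "F"]
men = df[df["gender"] == "M"]

# Match each woman to her nearest male counterpart on standardized covariates;
# a large systematic salary gap among otherwise comparable employees is the
# kind of signal a counterfactual analysis looks for.
nn = NearestNeighbors(n_neighbors=1).fit(scaler.transform(men[covariates]))
_, idx = nn.kneighbors(scaler.transform(women[covariates]))
matched_male_salary = men.iloc[idx.ravel()]["salary"].to_numpy()
gap = matched_male_salary - women["salary"].to_numpy()
print("Mean salary gap vs. matched male counterpart:", gap.mean())
```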


The researchers tested their method using various types of available data, such as income data, to determine whether there is gender-based discrimination in salaries.

"We analysed an ult income data set containing salary, demographic and employment-related information for close to 50,000 individuals. We found evidence of -based discrimination in salary. Specifically, we found that odds of a woman having a salary greater than $50,000 per year is only one-third that for a man. This would suggest that employers should look for and correct, when appropriate, bias in salaries," said Honavar. Although team's analysis of dataset revealed evidence of possible racial bias against Hispanics and African American individuals, it found evidence of discrimination against m on aver as a group.


"You cant correct a problem if you don't kw that problem exists. To avoid discrimination on basis of race, or or attributes you need effective tools for detecting discrimination. Our tool can help with that," said Honavar. Honavar ded that as data-driven artificial intelligence systems increasingly determine how businesses target vertisements to consumers, how police departments monitor individuals or groups for criminal activity, how banks decide who gets a loan, who employers decide to hire, and how colleges and universities decide who gets mitted or receives financial aid, re is an urgent need for tools such as one he and his colleagues developed."Our tool," he said, "can help ensure that such systems do t become instruments of discrimination, barriers to equality, threats to social justice and sources of unfairness. 

Also Read: Artificial Intelligence In Sports: AI Made Fairer For Wimbledon Courts

