AI News

Unfair Automated Hiring Systems Are Everywhere

admin
Last updated: 2023/05/15 at 3:33 PM

Earlier this month, Lina Khan, chair of the US Federal Trade Commission (FTC), wrote an essay in The New York Times affirming the agency’s commitment to regulating AI. But there was one AI application Khan didn’t mention that the FTC urgently needs to regulate: automated hiring systems. These range in complexity from tools that merely parse resumes and rank them to systems that green-light candidates and trash applicants deemed unfit. Increasingly, working Americans are obligated to use them if they want to get hired.

In my recent book, The Quantified Worker, I argue that the American worker is being reduced to numbers by AI technologies in the workplace, automated hiring systems chief among them. These systems reduce applicants to a score or rank, often ignoring the gestalt of their human experience. Sometimes they even sort people by their race, age, and sex, a practice that’s legally prohibited from being part of the employment decisionmaking process.

Ironically, many of these systems are marketed as bias-free or as guaranteed to reduce the probability of discriminatory hiring. But because they are so loosely regulated, such systems have been shown to deny equal employment opportunity on the basis of protected categories such as race, age, sex, and disability. In December 2022, for example, a female truckers union sued Meta, alleging that Facebook “selectively shows job advertisements based on users’ gender and age, with older workers far less likely to see ads and women far less likely to see ads for blue-collar positions, especially in industries that historically exclude women.” Such marketing is deceptive. Worse, it is unfair to job applicants and employers alike. Employers purchase automated hiring systems to reduce their liability for employment discrimination, and the vendors of those systems are legally obligated to substantiate their claims of efficacy and fairness.

The law puts automated hiring systems under the FTC’s purview, but the agency has yet to release specific guidelines on how purveyors of these systems ought to advertise their wares. It should start by requiring auditing to ensure that automated hiring platforms are fulfilling the promises they make to employers. The vendors of these platforms should be obligated to provide clear records of audits demonstrating that their systems reduce bias in employment decisionmaking as advertised. These audits should be able to show that the designers followed Equal Employment Opportunity Commission (EEOC) guidelines when creating the platforms.
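One concrete check such an audit could include is the EEOC's "four-fifths rule" from the Uniform Guidelines on Employee Selection Procedures: if any protected group's selection rate falls below 80 percent of the highest group's rate, that is evidence of adverse impact. The sketch below illustrates the arithmetic with hypothetical audit data; the group names and numbers are invented for illustration.

```python
# Hypothetical sketch of one audit check: the EEOC's "four-fifths rule."
# A selection rate for any group below 80% of the highest group's rate
# is evidence of adverse impact. Data here is invented for illustration.

def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of applicants the automated system advanced."""
    return selected / applicants

def adverse_impact(rates: dict) -> dict:
    """Return groups whose rate is below 4/5 of the best group's rate."""
    best = max(rates.values())
    return {group: rate / best for group, rate in rates.items()
            if rate / best < 0.8}

# Hypothetical audit data: applicants per group vs. candidates green-lit.
rates = {
    "group_a": selection_rate(selected=60, applicants=100),  # 0.60
    "group_b": selection_rate(selected=30, applicants=100),  # 0.30
}

flagged = adverse_impact(rates)
# group_b's ratio is 0.30 / 0.60 = 0.5, below the 0.8 threshold,
# so an audit applying this rule would flag the system for review.
```

An audit record demonstrating EEOC compliance would need checks like this run per protected category, on real applicant data, with the results disclosed to the employer purchasing the system.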


Also, in collaboration with the EEOC, the FTC could establish the Fair Automated Hiring Mark, which would be used to certify that automated hiring systems have passed the rigorous auditing process. As an imprimatur, the mark would be a useful signal of quality to consumers—both applicants and employers.

The FTC should also allow job applicants, who are consumers of AI-enabled online application systems, to sue under the Fair Credit Reporting Act (FCRA). Previously, the FCRA was thought to apply only to the Big Three credit agencies, but a close reading shows that the law can apply whenever a report has been created for any “economic decision.” By this definition, applicant profiles created by online automated hiring platforms are “consumer reports,” which means that the entities generating them (such as online hiring platforms) would be considered credit reporting agencies. Under the FCRA, anyone who is the subject of such a report can petition the agency that made it to see the results and demand corrections or amendments. Most consumers do not know they have these rights, so the FTC should launch an education campaign to inform applicants of them.

