Robots Doing the Hiring

Many companies now use AI for hiring. Often, software performs the first pass over applications, eliminating candidates who lack the necessary credentials or flagging those who resemble successful hires from the past.

Machine learning is often part of the process: algorithms comb through data on past hires to find the characteristics that most often predict a successful new employee. This has led to problems. One famous example is Amazon's experimental hiring algorithm, which blatantly discriminated against women; it had been trained on a decade of résumés submitted mostly by men.
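To see how a model can pick up bias that nobody wrote into it, consider a deliberately tiny sketch in Python with scikit-learn. All of the data here is invented for illustration: a classifier trained on past hiring decisions will latch onto any résumé token that correlates with those decisions, including a proxy for gender.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Toy, made-up "historical" data: past resumes and whether each person was hired.
# Because the biased past outcomes reject every resume mentioning "women's",
# the model learns that token as a negative signal.
resumes = [
    "captain of chess club, python developer",
    "python developer, hackathon winner",
    "java developer, robotics team",
    "captain of women's chess club, python developer",
    "women's soccer team, java developer",
    "women's robotics club, hackathon winner",
]
hired = [1, 1, 1, 0, 0, 0]  # biased past decisions the model will imitate

vec = CountVectorizer()
X = vec.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# Inspect the learned weights: the token "women" gets a strongly negative
# coefficient, even though it says nothing about ability to do the job.
weights = dict(zip(vec.get_feature_names_out(), model.coef_[0]))
print(sorted(weights.items(), key=lambda kv: kv[1])[:3])
```

Nothing in this sketch mentions gender explicitly; the bias comes entirely from the historical labels, which is exactly how it tends to happen in practice.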

The truth is, robots don’t have emotions or thoughts of their own, but they can learn bias from many sources, above all the data they are trained on.

The Equal Employment Opportunity Commission

The Equal Employment Opportunity Commission (EEOC) is aware of this problem. It recently issued guidance on how employers can use this software without violating the Americans with Disabilities Act (ADA).

The ADA requires companies with 15 or more employees to provide equal employment opportunities for people with disabilities. The EEOC identified the three most common ways that AI hiring tools violate the ADA:

  • They fail to provide “reasonable accommodation” to job applicants. For example, a tool that requires candidates to take an online test that is inaccessible to assistive reading technology because of how it is coded would be discriminatory. A candidate must be able to request a reasonable accommodation in a case like this, and the employer must respond to that request.
  • They automatically screen out people with disabilities who could do the job with a reasonable accommodation. This is analogous to the Amazon algorithm’s habit of screening out candidates whose résumés mentioned a women’s sports team. Any question that could be used to eliminate people with disabilities could expose an employer to this kind of claim.
  • They ask medical questions that are prohibited by law. It is permissible to ask for medical information when someone requests an accommodation, but it is illegal to use a tool that asks medical questions before such a request has been made and before any conditional job offer has been extended.

Note that a company that uses an outside vendor to do its screening is still responsible for these violations.

Solutions

The EEOC recommends telling applicants clearly that they can request reasonable accommodations, making sure all such requests are handled by humans, and giving candidates as much information about the test as possible in advance so they can anticipate whether they will need an accommodation.

They also suggest choosing tools that are designed to avoid screening out people with disabilities.

Beginning in January 2023, New York City will require employers to commission independent bias audits of their automated hiring tools. Employers must publish the audit results on a public page of their website and are expected to consider whether a tool poses a problem, although the law does not require them to use only tools that pass the audit.
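The law does not prescribe a single formula, but a common core of such audits is the “impact ratio”: each group’s selection rate divided by the rate of the most-selected group. Here is a minimal sketch with invented numbers; the group names and screening results are hypothetical.

```python
from collections import Counter

def impact_ratios(outcomes):
    """outcomes: list of (group, was_selected) pairs from a screening tool.
    Returns each group's selection rate divided by the top group's rate."""
    totals = Counter(group for group, _ in outcomes)
    wins = Counter(group for group, selected in outcomes if selected)
    rates = {g: wins[g] / totals[g] for g in totals}
    top = max(rates.values())
    return {g: rate / top for g, rate in rates.items()}

# Invented numbers: group_a passes screening 60% of the time, group_b 30%.
results = ([("group_a", True)] * 60 + [("group_a", False)] * 40 +
           [("group_b", True)] * 30 + [("group_b", False)] * 70)
print(impact_ratios(results))  # {'group_a': 1.0, 'group_b': 0.5}
# A ratio below the EEOC's traditional four-fifths (0.8) rule of thumb
# is the conventional warning sign of adverse impact.
```

The four-fifths rule is a rough screen rather than a legal verdict, but a tool that produces a 0.5 ratio like the one above would be hard to defend.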

As AI becomes a more common part of daily life, ethical and legal questions are bound to come up, and the question of bias comes up again and again.
