By: Richard R. Meneghello, Sarah J. Moore, and John T. Lai, Fisher & Phillips LLP
This article provides guidance and best practices for counseling employers on the legal implications of integrating artificial intelligence (AI) and robots into their workplaces.
Thanks to recent technological advances, AI algorithms and robots are developing the sophistication to displace human employees, causing many employers to engage in mass layoffs and reductions in force. For instance, Goldman Sachs recently laid off nearly 600 equity traders whose work has largely been supplanted by automated trading programs and a team of computer engineers.1
As employers continue to pursue disruptive technologies like AI and robotics that can reduce workforces, unions and employees will mount legal challenges in an effort to protect their positions. To ensure employers can implement these technologies with minimal repercussions, you should assess their risks and liabilities and help them put together a strategic plan. Consider the following measures to avoid liability from layoffs caused by AI and robotics.
Another way employers may utilize AI is to filter large pools of job applicants. For example, some employers use computer software programs to auto-screen resumes as a human recruiter would. Such programs use machine learning, algorithms, and/or natural language processing to identify the best candidates for employment. Similarly, employers can use an AI-powered recruiting assistant that allows applicants to communicate through messaging apps. One such program uses natural language processing to analyze data an applicant provides and then asks the applicant additional questions to help fill gaps in the applicant’s data. The applicant can also ask the virtual recruiting assistant questions. Other computer programs search social media to find information to fill the gaps in candidates’ profiles and then rank the candidates. Certain employers also have candidates play neuroscience-based computer games and use the results to determine which candidates to interview.
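To make the auto-screening concept concrete, the sketch below ranks résumés by keyword overlap with a list of required skills. This is a deliberately simplified illustration, not any vendor's actual system: the candidate names, skill list, and function names are hypothetical, and commercial tools layer far more sophisticated machine learning and natural language processing on top of this basic idea.

```python
import re
from collections import Counter

def tokenize(text):
    """Lowercase a document and split it into word tokens."""
    return re.findall(r"[a-z]+", text.lower())

def score_resume(resume_text, required_skills):
    """Count how many times the required skills appear in a resume."""
    counts = Counter(tokenize(resume_text))
    return sum(counts[skill] for skill in required_skills)

def rank_candidates(resumes, required_skills):
    """Return (name, score) pairs, best match first."""
    scored = [(name, score_resume(text, required_skills))
              for name, text in resumes.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# Hypothetical data for illustration only
resumes = {
    "Candidate A": "Experienced Python developer with machine learning background.",
    "Candidate B": "Sales professional skilled in negotiation and outreach.",
}
ranking = rank_candidates(resumes, ["python", "machine", "learning"])
```

Even this toy version shows where bias can creep in: whoever chooses the keyword list effectively chooses who surfaces at the top, which is why the screening criteria themselves deserve legal review.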
Some employers even use AI for conducting interviews. For instance, an employer might ask a candidate to record answers to interview questions, and a computer program would then analyze the interview (utilizing machine learning, algorithms, and/or natural language processing) for key words, the speed of speech, body language, or other relevant predictors of a candidate’s qualifications and future successes. The computer program would generate a report with suggestions that could then be used to determine whether a candidate should move on in the employer’s recruitment process.
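The interview-analysis report described above can be sketched in miniature as follows. The example computes two of the predictors mentioned, speech rate and keyword usage, from a transcript; the function name, phrases, and numbers are hypothetical, and real products analyze audio and video with machine learning rather than simple string counts.

```python
def analyze_interview(transcript, duration_minutes, target_phrases):
    """Produce a toy 'interview report': speech rate plus keyword hits."""
    words = transcript.lower().split()
    return {
        "words_per_minute": len(words) / duration_minutes,
        "keyword_hits": {p: transcript.lower().count(p) for p in target_phrases},
    }

# Hypothetical 15-second answer used for illustration
report = analyze_interview(
    "I led a team that shipped the project on time and under budget",
    duration_minutes=0.25,
    target_phrases=["team", "budget"],
)
```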
While a sophisticated AI screening system may be able to eliminate unqualified candidates, system limitations and inherent biases may lead to employment discrimination lawsuits. Consider taking the steps below to limit exposure resulting from using AI in the screening and hiring process.
Robotics and AI raise novel issues and concerns for employers regarding employee safety. There are currently no Occupational Safety and Health Administration (OSHA) standards specifically for the robotics industry. However, OSHA highlights general standards and directives applicable to employers utilizing robotics.6 OSHA also provides guidelines for robotics safety.7
Under the Occupational Safety and Health Act (OSH Act), a covered employer utilizing robotics—like any other employer the OSH Act covers—must conduct a “hazard assessment” in which it reviews working environments for potential occupational hazards. 29 C.F.R. § 1910.132(d). An employer that identifies a hazard must implement a “hazard control,” selecting from the following hierarchy in descending order of preference: hazard elimination, substitution (replacing the hazard), engineering controls, administrative controls, and, as a last resort, personal protective equipment. With this legal framework as background, consider taking the following actions to mitigate the risk of employee exposure to hazards and legal actions associated with robots:
From the Apple Watch to the Fitbit, wearable technology is becoming increasingly prominent in modern life. In the workplace, using AI to catalog and assess employee data can be a significant boon for employers, which can use AI systems to track worker movements to identify and rectify inefficiencies. Nevertheless, privacy and data security concerns abound when employers utilize such technology.
Consider the following measures to guard against privacy claims:
Data Security Issues
Whenever employers gather data, including via wearable technology, they must consider the risk of data breaches and how to prevent them. As this area of law is continually evolving, ensure that the employer consults with an attorney who is well-versed in cybersecurity issues. You should also determine whether the employer has appropriate safeguards in place to prevent unauthorized intruders from obtaining private, personal employee data. For instance, IT departments should mask data collected so that it cannot be linked to a specific user and should use encryption. Additionally, consider implementing regular audits to ensure the employer’s data security protocols are legally compliant and up-to-date.
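The data-masking safeguard described above can be illustrated with a short pseudonymization sketch: replacing each employee identifier with a keyed hash (HMAC-SHA256) so that records cannot be linked to a named user without the secret key, while records for the same user can still be aggregated. This is a minimal sketch under assumed requirements, not a substitute for an attorney- and security-reviewed program; the key value and record fields are hypothetical, and key management, encryption at rest, and access controls would all need separate attention.

```python
import hashlib
import hmac

def pseudonymize(user_id, secret_key):
    """Replace a user identifier with a keyed hash (HMAC-SHA256).

    Without the secret key, the pseudonym cannot be linked back to the
    employee; the same key always yields the same pseudonym, so one
    user's wearable records can still be grouped together.
    """
    return hmac.new(secret_key, user_id.encode(), hashlib.sha256).hexdigest()

key = b"hypothetical-key-store-and-rotate-securely"
record = {
    "user": pseudonymize("jane.doe@example.com", key),  # no raw identifier stored
    "steps": 8432,
}
```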
The amount of data that parties produce in discovery in today’s employment litigation can be staggering. Compounding this problem, attorneys are expected to review this data efficiently—quickly and at a low cost. The faster and more accurately a lawyer can locate useful information, the better and more cost-effectively the attorney will be able to develop his or her case. Because AI can analyze a larger quantity of information more thoroughly than humans can, and in a fraction of the time, attorneys are increasingly turning to AI as a key component of their legal practices. Consider taking advantage of recent developments in AI in your own practice in the following ways:
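The document-ranking idea behind AI-assisted review can be sketched with a standard TF-IDF relevance score, which surfaces documents whose rare terms match a reviewer's query. The document names and query below are hypothetical, and production e-discovery platforms use trained classifiers and iterative review rather than this bare-bones scoring; the sketch only shows the underlying ranking intuition.

```python
import math
import re
from collections import Counter

def tokenize(text):
    return re.findall(r"[a-z]+", text.lower())

def tfidf_rank(documents, query):
    """Rank documents against a query by TF-IDF weighted term overlap."""
    n = len(documents)
    doc_tokens = {name: Counter(tokenize(text)) for name, text in documents.items()}

    def idf(term):
        # Rare terms weigh more than terms appearing in every document.
        df = sum(1 for tokens in doc_tokens.values() if term in tokens)
        return math.log((n + 1) / (df + 1)) + 1

    scores = {name: sum(tokens[t] * idf(t) for t in tokenize(query))
              for name, tokens in doc_tokens.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical document set for illustration
docs = {
    "email_1": "Meeting about the severance package for laid off staff",
    "email_2": "Lunch order for the office party next Friday",
}
ranking = tfidf_rank(docs, "severance package")
```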
Richard R. Meneghello is the Publications Partner for Fisher Phillips. He develops legal alerts, web articles, newsletter features, and blog posts for the Fisher Phillips website. Rich is also an accomplished litigator. He won a unanimous decision before the U.S. Supreme Court in Albertson’s, Inc. v. Kirkingburg, an Americans with Disabilities Act case, as well as cases for clients at the Ninth Circuit Court of Appeals, the Oregon Supreme Court, and the Oregon Court of Appeals, along with trial victories in both state and federal courts. Sarah Moore is a partner at Fisher Phillips, in its Cleveland office. She enjoys a robust practice that crosses industries in the private and public sectors and routinely incorporates the insights and best practices from this diversity of experience into her work. Sarah thrives on handling highly sensitive and challenging issues and regularly works hand-in-hand with her clients addressing the full spectrum of labor and employment concerns. John T. Lai is an associate in the firm’s Irvine office. He practices in all areas of labor and employment law. John has experience in intellectual property matters, unfair competition, and complex litigation.
To find this article in Lexis Practice Advisor, follow this research path:
RESEARCH PATH: Labor & Employment > Investigations, Discipline, and Terminations > Discharge and Layoffs/RIFs > Practice Notes
For more information on voluntary separation programs and alternatives to reductions in force (RIFS), see
> ALTERNATIVES TO REDUCTIONS IN FORCE (RIFS)
RESEARCH PATH: Employee Benefits & Executive Compensation > Employment, Independent Contractor, and Severance Agreements > Executive Separation Agreements & Severance Plans > Practice Notes
For a discussion of drafting separation agreements, see
> SEPARATION AGREEMENTS: DRAFTING AND NEGOTIATION TIPS (PRO-EMPLOYER)
RESEARCH PATH: Labor & Employment > Discrimination and Retaliation > Claims and Investigations > Practice Notes
For information on state laws concerning RIFs, see
> THE MASS LAYOFF AND PLANT CLOSING LAWS COLUMN IN INVESTIGATIONS, DISCIPLINE, AND TERMINATIONS STATE PRACTICE NOTES CHART
For an overview of requirements under the Worker Adjustment and Retraining Notification Act (WARN Act), see
> WARN ACT COMPLIANCE CHECKLIST
RESEARCH PATH: Labor & Employment > Investigations, Discipline and Terminations > Discharge and Layoffs/RIFs > Checklists
For best practices on drafting policies concerning employee privacy when using electronic devices, including a sample policy, see
> CREATING POLICIES ON COMPUTERS, MOBILE PHONES, AND OTHER ELECTRONIC DEVICES
RESEARCH PATH: Labor & Employment > Employment Policies > Company Property and Electronic Information > Practice Notes
For more information on the risks of wearable technology, see
> UNDERSTANDING EMPLOYMENT PRIVACY ISSUES UNDER FEDERAL LAW
RESEARCH PATH: Labor & Employment > Privacy, Technology and Social Media > Monitoring and Testing Employees > Practice Notes
1. See Nanette Byrnes, As Goldman Embraces Automation, Even the Masters of the Universe Are Threatened, MIT TECHNOLOGY REVIEW (Feb. 7, 2017).
2. See, e.g., Renton News Record, 136 N.L.R.B. 1294, 1297–98 (1962); NLRB v. Columbia Tribune Publ’g Co., 495 F.2d 1384, 1391 (8th Cir. 1974); Newspaper Printing Corp. v. NLRB, 625 F.2d 956, 964 (10th Cir. 1980).
3. See EXECUTIVE OFFICE OF THE PRESIDENT, Big Data: A Report on Algorithmic Systems, Opportunity, and Civil Rights (May 2016), p. 14; Roger W. Reinsch & Sonia Goltz, The Law and Business of People Analytics: Big Data: Can the Attempt to be More Discriminating be More Discriminatory Instead?, 61 St. Louis L.J. 35, 40–42 (2016).
4. See Pauline T. Kim, Data-Driven Discrimination at Work, 58 Wm. & Mary L. Rev. 857, 863, 873 (2017); Sofia Grafanaki, Autonomy Challenges in the Age of Big Data, 27 Fordham Intell. Prop. Media & Ent. L.J. 803, 826 (2017); Solon Barocas & Andrew D. Selbst, Big Data’s Disparate Impact, 104 Calif. L. Rev. 671, 682, 689, 722 (2016); Federal Trade Commission, Big Data: A Tool for Inclusion or Exclusion? (January 2016), p. v.
5. See Anupam Chander, Reviews: The Racist Algorithm?, 115 Mich. L. Rev. 1023, 1029 (2017).
6. See Robotics, Standards, Occupational Safety and Health Administration, Safety and Health Topics.
7. See Guidelines for Robotics Safety, OSHA Instruction STD 01-12-002 (1987).
8. See, e.g., Elgin v. St. Louis Coca-Cola Bottling Co., 2005 U.S. Dist. LEXIS 28976, at *7–11 (E.D. Mo. 2005); Gerardi v. City of Bridgeport, 2007 Conn. Super. LEXIS 3446, at *17–20 (Super. Ct. Dec. 31, 2007).