In a notable action, the Consumer Financial Protection Bureau (CFPB) has announced new guidance to protect workers from unchecked digital tracking and opaque decision-making systems.
The guidance warns that companies using third-party consumer reports – including background dossiers and surveillance-based, “black box” AI or algorithmic scores about their workers – must follow Fair Credit Reporting Act (FCRA) rules. FCRA protections extend to companies that assemble detailed dossiers about consumers and sell this information to those making employment decisions.
In practice, this means employers must obtain worker consent, provide transparency about the data used in adverse decisions, and allow workers to dispute inaccurate information. As companies increasingly deploy invasive tools to assess workers, the guidance ensures workers have rights over the data influencing their livelihoods and careers.
“Workers shouldn’t be subject to unchecked surveillance or have their careers determined by opaque third-party reports without basic protections,” said CFPB Director Rohit Chopra. “The kind of scoring and profiling we’ve long seen in credit markets is now creeping into employment and other aspects of our lives. Our action today makes clear that longstanding consumer protections apply to these new domains just as they do to traditional credit reports.”
Third-party reports, third-party apps
The CFPB’s guidance addresses the use of third-party consumer reports by employers to make employment decisions about their workers.
These reports increasingly extend beyond traditional background checks and may encompass a wide range of information and assessments about workers. For example, some employers require workers to install apps on their personal phones that monitor their conduct, which may be used to assess their performance.
Currently, such consumer reports may be used to:
- Predict worker behavior: This includes assessing the likelihood of workers engaging in union organizing activities or estimating the probability that a worker will leave their job, potentially influencing management decisions about staff retention and engagement strategies.
- Reassign workers: Automated systems may use data on worker performance, availability, and historical patterns to reassign team members.
- Issue warnings or other disciplinary actions: These consumer reports might flag potential performance issues, leading to automated warnings or recommendations for disciplinary measures (potentially including firing) without direct human oversight.
- Evaluate social media activity: Some reports may include analysis of workers’ social media presence, potentially affecting hiring or other decisions.
The CFPB notes that while background checks have long been a part of employment and hiring practices, the emergence of new technologies has expanded the scope and depth of worker tracking.
“These reports often contain sensitive information unknown to workers, which can significantly impact hiring decisions, job assignments, and career advancement. Inaccurate reports may cause workers to lose job opportunities, face unfair treatment, or suffer career setbacks due to information they did not even know existed, let alone had a chance to dispute,” the CFPB said.
Fair Credit Reporting Act
Congress passed the FCRA in response to concerns about companies that assemble detailed dossiers about consumers and sell this information. In doing so, Congress was particularly cognizant of the impact of so-called “credit reporting” on consumers’ employment.
The law applies both to information used for initially evaluating a consumer for employment and to information used for ongoing employment purposes.
The Act’s protections with respect to consumer reports include:
- Consent: The new CFPB guidance makes clear that when companies provide these reports, the law requires employers to obtain worker consent before purchasing them.
- Transparency: The CFPB circular emphasizes that employers are required to provide detailed information to workers when taking adverse action – including firing, denials of promotions, and demotions or other reassignments – based on the reports.
- Disputes: The CFPB circular makes clear that when a worker disputes what is in a report, companies are required to correct or delete inaccurate, incomplete, or unverifiable information.
- Limits: The guidance makes clear that employers can only use these reports for purposes that are allowed under the law. For example, employers generally cannot sell this information on the open market or use it to market financial products to their workers.
The CFPB says it will be working with other federal agencies and state regulators to ensure the responsible use of worker data, and it encourages employers to review their current practices regarding the use of third-party consumer reports to ensure compliance with FCRA requirements.
Surveillance technology
Last year, the CFPB submitted a comment letter in response to a White House Office of Science and Technology Policy request for information (RFI) regarding the automated tools used by employers to surveil, monitor, evaluate, and manage workers.
As the CFPB explained in its comment letter, the agency was most concerned about how worker surveillance products can augment employers’ decisions about everything from hiring to promotion, reassignment, and retention. Additional concerns included what the CFPB called “the proliferation of the gathering and use of information resulting from worker surveillance technologies, including as a consequence of employers adjusting work schedules and revising workplace norms in the wake of the COVID-19 pandemic.”
The comment letter concluded that using new forms of surveillance technology – like increasingly sophisticated AI tools – does not absolve any business of its obligations under federal consumer financial laws.