During a recent White House Roundtable, following a previous inquiry into the business practices of data brokers, CFPB Director Rohit Chopra announced that the agency will be developing rules to prevent misuse and abuse by data brokers that track, collect and monetize information about people. Many of these firms assemble data to feed “artificial intelligence” that makes decisions about our daily lives, according to the CFPB.
After conducting the aforementioned public inquiry into data brokers and assessing current uses of AI that are often powered by data from the surveillance industry, the CFPB plans to issue proposed rules under the Fair Credit Reporting Act to address business practices used by companies that assemble and monetize consumer data.
On August 15, 2023, the CFPB released remarks of Director Chopra regarding the protection of Americans from data broker practices that the agency believes to be harmful.
In summary, Director Chopra stated that “‘artificial intelligence’ and other predictive decision-making increasingly relies on ingesting massive amounts of data about our daily lives. This creates financial incentives for even more data surveillance. This also has big implications when it comes to critical decisions, like whether or not we will be interviewed for a job or get approved for a bank account or loan. It’s critical that there’s some accountability when it comes to misuse or abuse of our private information and activities.”
Director Chopra made clear that the Consumer Financial Protection Bureau is part of an “all-of-government effort” to tackle the risks associated with AI, and has launched a rulemaking “to ensure that modern-day digital data brokers are not misusing or abusing our sensitive data.”
According to Director Chopra, during the agency’s formal inquiry, the CFPB “learned more about the significant harms – from the identification of victims for financial scams to the facilitation of harassment and fraud. While these firms go by many labels, many of them work to harvest data from multiple sources and then monetize individual data points or profiles about us, sometimes without our knowledge. These data points and profiles might be monetized by sharing them with other companies using AI to make predictions and decisions.”
Of particular concern to the CFPB is that “today’s surveillance firms have modern technology to build even more complex profiles about our searches, our clicks, our payments, and our locations. These detailed dossiers can be exploited by scammers, marketers, and anyone else willing to pay.”
Federal and state lawmakers are increasingly considering efforts to expand personal data protections, particularly when it comes to AI.
In 1970, Congress enacted the Fair Credit Reporting Act. The law covers a broad range of background reports assembled on consumers, even beyond those used for extending loans. The law granted people new rights and protections, including: (i) safeguards to ensure accurate information, (ii) the right to dispute errors, (iii) the right to access your own information, and (iv) restrictions on how others can use your information.
To ensure that modern-day data companies assembling profiles about us are meeting the requirements under the Fair Credit Reporting Act, the CFPB will be developing rules to prevent purported misuse and abuse by these data brokers.
Two of the proposals under consideration are worth highlighting:
First, rules under consideration will define a data broker that sells certain types of consumer data as a “consumer reporting agency” to better reflect today’s market realities. The CFPB is considering a proposal that would generally treat a data broker’s sale of data regarding, for example, a consumer’s payment history, income and criminal records as a consumer report because that type of data is typically used for credit, employment and certain other determinations. This would trigger requirements for ensuring accuracy and handling disputes of inaccurate information, as well as prohibit misuse.
A second proposal under consideration would address confusion around whether so-called “credit header data” is a consumer report. Much of the current data broker market runs on personally identifying information taken from traditional credit reports, such as those sold by the big three credit reporting conglomerates – Equifax, Experian, and TransUnion.
This includes key identifiers like name, date of birth, and Social Security number that are contained in consumer reports generated by the credit reporting companies. The CFPB expects to propose to clarify the extent to which credit header data constitutes a consumer report, reducing the ability of credit reporting companies to impermissibly disclose sensitive contact information that can be used to identify people who do not wish to be contacted, such as domestic violence survivors.
Any updated rules under the Fair Credit Reporting Act can be enforced by the CFPB and state law enforcement across sectors of the economy.
Federal Trade Commission attorneys, the Department of Transportation, the Department of Agriculture, and other agencies can enforce these rules for specific sectors under their jurisdiction.
The CFPB’s data broker rulemaking will complement other work occurring across levels of government, particularly by the FTC, which is leading many efforts on privacy and data security.
The CFPB is expected to shortly publish an outline of proposals and alternatives under consideration for a proposed rule, and will seek public input on the proposals under consideration.
Richard B. Newman is an FTC advertising compliance, investigation and defense attorney at Hinch Newman LLP. Follow the FTC defense lawyer on National Law Review.
Informational purposes only. Not legal advice. May be considered attorney advertising.