In November 2018, Marco Antonio Fernandez returned home after a yearlong Navy deployment in South Korea and searched for an apartment near his next posting at Fort Meade, Maryland.
Fernandez, whose national security work had already earned him a top-secret clearance, had little to worry about when he was asked to undergo a tenant screening — a process involving credit, criminal record and eviction checks. But the screening’s algorithm-based software rejected him for an apartment: It found he had a drug conviction and three misdemeanors for petty theft. That’s because it confused him with Marco Alberto Fernandez Santana, an alleged Mexican drug trafficker.
The correct Fernandez sued RentGrow, a tenant screening firm, in a proposed class action filed in Baltimore in April 2019, and last July he filed a similar suit in federal court in San Diego against CoreLogic Credco, a division of the larger property analytics firm CoreLogic. In both cases, Fernandez says the companies violated the 51-year-old Fair Credit Reporting Act, which allows consumers to see and challenge the data that private companies hold about them. Fernandez’ lawyers said in court papers that the “inaccurate reporting will follow Plaintiff for the rest of his career as he is reinvestigated every five years to maintain his Top Secret security clearance.”
The judge in San Diego has put the case on hold until the Supreme Court rules in a related case that is set to be heard this month. Across the country in Baltimore, RentGrow has also asked that the judge issue a stay.
Housing law advocates say that Marco Antonio Fernandez is one of thousands of people mistakenly flagged by tenant screening software, made by CoreLogic, RentGrow, RealPage, AppFolio and a handful of other companies, that culls criminal records from many sources. The industry has accelerated over the last two decades as the rental market has grown and as real estate data and analytics have boomed. Nearly all landlords now use some form of tenant screening software to identify whom they consider the highest-quality tenants.
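The mechanics of such a false match are simple to illustrate. The sketch below is hypothetical — it is not RentGrow’s or CoreLogic’s actual code — but it shows how comparing an applicant against pooled criminal records on first and last name alone, with no date of birth or other identifier, can flag the wrong person, much as the lawsuits describe happening to Fernandez.

```python
# Hypothetical sketch of a naive name-only match; not any vendor's real logic.

def naive_name_match(applicant: str, record_name: str) -> bool:
    """Flag a record when the applicant's first and last names both appear in it."""
    record_tokens = set(record_name.lower().split())
    parts = applicant.lower().split()
    first, last = parts[0], parts[-1]
    return first in record_tokens and last in record_tokens

applicant = "Marco Antonio Fernandez"
record = "Marco Alberto Fernandez Santana"  # the alleged trafficker named in the suits

# Prints True: a false positive, because middle names, dates of birth and other
# identifiers that would distinguish the two men are never consulted.
print(naive_name_match(applicant, record))
```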
“The status quo disproportionately impacts vulnerable populations,” said Nicol Turner Lee, a senior fellow in governance studies and the director of the Center for Technology Innovation at the Brookings Institution, who has studied algorithmic bias. “I think first and foremost we need to solidify privacy law: We have a better grasp on what is being collected, how long and being transparent about that data. That’s fundamental.”
It’s big business, too: Real estate tech and tenant screening firms have drawn the interest of Wall Street investors in recent years. Data from PitchBook, a financial data firm, shows that the total value of private equity deals in this area jumped from $1.7 billion in 2018 to $6.9 billion in 2019 and $6.6 billion in 2020.
Some companies have decided to stay away. Last July, CoreLogic formally announced that it would divest this part of its business, possibly because of the lawsuits and possibly because the division wasn’t making enough money. In February, CoreLogic told investors that it was entirely out of the tenant screening business.
Other companies, however, see only potential. Stone Point Capital and Insight Partners acquired CoreLogic for about $6 billion in February, while another private equity firm, Thoma Bravo, acquired RealPage for $9.6 billion in December. Meanwhile, AppFolio was recently dubbed the fastest-growing company in America by Fortune magazine. Stone Point Capital, Insight Partners and Thoma Bravo did not respond to requests for comment.
Legal Tests
But as the industry has grown, so have the lawsuits. In recent years, scores of people have sued, alleging similar mistakes by these companies. And this may be the year the industry is forced to change.
Tenant screening, a relatively new, $3 billion subset of the credit reporting industry that has grown largely unregulated since its inception, is about to be tested in new ways. In 2021, a notable lawsuit in Connecticut that could change how this type of software is used has a good chance of going to trial. At the same time, California legislators are trying to figure out how to reduce algorithmic bias in industries like housing.
This credit reporting issue doesn’t just touch housing; it affects other kinds of credit-based background searches as well. Fernandez’ San Diego case has been put on hold until lawyers know the outcome of TransUnion v. Ramirez in the Supreme Court. The case involves a man from Fremont, California, who tried to obtain a car loan but, like Fernandez, was erroneously flagged as being on a federal list that bars American companies from doing business with the people named on it. The list is usually reserved for wanted criminals and terror suspects.
The Supreme Court is set to hear oral argument in TransUnion on March 31. The justices are being asked to decide whether class-action status should have been granted, and whether members of that class who did not suffer the same injury as the lead plaintiff, Sergio Ramirez, should receive the same damages he did.
Standing Up
In 2016, Carmen Arroyo, a medical assistant from Windham, Connecticut, had a simple request for her landlord: She wanted to bring her adult son, Mikhail, home.
Months earlier, Mikhail had been electrocuted while working atop a utility pole and had fallen 30 feet to the ground. The accident left him in a coma for six months; afterward, he could barely speak, walk or care for himself. After he recovered somewhat, Carmen Arroyo wanted to move the two of them from her one-bedroom apartment into a two-bedroom apartment in the same complex, ArtSpace Windham, just east of Hartford.
When Arroyo filled out the new application, she assumed the process would be perfunctory; Mikhail had lived with her before. But she soon learned that the property manager, WinnResidential, used CoreLogic’s CrimSafe service and had examined her son’s criminal background.
WinnResidential quickly rejected her, saying that Mikhail had “disqualifying” records. Arroyo could not figure out what the obstacle was. Her leasing agent refused to answer.
“So now I’m not just angry, but I’m trying to figure it out and play detective here because I’m not getting any answers here,” she said. “Everywhere I turned to was a dead end. There were no answers to my questions on why. All of this while I’m trying to work a 40-hour shift and take care of my son at the hospital. It was a lot for me. It makes a person just so upset but at the same time makes you feel like, ‘What is going on?’”
WinnResidential did not respond to NBC News’ requests for comment.
Arroyo contacted the Connecticut Fair Housing Center, a legal advocacy group in Hartford, to see if it could help. But the lawyers there quickly learned that the landlord really did not know why the Arroyos had been rejected.
“We obtained documentation from the background check that the landlord had received from CoreLogic, and we noticed that they were not misleading us when they said ‘they had no idea,’” said Salmun Kazerounian, one of Arroyo’s lawyers at the center.
Arroyo’s lawyers eventually determined that Mikhail had been charged in Pennsylvania in 2014 with a $150 retail theft, but that the charge was ultimately dropped. Arroyo knew nothing about the charge, and Mikhail was not able to speak about it.
Arroyo’s lawyers sued on her behalf not only under the Fair Credit Reporting Act, where monetary damages and settlements are common, but also under the Fair Housing Act, a key pillar of the Civil Rights Act of 1968.
The lawyers say that by bringing claims about tenant screening software under the Fair Housing Act, believed to be a first, they stand a strong chance of compelling CoreLogic to change its software and practices. That’s because critical guidance issued in 2016 by the Department of Housing and Urban Development said that denying housing based on criminal history could violate the Fair Housing Act.
The legal theory in Arroyo’s lawsuit goes like this: Because Latinos and African Americans are arrested, convicted and incarcerated at higher rates than whites, members of those groups suffer an “unlawful disparate impact” if CoreLogic disqualifies them as prospective tenants based on their criminal history.
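A purely illustrative calculation shows how a facially neutral rule can produce that kind of disparate impact. The rates below are invented for the example and are not drawn from the Arroyo case or any real data set.

```python
# Illustrative arithmetic only: both rates are hypothetical, not real statistics.

group_a_record_rate = 0.30  # hypothetical share of group A applicants with any criminal record
group_b_record_rate = 0.10  # hypothetical share of group B applicants with any criminal record

# A blanket "any record disqualifies" screen denies each group at its record rate,
# even though the rule itself never mentions race or ethnicity.
denial_ratio = group_a_record_rate / group_b_record_rate
print(f"Group A applicants are denied {denial_ratio:.1f}x as often as group B applicants.")
```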
In August 2018, CoreLogic’s attorneys argued in a motion to dismiss the case that the company was not at fault because it is simply an outside party: It does not decide whether to rent a given apartment; only the landlord does that. Two years later, however, the judge did not find this argument persuasive.
“RPS [CoreLogic Rental Property Solutions] and WinnResidential acted hand-in-glove to deny Mr. Arroyo housing. RPS allowed screening on the basis of charges that did not lead to a conviction and allowed its customer to conceal from its line staff the basis for an ‘unqualified’ classification,” wrote U.S. District Judge Vanessa Bryant. “In so doing RPS was an integral participant in the denial of housing by WinnResidential to persons charged with an offense even though the charges were dismissed.”
Eric Dunn, of the National Housing Law Project and a member of the Arroyo legal team, celebrated the ruling.
“The screening companies have been pushing an idea that using decision-only screening shields landlords from discrimination claims on the theory that if landlords don’t even know why they are denying an applicant, then how can they be doing so discriminatorily?” he emailed.
“Of course, as this litigation and many other criminal history screening cases have shown, rental screening algorithms often produce results that disproportionately and unnecessarily exclude BIPOC [Black, Indigenous, people of color] applicants or members of other protected classes.”
For now, Arroyo’s lawsuit is scheduled to go to trial in August.
Internal Fears
Even the creators of these tenant screening programs are starting to raise concerns. Richard Leurig, who worked at CoreLogic for over a decade until 2017, ran the Rental Property Solutions division, which handled tenant screening. CoreLogic was an early pioneer in tenant screening but lost market share to newer rivals over time.
“When I went to run it [in 2015], they said you have to meet with the attorneys because we have these ongoing lawsuits,” he recalled, noting that when he stepped into the role, he was tasked with determining whether tenant screening was a business CoreLogic should remain in at all.
“I guess I felt like the captain of a sinking ship,” Leurig said, noting that using a patchwork of criminal data from a myriad of sources was a “fundamentally unsolvable problem.”
Government Intervention
In the last few years, the Federal Trade Commission has pursued lawsuits against two of the companies, reaching a $3 million settlement with RealPage in October 2018 and a $4.3 million settlement with AppFolio in December 2020.
RealPage was reminded, as the Fair Credit Reporting Act requires, to “maintain reasonable procedures to assure the maximum possible accuracy.” AppFolio, for its part, in addition to adhering to the act, was barred from including “nonconviction criminal or eviction records older than seven years.” In the formal settlement filings submitted to federal courts in Texas and Washington, D.C., neither company admitted any wrongdoing.
“When you have an increase in the use and the increase in the companies that are furnishing these reports, it’s reasonable to see that you would see more of a widespread problem,” said Tiffany George, an attorney with the FTC’s division of privacy and identity protection.
“I think this will continue to be an area of focus for the commission, particularly in light of the wave of evictions as part of the pandemic.”
These companies, including CoreLogic, either did not respond to requests for comment or declined to make anyone available for an interview, instead referring questions to the Consumer Data Industry Association, a trade group that represents them along with traditional credit reporting agencies, including Experian, TransUnion and Equifax. The association defended its members’ algorithmic processes.
“There will be errors,” said Eric Ellman, a senior vice president at the association. “But all of the data that we have seen shows that background checks for employment and background checks for tenant screening are highly accurate and highly reliable.”
In a written response sent to NBC News, the association noted that the FTC acknowledged in a 2004 report that such consumer reporting agencies have “market incentives to maintain and improve the accuracy and completeness of the reports they sell.”
However, that same report also noted: “Consumers who were the subject of inaccurate reports had little or no recourse.”
In its statement to NBC News, the association also said that its member consumer reporting agencies “already have a high degree of accuracy.” But it said that it supported efforts to “improve the accuracy of public records at the source, where the records are first created.”
Changing Laws
For years, there have been calls for audits of such algorithms, including in a 2019 bill in Congress and in proposed legislation in New York City, but none have gotten off the ground. Put bluntly, it’s not always clear how a meaningful algorithmic audit would even be conducted.
But in December, a member of the California state Assembly proposed what is believed to be the first serious attempt in any state to regulate algorithmic bias not only in housing, but also in lending, hiring and more. The bill puts the burden on the companies that make the software, requiring them to explain how they designed their systems.
The bill, the “Automated Decision Systems Accountability Act,” seeks to compel companies to explain to a watchdog group how their algorithms are tested to mitigate possible bias or adverse impacts on protected classes, such as minorities or women.
“This is dipping the toe in the water of algorithmic regulation,” explained Vinhcent Le, an attorney with the Greenlining Institute, an advocacy organization in Oakland, California, that helped write the bill.
The bill’s author, Ed Chau, a Democrat who represents Monterey Park, a city just east of Los Angeles, said in an email that such algorithms have historically been opaque to most people. As of early March, the bill had not yet been heard in committee.
“Establishing transparency and accountability measures for these systems is increasingly urgent,” Chau wrote. “If we fail to act now, it will become exceedingly difficult to implement these regulations.”