With each passing year, technology helps us work smarter in some ways, but not as smart in others. As generative AI increasingly becomes our personal assistant, there is one key area we must not outsource: compliance with fair housing laws.
Why?

You may not know it, but there are already examples of generative AI exacerbating or reinforcing inequities. With significant and costly penalties yet to be imposed, now is the perfect time to chart a course, or course correct, for your team.
The real estate industry is also more heavily regulated and overseen on more fronts than many other industries, with many laws protecting different demographics. Depending on where you are in the US, the protected classes may include:
- Race
- Color
- Sex
- Familial status
- National origin
- Disability (which extends to people who use assistive devices)
- Religion
- Age
- Ancestry
- Sexual orientation
- Gender identity
- Marital status
- Military status
- Domestic violence victim status
- Source of income
- Genetic information
- Pregnancy
- HIV/AIDS status
- Criminal history
- Others
In today’s litigious environment, this is a perfect time to wonder, “Will artificial intelligence (e.g., AI chatbots) get me ‘canceled’, blocked, fined, or even jailed?”
Not if you remember these nine considerations for responsible AI in real estate.
- How does this app/tool integrate fair housing (including fair lending) laws at the federal, state, and local levels? Fair Housing Decoder Tip: We’ve noticed that some of the most popular chatbots and other generative AI tools account for the federal “big 7” (race, color, sex, familial status, national origin, disability, religion) but not fair housing laws at the state or local level.
- How often is the app/tool updated to reflect policy changes? Fair Housing Decoder Tip: Many fair housing laws and court rulings in the US have been enacted or updated within the last 12 months, so developers should account for legal changes at least monthly.
- Has the developer consulted with local, regional, or national fair housing agencies and conducted paired testing (e.g., using mystery shoppers from different protected classes)?
- How does this app/tool target people (e.g., a “marketing avatar”)? Fair Housing Decoder Tip: In business school, we’re taught to have a “customer avatar.” This is basically the ideal customer your brand targets. But with fair housing (which includes fair lending), your ideal customer cannot exclude a protected class. The key word here is “exclude.” Sure, you can have specialized resources for people going through a divorce, for example, but you don’t exclude (reject) people who aren’t.
- Is the “target” based on a fair housing protected class (federal, local, or trade group)? Fair Housing Decoder Tip: Use a tool that allows you to focus on property features (“perfect home for a family of five” vs. “a home with five spacious bedrooms that can be used however you like”) rather than people features.
- How does this app/tool handle different neighborhoods/zip codes? Fair Housing Decoder Tip: Modern-day redlining cases involve businesses that don’t offer the same services to neighboring areas. This is something you should never do.
- Are you “directing” people of certain demographics to zip codes that you don’t direct other people to? Fair Housing Decoder Tip: Even if the developer isn’t doing paired testing, your team can do it! With new technology, it’s important that your team goes the extra mile to ensure it doesn’t face legal penalties.
- How does this app/tool fit into your niche?
- Is your niche based on a protected class? Fair Housing Decoder Tip: There are riches in niches, but there are also costly legal cases. Narrow your niche, as long as it isn’t based on a protected demographic.
The seven pillars of responsible AI governance are compliance, trust, transparency, fairness, efficiency, human touch, and reinforcement learning. The nine questions above summarize these pillars and can help you start and build an AI partnership. In a litigious industry, if developers are not willing to be transparent about any of these areas, it may be wise not to be an early adopter of a given platform.
This column does not necessarily reflect the opinion of HousingWire editorial staff or its owners.
To contact this article’s editor: [email protected]