
Fair Housing Compliance When Using AI Screening Tools

May 4, 2025 · 7 min read · Compliance

The fair housing + AI intersection

The Fair Housing Act prohibits discrimination based on race, color, religion, sex, national origin, familial status, and disability. State and local laws often add further protected classes. When you use AI to screen tenants, you're responsible for ensuring the AI doesn't discriminate — intentionally or unintentionally.

AI discrimination can happen in three ways: disparate treatment (treating prospects differently based on protected characteristics), disparate impact (neutral policies that disproportionately affect protected groups), and steering (directing prospects toward or away from certain properties based on demographics).
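Disparate impact, unlike disparate treatment, is usually detected statistically rather than by reading individual conversations. One widely used screen is the four-fifths (80%) rule: if any group's approval rate falls below 80% of the highest group's rate, the outcome warrants closer review. A minimal sketch in Python — the group labels and counts are illustrative, and this is a screening heuristic, not a legal determination:

```python
def selection_rate(approved: int, applied: int) -> float:
    """Approval rate for one group; 0.0 when no one applied."""
    return approved / applied if applied else 0.0

def four_fifths_check(rates: dict) -> dict:
    """Flag each group whose rate is below 80% of the best group's rate.

    rates: mapping of group label -> approval rate (0.0 to 1.0).
    Returns mapping of group label -> True if the group is flagged.
    """
    top = max(rates.values())
    return {group: rate / top < 0.8 for group, rate in rates.items()}

# Illustrative numbers only: group B's rate (0.35) is 70% of group A's
# (0.50), below the four-fifths threshold, so it gets flagged.
rates = {
    "group_a": selection_rate(50, 100),
    "group_b": selection_rate(35, 100),
}
flags = four_fifths_check(rates)
```

A flag here doesn't prove discrimination — it tells you where to look before a regulator does.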

Questions your AI must never ask

These are red-line questions that create immediate fair housing risk:

  • Race, ethnicity, or national origin — including inference from name or accent.
  • Religion — including questions about religious holidays, dietary restrictions, or place of worship.
  • Familial status — including questions about pregnancy, number of children, or plans to have children.
  • Disability — including questions about medical conditions, medications, or need for accommodations.
  • Sexual orientation or gender identity — protected in many jurisdictions.
  • Source of income — increasingly protected; you cannot discriminate against Section 8 or other assistance programs in covered jurisdictions.

Questions your AI should ask

Safe, legally neutral screening questions focus on tenancy criteria:

  • Move-in date and lease length preferences.
  • Monthly budget and income verification.
  • Household size (for occupancy compliance only — not to discriminate).
  • Pet information (if you have a pet policy).
  • Smoking status (if you have a non-smoking policy).
  • Rental history and references.
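Both lists above are easier to enforce if they live in code rather than in a policy document. A minimal sketch, assuming a hypothetical screening bot whose question set you control — the topic names and red-line keyword patterns are illustrative, not exhaustive, and any real pattern list needs legal review:

```python
import re

# Topics the bot is permitted to ask about (tenancy criteria only).
ALLOWED_TOPICS = {
    "move_in_date", "lease_length", "monthly_budget",
    "income_verification", "household_size", "pets",
    "smoking", "rental_history", "references",
}

# Red-line patterns that must never appear in a question the AI asks.
# Illustrative subset only -- real coverage must be far more thorough.
RED_LINE_PATTERNS = [
    re.compile(p, re.IGNORECASE) for p in (
        r"\brace\b", r"\bethnicity\b", r"\bnational origin\b",
        r"\breligio(n|us)\b", r"\b(church|mosque|synagogue)\b",
        r"\bpregnan", r"\bchildren\b", r"\bdisabilit",
        r"\b(medical|medication)\b", r"\bsection 8\b",
    )
]

def question_is_safe(question: str) -> bool:
    """Return False if a question matches any red-line pattern."""
    return not any(p.search(question) for p in RED_LINE_PATTERNS)
```

Running every candidate question through a check like this before it reaches a prospect turns the red-line list into a hard gate instead of a training reminder.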

Audit your AI regularly

AI systems can drift over time. A model that was fair at launch may develop biases as it processes more data. Schedule quarterly audits of your AI screening tool:

Review a random sample of screening conversations. Check that all prospects received the same questions in the same order. Verify that no protected-class information was requested or inferred. Document your findings. If you find issues, retrain or reconfigure the AI immediately.
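The audit steps above can be sketched as a small script. A minimal sketch, assuming screening conversations are stored as an ordered list of questions per prospect — the data shape and the protected-class keyword list are assumptions, not a real tool's API:

```python
import random
from collections import Counter

# Illustrative subset of protected-class terms to scan for.
PROTECTED_TERMS = ("race", "religion", "pregnant", "disability", "children")

def audit_sample(conversations, sample_size=50, seed=0):
    """Audit a random sample of screening conversations.

    conversations: dict mapping prospect_id -> ordered list of questions
    the AI asked. Returns a findings dict suitable for the audit log.
    """
    rng = random.Random(seed)
    ids = rng.sample(sorted(conversations),
                     min(sample_size, len(conversations)))

    # 1. Every prospect should receive the same questions in the same order.
    sequences = Counter(tuple(conversations[i]) for i in ids)
    consistent = len(sequences) == 1

    # 2. No question should touch a protected-class topic.
    flagged = [
        (i, q) for i in ids for q in conversations[i]
        if any(term in q.lower() for term in PROTECTED_TERMS)
    ]
    return {"consistent_ordering": consistent, "flagged_questions": flagged}
```

The returned findings dict is exactly the artifact to attach to your quarterly audit log, whether the sequence check passed or not.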

Some jurisdictions now require algorithmic impact assessments for AI tools used in housing. Even if not required, conducting one is best practice.

Documentation is your defense

If a fair housing complaint arises, your documentation is your only defense. Keep records of:

  • Your written screening criteria (dated and signed).
  • The AI's question set and configuration.
  • All screening interactions (questions asked, answers given).
  • Your audit logs and any corrective actions taken.
  • Training materials provided to staff on fair housing and AI use.
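These records are far easier to produce on demand if every screening interaction is logged in a structured, append-only form from day one. A minimal sketch of one possible record schema — the field names are illustrative, not a standard:

```python
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone
import json

@dataclass
class ScreeningRecord:
    """One screening interaction, kept for fair-housing documentation."""
    prospect_id: str
    questions_asked: list
    answers_given: list
    criteria_version: str    # which dated screening criteria applied
    ai_config_version: str   # which AI question set / configuration ran
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

def to_audit_log_line(record: ScreeningRecord) -> str:
    """Serialize a record as one JSON line for an append-only audit log."""
    return json.dumps(asdict(record), sort_keys=True)
```

Versioning the criteria and AI configuration in each record is what lets you show, months later, exactly which rules were in force when a given prospect was screened.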
Tags: fair housing, compliance, AI screening, legal

Ready to put these ideas into action?

Automate your rental inquiries, pre-screening, and follow-ups with Rentalot.