Editor’s note: AI Pulse is RISMedia’s ongoing roundup of AI happenings, providing trends and real-world use cases to help navigate the rapidly evolving AI landscape.
Why ChatGPT gave opposite answers to a buyer and seller on a $50 million deal
It is a frustrating situation nearly every agent knows: a client trusts some other source or authority (their Zestimate, or an uncle who was a broker in the 1980s) over your advice.
But what happens when both the buyer and seller are listening to the same authority—and that authority is literally contradicting itself?
Celebrity agent and SERHANT. owner Ryan Serhant posted on social media earlier this year that a major luxury deal almost fell apart because both the buyer and the seller asked ChatGPT for advice and received opposite assessments of the situation.
According to Serhant, the AI chatbot told the buyer they were overpaying and told the seller they should be listing for more. That nearly derailed the deal entirely, he said, as both sides became convinced they were getting the short end of the stick.
After a lot of time and effort, Serhant said he was eventually able to get both parties back on the same page.
“Agent beat AI—this time,” Serhant said in a subsequent interview with Fox Business.
The phenomenon of AI subverting human authority is not new, Serhant continued, noting that before AI usage became widespread, buyers and sellers fell for unreliable internet and human sources alike.
“ChatGPT is a version of that,” he said.
Serhant told Fox Business that he explained to the buyer and seller in this scenario that AI is heavily influenced by how questions are framed: Asking whether you are overpaying, for instance, signals to the chatbot that you already suspect you are.
The fact that, in this scenario, the same AI model was assessing the same property entirely differently proved it wasn’t objectively looking at the housing market or data the way agents do.
“AI can model the market, but it can’t model the deal,” he said.
What is a “Synth Human”? HomeServices of America wants to put a face on your AI rep
Her name is Mae, HomeServices of America said, and she is more than a pretty face—in fact, she may be several.
Last month, the Minnesota-based firm announced that it would be launching a “Synth Human” who will interface with consumers visiting HomeServices brokerage sites.
Mae is the creation of Reliant AI, which aims to create “hyper-realistic, enterprise-grade AI” specifically for the real estate industry that “finally feel(s) human,” according to the company’s website. Largely intended to speak to consumers, Mae can also liaise with agents and pass along details on what a buyer or seller is looking for.
Speaking to the Minnesota Star-Tribune, HomeServices of America CEO Chris Kelly recently went into more detail about what Mae can do (or look like), saying she could take on different appearances or sound different in different markets as the company “develop(s) different personas.”
In a video interview with Kelly, Mae herself said she can “understand and integrate the nuances of human communication,” including interpreting visual and body-language cues during a video conversation. Even so, Kelly told the outlet he was not worried Mae would replace agents, given consumers’ desire for emotional guidance during a transaction.
Can you “vibecode” your way to consumers? A Compass agent says she did
One of the disciplines at which AI has proven most effective is coding. People with relatively little experience in programming languages or building software have, with at least some success, built apps and projects from the ground up that serve useful business functions.
Irina Norrell, a Compass agent in the D.C. area with “no technical background,” claims that over several months in 2025, she built what she is billing as a “hyperlocal, consumer-first educational resource,” including data breakdowns, closing cost calculators and neighborhood guides.
Norrell wrote that the project started with a developer “whose vision kept diverging from hers.” AI allowed her to put her real estate expertise directly into the website, focusing on the kinds of neighborhood-level info and insights that local agents are best equipped to share.
“AI made it possible to cherry-pick: learn what you need to understand, delegate what you don’t need to do yourself, and still produce something professional without hiring another specialist,” Norrell wrote.
AI listing photos? The line isn’t clear yet—but some agents have already crossed it
Making your listing photos shine with a little digital magic is nothing new, and is a practice that is largely accepted by consumers and the real estate industry (the National Association of Realtors®, NAR, has long drawn the line at the somewhat vague level of “misrepresenting pertinent facts”).
But with AI now able to completely redraw, refurnish or reconstitute whole rooms, consumers—and regulators—are beginning to take notice. California passed a law early this year (going into effect in 2027) that will require disclosure of modified images.
At the same time, some real estate entities are leaning into the power of AI to modify images, with North Carolina-based Hive MLS partnering with proptech company Roomvo to integrate a “home visualizer tool” for members.
How much can you change the appearance of a listing before it becomes unethical, and counterproductive? Earlier this year, one D.C.-based agent appeared to accidentally insert “a nightmarish creature” into a photo of a bathroom, presumably with AI.
Another agent told Business Insider that a seller’s agent had removed unsightly power lines from all the listing photos, making the property a non-starter for his buyer. A Canadian RE/MAX agent added several new windows to a property exterior with AI, prompting a public apology.
Is relying on common sense or NAR guidelines enough to know how much you can and can’t do with listing photos? Probably most of the time. But as has often been the case in real estate, a small number of agents going too far can give the whole industry a bad name.