UX Research / Product Strategy / UI

AI Lead Nurture

The introduction of AI, and what it promised, revolutionized nearly every industry in 2023, and real estate was no exception. Our stakeholders were quick to imagine how we might use AI in our products. One of our product owners, a leader within the company, built a prototype in his spare time: a very basic chat interface where you could interact with the AI as if you were the lead and it was the real estate agent. The prototype showed that AI could qualify a lead via a text message conversation, asking the lead the primary lead-qualifying questions (LPMAMA): where they were looking for homes, at what budget, and so on. It was shown at a PLACE Partner conference and to our CEO in early 2023, and it sparked quite a bit of excitement around integrating a lead-qualifying AI-ISA into our CRM, with the potential to save agents hours of lead gen time every week. So we were off to the races to define scope and research what the evolution of this feature might be.

Role

UX Researcher

Lead UI Designer

Responsibilities

UX Research

Release Planning (PRDs)

UI Design

Project Length

6 Weeks (off and on)

Starting Out

I was one of three researcher/designers working on this project. We had no internal research or user feedback to draw from (this was a new feature), and no true feature comparison with direct competitors, as the concept was new to the market at the time. The only direction we received was that our first implementation would be a text message interaction between the AI and the lead (after the lead was captured on our users’ IDX websites). Like any other project where you don’t know where to begin, you can begin by asking questions.

We opened up a FigJam board, documented any questions that came to mind, and strategized about which would be the right questions to ask. As this was the first AI project for all three of us, we had a lot of questions. We grouped and prioritized them, resolved what we could ourselves, and for the rest held multiple check-ins with the devs and stakeholders who had a deeper understanding of what we could do with LLMs.


User Flows

We delved into a comprehensive exploration of user flows to better understand what we didn't know. We explored a number of flows to help us scope the project and communicate ideas to stakeholders that we would later need to prioritize in our release plans. Some of the flows we explored, as seen above, were an initial big-picture flow, configuration of AI settings, a user pausing the AI, a user setting up the AI for the first time, and a feature-onboarding Intercom tour flow. The first key user flow involved lead identification and qualification, where we outlined how the AI would engage and categorize leads based on various criteria. This step aimed to enhance the efficiency of lead acquisition and ensure that the AI system could intelligently prioritize and target the most promising leads.

The second user flow focused on building trust with our users through an AI settings page. Throughout both user flows, a critical aspect was maintaining transparency and control for users, ensuring that the AI-driven processes aligned with ethical considerations and gave users the ability to fine-tune or override automated decisions as needed. These user flows laid the foundation for an approach that built trust with users who were likely using AI for lead gen purposes for the first time themselves.

Further Research

We executed competitive research to learn what we could from other CRMs within and outside of the real estate industry.

As we began to gather answers, we could start thinking about how the AI would qualify a lead in real time while also transparently educating the user on where it was in its lead-qualification process. This led to further whiteboarding (as seen in the images above), visually charting the different states the AI could be in (actively qualifying, or paused/inactive), how and when the AI could deem a lead ready for handoff, how the AI reacts when a lead is unresponsive after a given number of attempted contacts, and so on. We ended up calling these AI-States & AI-Flags.

We soon felt comfortable enough to define a PRD in addition to our whiteboards, to help explain our findings and ideas to stakeholders, and to separate the evolution and requirements of the AI Lead Nurture project into distinct releases.


Design Exploration

The PRD, largely flexible in the beginning, gave us enough direction to begin visually exploring integration in our CRM, in order to gain further buy-in from stakeholders on the scope of these releases and confirm that we were headed in the right direction. We were specifically asked at the start to explore blue-sky (far-out) concepts.

Our early releases tackled the Message Center (an area of our product already primarily used for initial contact with new leads), starting with the ability to filter AI-applied conversations, an indication of an active or paused AI on a lead, conversation summaries, and surfaced AI-Flags as a way for the AI to tell the user the conclusion it had come to.

Beta Release

(The last release I worked on)

We landed on a solution that covered all of the necessary states and outcomes the AI could be in or run into: two AI states (active & paused) and five AI flags (Needs Agent, Not Responding, Stopped, Not Delivered, and Failed).

We planned on one alpha, one beta, and finally an MVP release. However, one alpha release quickly became three separate releases to our testers, as we could fit only a subset of the requirements into each sprint; the devs were limited on capacity, busy with other business objectives.

As we began to tackle UI work for the specific releases, we refined big-picture concepts for the most highly used screens we would need for initial implementation: the Message Center, Contact Detail Page, and Contact Index. I led the final crafting of the UI for all releases.

The Message Center features a new AI tab for filtering to AI-applied conversations, plus AI Flags on conversation threads so the user can visually differentiate those conversations. We added an AI Summary block on AI-applied leads so the user could catch up on what info the AI had captured in its lead-qualification objective, and also give us feedback on how the AI was performing for them. We also visually differentiated AI-sent text messages so the user could look back in a thread and clearly see what the AI sent versus what they had sent.

The Leads Index features a new lead group category: AI Leads. On this view of the lead index, a user sees the AI filtering options shown at the top of the screen (the text toggle component + filter pills). We also added an AI column so that, at a quick glance, the user could see the AI's state and whether the AI had flagged any of the AI-applied leads.

The Contact Detail Page features the same AI Summary block as the Message Center, intentionally positioned at the top, ahead of that lead's interactions: we wanted the user to catch up on what the AI had gathered, and stop or pause it, before interacting themselves. Additionally, an AI block was added to the right of the page, setting us up for a future where we would add AI objectives beyond lead qualification. As requested by a stakeholder, the empty state for that AI block would show a large "Add AI" button spanning the width of the column, to increase feature adoption.

Gathering Insights

Early on we knew that we wanted to test our first alpha and beta versions on a select group of users and gain insights well before we released this feature to our entire user base. There was a level of volatility and unpredictability that the AI was going to bring into the lead qualification process, and from the very beginning we defined a primary goal of building trust with our users to increase feature adoption. We crafted a forum where alpha and beta testers could submit feedback and provide examples of unwanted AI behavior, so that our devs could further refine the prompt we were using or set up new guardrails.

FAQs

Discussion Board

Feedback Center

Takeaways

This project gave me a lot of confidence in conducting a large-scale research and exploration phase for a project. It taught me the importance of concept clarity, as we needed buy-in from a diverse group of stakeholders who were rightfully questioning the details of the design decisions we were making.

AI can be frightening to some users, and the trust you build with your users should not be undervalued. It is an integral piece of building a product that users love to use and share, via word-of-mouth, with other potential users.

It taught me how to make appropriate compromises when we couldn’t build everything we had planned for in a release. Adapting designs to our users’ needs while simultaneously hitting business objectives and a sprint schedule made me a more fluid and seasoned designer.

In truth, this project also taught me patience, as the research phase was paused for nearly half a year before it was revisited, due to a reallocation of my team's priorities. To dive so deep into a project, yet be pulled away for so long, gave me a deeper appreciation for the thorough mind mapping, user flows, and documentation we kept in Figma, FigJam, and Notion, so that half a year later we could jump right back in where we left off.