
Designing an AI Copilot to Streamline Building Code Compliance

Timeline

8 Weeks (Oct-Dec 2025)

Role

Product Designer (Copilot Experience)

Team

1 PM, 2 designers

Skills

Product thinking, interaction design, heuristic evaluation

Project Context
What is Trax Codes?

Trax is a platform that digitizes Canadian building codes, enabling architects, construction professionals, and engineers to quickly search and reference specific code content. By transforming static codebooks into a searchable digital experience, Trax reduces the time and cognitive effort required to find and validate compliance information.

My role and impact

As a team, we redesigned three core touchpoints across the Trax platform: the Library and Search Results experience, open document pages, and the Copilot feature. I was the sole designer responsible for the Copilot experience and led its design end to end, from problem definition and research through ideation, iteration, and final prototyping.

Trax Copilot represents the platform’s first conversational AI interface and is still under development. Early feedback from clients has been highly positive, as AI search already plays a critical role in their compliance workflows. This project focused on making that experience more discoverable, understandable, and trustworthy so it could better support real-world usage and scale over time.

Where the Experience Broke Down
DEFINE & SCOPE 🎯
Current Workflow

While Trax offered AI-powered Q&A, users struggled to discover and use it effectively. The AI entry point looked nearly identical to standard search modes (tables, document search), even though each required different question formats. This led to accidental clicks, irrelevant results, and frustration.

Because the interface didn't resemble familiar chatbot experiences, users lacked confidence in phrasing questions, interpreting responses, or validating sources. As a result, the feature was underused despite its value.

How might we make AI search more discoverable, intuitive, and trustworthy while reducing confusion between standard and AI-driven modes?
DEFINE & SCOPE 🎯
Establishing Product Requirements (Scoping)

In recent months, Trax has been redefining its business model (free vs. Pro tiers) and rebranding as an "AI-powered Building Codes Copilot." The platform also recently expanded into a Microsoft Teams extension with an integrated chatbot interface.

I scoped this project by identifying how the existing AI workflow could support these business goals: differentiating feature access across tiers, maintaining experience cohesion, and aligning with the Copilot rebrand.

Most of the Copilot backend already existed in the Teams extension and current Q&A interface. I worked closely with the PM to understand technical capabilities and identify where design could improve discoverability and usability without expanding scope.

Understanding the Existing Experience
PRIMARY RESEARCH 🔎
Heuristic Evaluation

Due to time constraints, I focused on analyzing feature requests from Trax's user feedback board and conducting a heuristic evaluation of the current Copilot experience. This allowed me to quickly surface usability issues and patterns across real user feedback.

Six key themes emerged: discoverability, guidance, readability, reference validation, interaction continuity, and personalization.

Some of the guiding questions I explored during this phase included:

  • What breaks users' flow when refining or building on previous questions?

  • How do users assess whether an AI answer is trustworthy?

  • What signals help users understand how to ask a good question?

  • What makes a response feel overwhelming vs. easy to scan?

SECONDARY RESEARCH 🔎
Competitive Analysis

I then analyzed UpCodes (a US building codes platform) and everyday AI tools like ChatGPT and Gemini, approaching each as a user to document interaction patterns that supported trust, efficiency, and clarity.

Key takeaways

  • Dedicated reference panels with inline citations let users validate sources without losing context

  • Clear filtering and visible context indicators help users understand what answers are based on

  • Options to adjust response length/style support different expertise levels

  • Loading states and status messages reduce uncertainty and reinforce transparency

Designing a Clearer Path Forward
IDEATE 🖌️
User Flows

Using insights from research and product requirements, I mapped the existing Copilot flow to identify friction points. I then refined the flow to clarify entry points, guide question-asking, and surface Copilot capabilities more intentionally.

Initial user flow diagram

Refined user flow diagram

I validated the refined flow with the team before moving into wireframing.

IDEATE 🖌️
Lo-Fi Wireframes

I began with low-fidelity wireframes to test layout patterns and interaction models without committing to visual design. This let me focus on information architecture, hierarchy, and core interactions early.

Throughout this phase, I collaborated closely with the PM to confirm scope and ensure design decisions aligned with business goals.

Refining Through Critique
FEEDBACK 💬
Design Critique!

Next, it was time to gather formal feedback. I presented my work during design critiques with the design team and product manager. Feedback focused on usability, consistency across features, and alignment with long-term product strategy.

These discussions directly informed key design decisions and helped refine both the interaction model and feature prioritization.

ITERATE 🔁
Design Decisions

Design decision #1: presenting suggested questions

How should Copilot introduce suggested prompts and capabilities?

Display all suggestions upfront

  • Quickly shows examples

  • Takes excessive screen real estate

  • Questions may become repetitive over time

Progressive disclosure (selected)

  • Prioritizes explaining what Copilot can do

  • Reduces visual clutter and cognitive load

  • Supports users with specific tasks who don't need examples

  • Allows discovery when needed without forcing it

Why progressive disclosure:

Users needed to understand what Copilot could do before seeing examples. This respected users who arrived with specific questions while still supporting those who needed guidance, without overwhelming the interface or competing for attention.

Key considerations:

  • Is understanding capabilities more valuable than seeing example questions upfront?

  • How do we serve both new users and returning users effectively?
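
To make the selected pattern concrete, here is a minimal sketch of how the progressive-disclosure behaviour might be wired up, assuming a React/TypeScript front end; the component, prop names, and copy are illustrative only and not taken from the Trax codebase.

import { useState } from "react";

// Minimal sketch of the progressive-disclosure pattern (illustrative names only).
// Capabilities are always visible; example questions stay collapsed until requested.
type SuggestedQuestionsProps = {
  suggestions: string[];
  onSelect: (question: string) => void;
};

export function SuggestedQuestions({ suggestions, onSelect }: SuggestedQuestionsProps) {
  const [expanded, setExpanded] = useState(false);

  return (
    <section aria-label="Copilot suggestions">
      <p>Ask Copilot about requirements, definitions, or cross-references in the code.</p>
      {!expanded ? (
        // Collapsed by default: users with a specific question can ignore it entirely
        <button onClick={() => setExpanded(true)}>See example questions</button>
      ) : (
        <ul>
          {suggestions.map((question) => (
            <li key={question}>
              <button onClick={() => onSelect(question)}>{question}</button>
            </li>
          ))}
        </ul>
      )}
    </section>
  );
}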

Design decision #2: filter display within search bar

How should selected filters be displayed for easy confirmation and editing?

Box-style filters

  • Harder to accommodate longer filter names

  • Difficult to scan all selections at once

  • Inconsistent spacing across one vs. two lines

Pill-style filters (selected)

  • Scales as users add more filters

  • Easy to scan and remove selections

  • Risk of clutter (mitigated by limiting to two lines with an expansion option)

Why pill-style filters:

This design supported iterative refinement, which is core to how users interact with Copilot. Pills are more scannable, easier to edit, and scale better as filter types expand over time.

Key considerations:

  • How does this support iterative prompting workflows with constant filter refinement?

  • What happens as users select multiple filters in one query?
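
As a rough illustration of the selected pill pattern (hypothetical React/TypeScript; a simple item cap stands in for the two-line limit, since the real cap depends on layout measurement):

import { useState } from "react";

// Illustrative sketch of the pill-style filter row; names and the cap are assumptions.
type Filter = { id: string; label: string };

export function FilterPills({
  filters,
  onRemove,
  visibleCount = 6, // stands in for the two-line cap from the design
}: {
  filters: Filter[];
  onRemove: (id: string) => void;
  visibleCount?: number;
}) {
  const [showAll, setShowAll] = useState(false);
  const visible = showAll ? filters : filters.slice(0, visibleCount);
  const hiddenCount = filters.length - visible.length;

  return (
    <div role="list" aria-label="Active filters">
      {visible.map((filter) => (
        <span role="listitem" key={filter.id}>
          {filter.label}
          {/* Each pill carries its own remove affordance for quick refinement */}
          <button aria-label={`Remove ${filter.label}`} onClick={() => onRemove(filter.id)}>
            ×
          </button>
        </span>
      ))}
      {hiddenCount > 0 && (
        <button onClick={() => setShowAll(true)}>+{hiddenCount} more</button>
      )}
    </div>
  );
}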

Design decision #3: Pro tier upgrade prompt placement

How should we display the upgrade CTA to Standard users?

Popup within chat

  • Impossible to miss

  • Blocks response content

  • Interrupts the user's primary goal (reading answers)

  • Likely to become frustrating over time

Persistent sidebar placement (selected)

  • Keeps response area unobstructed

  • Remains visible across sessions without interrupting workflows

  • Naturally accumulates visibility over time

Why the persistent sidebar CTA:

Users' primary goal is reading and validating responses. Interrupting that flow, even for business goals, creates friction. A persistent, non-blocking CTA maintains visibility while respecting the user's task.

Key considerations:

  • What is the user's main goal on this page that we must not interrupt?

Design decision #4: new chat behaviour in the Standard tier

How should new questions be handled in the Standard subscription?

Override current chat

  • Fewer clicks

  • Matches Pro tier flow

  • Creates false expectations of chat history and follow-up context

  • Misleading mental model for Standard users

Start new chat (selected)

  • Clearly signals that questions are independent

  • Surfaces suggested questions for each new question

  • Removing the search bar immediately communicates limitations

  • Avoids the frustration of typing into a disabled input

  • Requires an extra click from the user

Why the "start new chat" flow:

Clarity over convenience. This approach avoided misleading users about Standard's limitations while maintaining a clear, honest experience. Removing the disabled search bar also sidestepped a common frustration pattern seen in tools like ChatGPT's free tier.

Key considerations:

  • What mental model will users form at each stage of this flow?

  • How can we keep Standard similar to Pro while communicating limitations honestly?
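
A minimal sketch of the tier-gated input behaviour, assuming a React/TypeScript front end; the tier names and props are hypothetical, not from the Trax codebase.

import { useState } from "react";

// Sketch only: Standard users get one question per chat, so once a response exists
// the input is removed entirely (not disabled) and replaced with a "new chat" action.
type Tier = "standard" | "pro";

export function ChatInputArea({
  tier,
  hasResponse,
  onSubmit,
  onNewChat,
}: {
  tier: Tier;
  hasResponse: boolean;
  onSubmit: (question: string) => void;
  onNewChat: () => void;
}) {
  const [draft, setDraft] = useState("");

  if (tier === "standard" && hasResponse) {
    return <button onClick={onNewChat}>Start a new chat</button>;
  }

  return (
    <form
      onSubmit={(event) => {
        event.preventDefault();
        onSubmit(draft);
        setDraft("");
      }}
    >
      <input
        value={draft}
        onChange={(event) => setDraft(event.target.value)}
        placeholder="Ask Copilot about the building code…"
      />
    </form>
  );
}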

Design decision #5: project linking functionality

Where should Copilot conversations live within a user's project workflow?

Standalone projects area inside Copilot

  • Accessible within Copilot

  • Increases complexity beyond MVP scope

  • Creates two separate locations for project content

  • Fragments workflows across the platform

Link chats directly within project pages (selected)

  • Aligns with users' existing mental model

  • Avoids duplicating project content locations

  • Keeps Copilot focused on Q&A, not project management

  • Reduces navigation and fragmentation

Why the existing project page integration:

Users already understood where their project work lived. Creating a parallel structure would have added confusion and scope. This approach delivered value (saved chats) without fragmenting the experience, while staying within scope.

Key considerations:

  • What is the smallest version of this feature that delivers value without expanding scope?

  • Where do users expect Copilot interactions to live within their broader project workflow?

Bringing the Experience Together
PROTOTYPE ▶️
Final Designs!

The redesigned Copilot experience introduces a dedicated conversational interface that's discoverable, guides users through asking questions, and builds trust through transparent sourcing and filtering.

Key improvements:

  • Clear entry point with progressive disclosure of capabilities

  • Inline citations and a dedicated references panel for validation

  • Visible filtering with easy refinement

  • Tier differentiation (Standard vs. Pro) without interrupting workflows

  • Direct integration with project pages for continuity

FINAL DESIGNS ▶️
Polishing for Handoff

After finalizing the flow, I reviewed every screen for alignment with Trax's design system across typography, colour, spacing, and padding. I then prepared the designs for engineering handoff using Figma Dev Mode, documenting component specifications, interaction states, responsive behaviour, and implementation notes to ensure smooth implementation.

What This Project Taught Me
REFLECTIONS 💡
Lessons Learned

Design the flow before the interface

Strong UX starts with how users move through an experience, not individual screens. Starting with low-fidelity sketches let me focus on information architecture and decision points before visual polish. This made my later designs more intentional and grounded in real user flows.

Stay flexible and detached from solutions

Design directions shifted throughout the project, whether aligning with other features or pivoting for business needs. I learned to stay objective, document decisions clearly, and adapt quickly rather than becoming attached to a single solution. Treating design as iterative helped me respond to evolving requirements without losing momentum.

Balance ambition with focus

Exploring AI capabilities made it tempting to expand functionality. I learned to regularly pressure-test ideas against product requirements and user goals, ensuring each decision served the core experience rather than adding complexity. This helped prevent scope creep while keeping the design achievable.

REFLECTIONS 💡
Next Steps

Deepen user understanding through direct user research

This project relied on heuristic evaluation rather than direct user input. With more time, I would recruit Trax clients for task-based interviews and lightweight IA validation (card sorting, tree testing) to understand their mental models around navigation, filtering, and project context. This would ensure future iterations align more closely with real-world workflows.

Validate through usability testing

Multiple rounds of testing would help evaluate core workflows (refining queries, validating responses, navigating projects) and surface assumptions made during design. Since I'm not the primary user, testing would be critical to identify where guidance or interactions could be simplified.

Design for responsive contexts

Current designs are optimized for desktop, but Trax is also used on mobile and tablet for quick, in-the-moment questions. I would explore responsive patterns that preserve clarity and hierarchy on smaller screens, ensuring usability across devices.

Revisit the projects experience with user input

Due to scope constraints, Copilot chats were embedded in existing project pages rather than creating a standalone projects area. With user input, I would explore whether a centralized view or deeper project linking better supports continuity and long-term usage.

Plan for scalability

As Copilot adoption grows, features like filtering and personalization need to scale. With Trax planning to add more filter types, I would design interaction patterns that remain intuitive as usage increases and user needs evolve.

Like what you see? Let's chat!

Made with lots of love🫶🏻 and coffee☕
