


Ardor Cloud
2025
Product Strategy, UI/UX, Prototyping
Founding Product Designer
TL;DR
Founding Product Designer at Ardor, leading the design of an AI-native platform for building and running production-grade software.
I owned the core product experience end-to-end, from UX strategy and AI interactions to UI systems, user research, and execution. My work shaped how users plan, build, deploy, and manage applications through a conversational, chat-first interface, making AI-powered software development accessible without deep technical expertise.
PROBLEM
Too many entry points, no clear starting point
When I joined, Ardor was already powerful, but users consistently told us the same thing: they didn't know where to start.
The platform had multiple tools and surfaces, but no clear front door
Users spent more time orienting themselves than actually building
The AI Copilot was powerful but felt disconnected from the rest of the experience; users weren't sure when to use it, or how it related to other tools
As features grew, the gap between what Ardor could do and what users understood widened
The core issue wasn't that the platform lacked capability. It was that users couldn't find a natural starting point, and without that, the power of the platform became its biggest barrier.
Problem + Requirements
One of the strongest signals from user interviews was that Ardor felt overwhelming: users didn't struggle with individual features, but with knowing where to start.
The experience was scattered across multiple surfaces, causing users to spend more time orienting themselves than building.
Our goal became clear: make conversation the workflow. Let users describe what they want, and have the platform surface the right tools and context around that single entry point.
This meant rethinking the role of every surface in the platform. The canvas, services panel, deployment tools, and settings weren't removed; they became supporting layers that activate based on what the user is doing in conversation with the Copilot.
This visual shows how the core building blocks came together.
USER RESEARCH & CONTINUOUS FEEDBACK
How talking to users shaped the product direction
User research at Ardor wasn't a phase; it was the engine behind every major design decision, including the shift to a chat-first experience. I regularly spoke with users across the spectrum, from non-technical "vibe coders" to experienced developers, in structured conversations designed to uncover how they think about building software and where they lose trust in the AI.



"I don't understand what the AI just did" - When the Copilot took actions or something broke, users wanted to see the reasoning, not just the result. Without visibility into what happened and why, they couldn't iterate confidently.
"I don't know where to start" - Users saw the platform's tools but couldn't figure out the right entry point for their task.
"It feels like a lot of separate tools" - Even after unifying the UI, the platform still felt like a collection of disconnected surfaces rather than a single workflow.
Some of these pain points sat at the intersection of design and engineering; platform reliability and AI accuracy weren't mine to solve alone. But how failures were communicated, and how users could recover from them, was a design problem. That's where I focused: making AI behaviour legible so users could understand, trust, and course-correct.
These insights directly shaped the redesign of the Copilot as the primary entry point, and informed how we surfaced AI reasoning, error states, and recovery paths throughout the experience.
DESIGNING FOR THE AGENTIC EXPERIENCE
My Design Principles
These principles guided every major decision I made:
Conversation as the front door: start with chat, surface tools as needed
Transparency over magic: AI should explain itself
Progressive disclosure: users grow into power
Mental models first: design how people think, not how systems are built
Consistency compounds: repeated patterns reduce cognitive load
Design Evolution



1
Early Prototype
This is what the UI looked like when I first joined. An experimental, Miro-like canvas focused on collaboration and ideation.
Powerful, but fragmented, with inconsistent layouts and side panels across the platform.



2
Unified System
Standardized panels, layouts, and navigation into a cohesive UI system.
Redesigned chat and Copilot interactions to improve transparency and user trust.



3
Chat-First Experience
Evolving toward a chat-first platform where conversation is the primary entry point. Users describe what they want to build, and the platform surfaces the right tools and context around that conversation – canvas, deployment, and services become supporting layers, not destinations.
DESIGN DECISION #1
Platform-wide UX Unification
The product felt fragmented. Different pages used different side panel patterns, layouts varied between core workflows and settings, and there was no clear visual hierarchy to guide primary versus secondary actions. As a result, users spent more time orienting themselves than building.
I audited the entire product and introduced a standardized layout system that unified side panels, content areas, and configuration views. I also applied consistent interaction patterns across key surfaces, including services, deployment, billing, and settings, so users could rely on familiar behaviors as they moved through the platform.
This reduced cognitive switching across workflows and created a scalable foundation for future features. More importantly, it helped the product feel like a single, coherent system rather than a collection of disconnected tools.






ARDOR UI (BEFORE)
ARDOR UNIFIED UI (NEW)
DESIGN DECISION #2
Designing AI Chat & Copilot Interactions
When we observed how users interacted with the platform, we noticed two recurring patterns: users would click around trying to find the right tool, often getting overwhelmed by features presented all at once, and when they did use the Copilot, they were unsure what it was doing or why, which made it hard to trust or iterate on its outputs.
This told us the platform needed both a clearer entry point and a more transparent AI experience. The Copilot needed to be positioned as the main way users interact with the platform, but only if users could actually see and understand what it was doing. That's what drove the redesign of the prompt box, chat panel, and ultimately the shift to a chat-first entry point.
Prompt Box: Restructured how users communicate with the Copilot, making AI actions and reasoning visible rather than opaque.
Chat Panel: Evolved from a simple messaging interface into a control surface that exposes progress, intermediate steps, and outputs in context.
Chat-First Entry Point: Elevated chat from a side panel to the platform's front door, so users start with a conversation and the right tools surface around them.


REDESIGNED PROMPT BOX TURNS AI INTERACTIONS FROM A BLACK BOX INTO A VISIBLE WORKFLOW.
Chat Panel
The chat panel evolved from a simple messaging interface into a control surface for AI-driven work.
By exposing progress, intermediate steps, and concrete outputs in context, the experience helps users stay oriented, intervene when needed, and build confidence in AI-assisted changes without breaking flow.



Chat-First Entry Point
Elevated chat from a side panel to the platform's front door. Users begin by planning, scoping, and defining requirements through conversation with the Copilot; as their intent becomes clear, the platform surfaces the right tools, like canvas, deployment, and services, to support the next step.



DESIGN DECISION #3
Designing for Real Developer Workflows
To design effectively for a platform that builds and runs real software, I needed to deeply understand the underlying systems. Ardor sits at the intersection of AI, infrastructure, and developer tooling, where poor abstractions can quickly break user trust.
I invested time in learning how large language models behave in practice, how GitHub-based workflows operate, and how deployment, observability, and billing function in production environments. I then translated this understanding into product designs for GitHub integration, deployment and logging views, and billing flows that balanced abstraction with necessary visibility.
This work grounded Ardor in real-world developer expectations. Users could move from experimentation to production with fewer surprises, while the platform maintained the simplicity required for less technical builders.

AI-ASSISTED DEPLOYMENT FAILURES, SUMMARISED FOR CLARITY WITHOUT LOSING TECHNICAL TRACEABILITY
GitHub Integration
I designed the GitHub integration to align with developers’ existing mental models, making repository connection, syncing, and deployment feel familiar and predictable rather than abstracted or opaque. The experience balances automation with visibility, ensuring users understand what’s connected, what’s being synced, and how changes flow into production.
For deeper technical details, see the full documentation



CURRENT FOCUS
Turning ambiguity into shared direction
Brainstorm workshops at Ardor are treated as working sessions rather than discussions. As the only UX designer, I collaborate closely with engineers, founders, and marketing to sketch, prototype, and iterate in real time, collapsing feedback loops and turning ambiguity into concrete decisions. These sessions help align user needs, AI behaviour, and technical constraints early, ensuring the product evolves through collective ownership rather than isolated handoffs.
My latest focus at Ardor has been evolving the chat-first direction from concept to execution. The insight that conversation should be the primary interface unlocked a broader rethinking of how every surface in the platform relates to user intent. Instead of organising the platform around modes or tool categories, we're designing around a simple principle: users start with a conversation, and the platform assembles the right context and tools around that conversation.


COLLABORATIVE BRAINSTORM WORKSHOPS
RETROSPECTIVE
What I learned designing an AI-native product
Over the past 6 months at Ardor, I've learned that designing for AI is less about novelty and more about responsibility. AI accelerates iteration and expands what users can build, but it also amplifies confusion when intent, context, or system behaviour is unclear. The most impactful design work often came from slowing down to make AI actions legible so users could trust, steer, and build alongside the system. Designing for AI ultimately means designing for evolving mental models, where clarity, control, and adaptability matter more than polished interactions alone.
AI has unlocked unprecedented possibilities within a single tool, requiring a rethink of traditional UI paradigms. Interfaces will become more dynamic and adaptive, shifting with user intent and context to maintain clarity, focus, and flow throughout the workflow.