
Intelligent Mobile Experiences at Scale
Designing an adaptive, AI-powered mobile system that scales across industries, personas, and platforms.
Role: Mobile Design Lead
Partnered with multiple web designers across teams to deliver cross-surface experiences.
Problem
Mobile workflows slowed agents and frustrated requestors
Research, deep-dive interviews, and discovery studies revealed why:
Too many taps. Simple tasks took forever.
Help was hard to find. Users wanted to self-service, but content was scattered across systems and generic.
Heavy friction. Agents in the field had limited ability to interact with devices, slowing task updates and status changes.
Outdated UX. Compared to competitors, the experience lacked fluid conversational design and felt like a legacy chatbot.
It wasn't modern. It wasn't fast. And it wasn't the experience users expected from mobile.
Opportunity
Use AI to deliver real value in mobile workflows
We asked: how might we use AI's current capabilities to add real value for ServiceNow mobile users?
Explored a wide range of ideas through brainstorming and an internal hackathon.
Grounded concepts in feasibility. Summarization was the LLM's strongest capability at the time.
Focused on intentional use, aiming to reduce friction and surface answers faster rather than layering AI everywhere.
Strategy
Prioritize impact, align across platforms
We needed to place AI where it added the most value: strategically, not reactively.
Focused on strategic placement of AI in high-impact workflows.
Partnered closely with PMs to assess feasibility, prioritize features, and align on roadmaps.
Worked cross-functionally with web teams to keep experiences consistent, despite their different release cadences and larger design resources.
Adopted a mobile-first approach when timelines diverged, ensuring mobile users still received early value.
Process
Iterating fast, adapting timelines
Delivering AI across platforms required us to evolve our way of working.
Shifted mobile from two major releases per year to four (quarterly) to better sync with web and accelerate delivery.
Led mobile design independently while collaborating with multiple web designers to co-develop shared features.
Validated ideas through research.
Curious about how I kept mobile and web aligned while moving fast?
Let’s chat. I’d love to share how I adapted release cadences, balanced being the sole mobile designer, and still drove cross-platform execution.
Solution 1 of 4
Timely task resolution
Problem:
Agents struggled with lengthy, error-prone note taking that slowed them down and left businesses without usable data.
Solution:
Auto-generated resolution notes workflow
Pattern for auto-generating form fields
Reusable component for editing text with AI
Impact:
Cut resolution-note time from ~15 minutes to ~5 minutes
Increased completion rates (agents stopped skipping the notes or writing gibberish)
Businesses gained better data to fuel knowledge articles.
Solution 2 of 4
Smarter search
Help shouldn’t feel like foraging. We rebuilt search with AI at its core:
Summarized answers right at the top, not buried in a doc.
Pulled from everywhere—SharePoint, Confluence, you name it.
Smart suggestions as you type.
Contextual entry points into chat, so answers could flow into action.
Solution 3 of 4
New AI-powered chat
Search points you in the right direction. Chat helps you get things done.
Synthesized, actionable answers—not just text, but next steps.
Conversational workflows to launch forms or kick off tasks.
Agentic AI that felt like a partner, not a bot.
Prominent entry in the tab bar, making chat a first-class citizen.
Search and chat became two halves of the information-finding experience: one for finding, one for acting. Designed to look and feel connected, so moving between them felt seamless.
Solution 4 of 4
Mobile AI Design System
Integrating AI into mobile wasn’t just plug-and-play. It meant rethinking the system itself. Our old design language wasn’t built for it, so we built a new one as we went. Out of that came a toolkit of patterns now used across mobile, including:
Taglines + disclaimers to keep AI transparent
Loading and latency cues that make waiting feel faster
Contextual entry points so AI shows up exactly where it’s needed
Voice and vision patterns for hands-free, eyes-up interactions
AI-powered editing tools that let people work smarter, not harder
Early impact
We’re still measuring the long-term impact of these new patterns, but the early signals are strong:
Positive feedback in user testing, with employees noting faster answers and fewer dead ends.
High customer interest, with multiple enterprise teams exploring adoption in their workflows.
Clear appetite for AI-powered help, as chat and search became standout demos in customer previews.
The foundation is set. As adoption grows, we expect to see measurable improvements in task resolution speed, fewer tickets filed, and a smoother path from question to action.