AI Scenario Modeling

Building the first chapter of PepsiCo’s GTM Scenario Tool—where scenario work earns a real home, and where predictive modeling has room to grow.

ROLE

Lead UX Designer

DURATION

February 2026 – Present

TOOLS

Figma, Figma MCP, Cursor

TEAM

Google, PepsiCo, Product, Engineering

Product demo

ABOUT THE PROJECT

When plans have to lock, can the org still show one clear picture of scenarios—or only a patchwork of files and threads?

Scenario work splinters across tools and conversations; ownership and truth get fuzzy right when leadership needs alignment.

The bet is predictive scenario modeling. Phase 1 doesn’t ship that yet—it builds the governed workspace first. I lead UX with Google and PepsiCo.

PROBLEM & INTENT

From deterministic & manual to predictive & guided—so people log in and see what models ran and what needs attention, not just another crowded UI. Phase 1 builds the governed workspace so prediction has somewhere credible to land.

Today — deterministic & manual

  • Heavy manual work, complicated interactions, no predictions
  • Scenarios and context splinter across tools; ownership and status get lost

Where we’re headed — predictive & guided

  • Predictive modeling in-product—runs, deltas, and what to act on
  • On login: what the model ran and what deserves attention
  • Phase 1 workspace (scenarios, portfolio, status) becomes the shell for that guidance

That work lives in the GTM Scenario Tool: a scenario here is a real object in the product—bounded “what if” work that can be owned, tracked, and compared—not a slide title. Phase 1 is how that vocabulary shows up in the UI.

THE CHALLENGE

Trust

Credibility under pressure: in high-stakes planning, every label and state either builds trust or erodes it while the models evolve.

One workspace

Novices and power users together: first-time scenario owners and daily operators share the same home.

Portfolios

Different realities: GTM priorities and supply constraints still need to read as one leadership picture.

SOLUTION (PHASE 1)

Phase 1 is a manual, step-by-step flow: people define scope, choose stores, read baseline truth, run scenarios with chosen or default parameters, read results against that baseline, and compare runs. Nothing is “magic” yet—but that loop produces the structured data and habits the product needs before predictive scenario modeling can land. The product demo above shows the workspace and scenario flow; the clip here walks the core modeling path.

Create & scope

Define scope, then select stores or upload a list for the run.

Baseline

Baseline data—actuals for those stores in the selected period—before anything changes.

Run

Adjust parameters or run with defaults.

Results

Executive summary vs the baseline—did the scenario work?

Compare

Two scenarios side by side—where each wins.

Modeling walkthrough

Manual today—structured so prediction has real runs to learn from tomorrow.

Groundwork for prediction

This manual state is intentional: it lays the groundwork for predictive scenario modeling as data accumulates and the AI system is trained on real runs—not on guesses layered over a broken workflow.

Where the product goes next

01

Predictive modeling from richer history and trained models

02

Guided experiences tuned to what “attention-worthy” means per portfolio

03

Automation and AI steps built on the same baseline-and-compare model that Phase 1 establishes

COLLABORATION: GOOGLE × PEPSICO

Partner

PepsiCo

Ground truth on how work really happens

Partner

Google

Patterns, platform scale, and what we owe users

How we work together

Cadence

Standing design and product conversations—not periodic readouts—so feedback lands while it can still change the experience.

Critique

Both sides bring constraints: PepsiCo on operations, Google on scalable patterns and what we owe users when models misbehave.

System of record

One Figma home: components, states, annotations, and accessibility notes that survive into build.

Sequencing

Joint sequencing with engineering so Phase 1 ships as a coherent story, not a scattered MVP.

Ownership

I own the UX narrative end to end: clarifying problems with PepsiCo, pairing with Google on IA and interaction patterns, and insisting the interface stay honest—especially when AI or modeling edges are still catching up. Partners own feasibility and roadmap; I own whether the product still reads as trustworthy when you’re tired and under deadline.

REFLECTION

Challenges

Balancing work across two teams came with a learning curve—but everyone showed up ready to collaborate, which turned the friction into one of the most valuable parts of the project: learning from new partners and designers while still sharpening my own craft.

A collaboration and product shaped like this was a first for PepsiCo, so we were all learning as we went. That made the room genuinely supportive—but “first” still meant ambiguity, course-corrections, and challenges we had to surface and solve together.

What I’m carrying forward

This work isn’t only about designing for AI—it’s about designing with AI. From Phase 2 onward, I want to weave in more AI tooling so handoffs stay faster and clearer for everyone involved.

AI-assisted prototypes already improved shared understanding; the next step is handing off front-end code where it makes sense, so design-to-build is tighter end to end.

I’m also carrying the collaboration habits—and the excitement—to keep learning and designing alongside others, not in a silo.

Let's Connect!