From 47% Abandonment to Clear Navigation

Designing Onboarding That Explains the Process

Company

Delib.org

Timeline

Sep-Oct 2024

My Role

UX/UI Designer

Project

Mass Consensus Onboarding Design

Status

Design complete, stakeholder-validated. Addresses client's 47% abandonment issue.

Platform evolution needed: dynamic configuration for scalability across multiple clients.

The Challenge

A political engagement pilot achieved 31% participation — 372 citizens directly influencing a political leader's Knesset agenda — but lost 47% of them mid-process.

The pilot report identified why: process opacity. Users said "the process is not clear or transparent" and had no visibility into journey length, their progress, or how the system worked.

The Solution

I designed onboarding that addresses this HIGH PRIORITY finding from the report (process opacity):

• Visual journey map

• Progress dots (progress tracking)

• Plain explanations ("Why random selection?")

• Pause capability

• Confirmation messages

Design Goal

Increase completion from 53% to 75% by addressing process opacity—the HIGH PRIORITY finding that caused the 47% abandonment.

How: make the process simpler, shorter, and easier to read.

The Problem


47% Walked Away

The numbers told a story:

372 people entered (31% entry rate)

174 gave up mid-process (47% drop-off)

198 made it to the end (53% of those who started)
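
For readers who want to trace the figures, here is a minimal sketch of the funnel arithmetic in TypeScript; only the raw counts (372 entered, 198 completed) come from the pilot report, and the variable names and rounding are mine.

```typescript
// Funnel math behind the pilot numbers. Only the raw counts come from the report;
// the names and rounding are illustrative.
const entered = 372;
const completed = 198;
const abandoned = entered - completed; // 174

const abandonmentRate = (abandoned / entered) * 100; // ≈ 46.8 → reported as 47%
const completionRate = (completed / entered) * 100;  // ≈ 53.2 → reported as 53%

console.log(`${abandoned} abandoned (${abandonmentRate.toFixed(0)}%)`);
console.log(`${completed} completed (${completionRate.toFixed(0)}%)`);
```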

What the Pilot Report Revealed

The pilot report identified the core issues:

Process opacity: Users reported the process was unclear and not transparent

Platform gaps: The platform lacked an explanation of how it works

Technical barriers: Mobile bugs and slow performance compounded UX issues


The Real Problem

These weren't people who didn't care — they wanted to participate. But they couldn't navigate something they didn't understand.

What was missing:

How long the process would take

Where users were in the journey

Why they saw certain proposals (randomization mechanism)

Context when the process shifted between stages


The insight: The platform had no onboarding.

Users clicked a WhatsApp link and landed in a complex consensus process with no guide—feeling lost and overlooked.

The pilot report (Section 3.4) identified the 47% abandonment as the "single largest opportunity for improvement" and categorized "Process Transparency & User Trust" as HIGH PRIORITY.

Solution


A Journey You Can Actually Follow

I designed 7 onboarding touchpoints that work together to make the process navigable:

1. Visual Journey Roadmap

A winding path showing all stages upfront.

Like a trail map before a hike—you know where you're going.

2. Plain-Language Explanations

"Why random selection?" → Because fairness matters

"How long will this take?" → 10-15 minutes (and you can pause)

"Did my suggestion go through?" → Yes, others will see it

3. "You Are Here" Markers

Progress dots on every screen.

No more "am I almost done?" anxiety.

4. Stage-Specific Introductions

Every stage has its own introduction explaining what happens now and why it matters.

5. "You Can Pause Anytime"

Progress saves automatically. Come back when ready.

6. Visual Station Markers

Each stage has a unique icon. You always know where you are.

7. Confirmation Messages

Clear feedback after every action. "Your suggestion was added and will be shown to others."

Main onboarding: visual journey roadmap

Evaluation step: progress dots; each stage has its own introduction and explanation

Design Decisions That Mattered


Decision 1: Journey Over Progress Bar

What I did: Created a winding path with stations instead of a standard progress bar


Why: Elderly users need concrete visuals, not abstract UI. A journey is something everyone's taken—it's human. Percentage bars feel endless.


Decision 2: Dots Over Percentages

What I did: Used clear dots showing stages


Why: "Step 2 of 5" hits faster than "40% complete"—especially for elderly users. You can count dots. Percentages feel abstract and endless.


Decision 3: Explain Everything

What I did: Added a "Why random selection?" explanation even though it adds reading time


Why: The pilot report specifically identified process opacity as a problem.

Users abandoned from lack of understanding, not information overload.

Getting them started quickly doesn't help if they abandon confused 5 minutes later.


Decision 4: Show the Whole Map Upfront

What I did: Displayed all stages on the very first screen


Why: Seeing the complete journey addresses the "unclear process" issue identified in the pilot.

You're more likely to start a journey when you can see where it ends.

Revealing stages one at a time creates the same "endless tunnel" feeling that contributed to the original 47% abandonment.


Decision 5: Lead With "You Can Pause"

What I did: Put "You can pause anytime—progress is saved" right on the main onboarding


Why: It removes the biggest entry barrier: "I don't have 15 minutes right now."

Users need this reassurance before they commit, not after they're already anxious.
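
Below is a minimal sketch of what "pause anytime" could look like under the hood: progress persisted locally so a returning user lands back at their station. The storage key, data shape, and function names are assumptions for illustration, not the platform's actual persistence layer.

```typescript
// Auto-save sketch: persist progress after each step, restore it on return.
interface SavedProgress {
  stageIndex: number;               // which station the user reached
  answers: Record<string, unknown>; // whatever they have entered so far
  savedAt: string;                  // ISO timestamp
}

const STORAGE_KEY = "mass-consensus-progress"; // assumed key, for illustration

function saveProgress(progress: SavedProgress): void {
  localStorage.setItem(STORAGE_KEY, JSON.stringify(progress));
}

function resumeProgress(): SavedProgress | null {
  const raw = localStorage.getItem(STORAGE_KEY);
  return raw ? (JSON.parse(raw) as SavedProgress) : null;
}

// After each completed step:
saveProgress({ stageIndex: 2, answers: {}, savedAt: new Date().toISOString() });

// On a return visit, drop the user back at the saved station:
const previous = resumeProgress();
if (previous) {
  console.log(`Welcome back, resuming at stage ${previous.stageIndex + 1}`);
}
```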


Working With Constraints


The Research Reality

After the September 2024 pilot concluded, I inherited:

What I had:

• Pilot report with quantitative data (372 entries, 47% abandonment rate)

• HIGH PRIORITY finding: "Process opacity"

• General feedback: "The process is not clear or transparent"

• Patterns from previous elderly user research (FreeDi, DocSign projects)

What I didn't have:

• Session recordings showing where users got stuck

• Detailed analytics on abandonment points by stage

• Direct user quotes about specific confusion moments

• Ability to conduct follow-up interviews with pilot participants

My Approach

Working with limited qualitative user data, I prioritized solutions based on:

• Pilot report findings (process opacity as #1 issue)

• Accessibility research patterns from elderly users (concrete visuals > abstract UI)

• Stakeholder knowledge (PM insights from facilitating the pilot)

• Impact potential (fixed deal-breakers first, then addressed frustrations)

Validation Strategy

Rather than waiting for perfect data, I created:

• Interactive prototype demonstrating the complete journey

• Stakeholder alignment on problem prioritization

• Clear success metrics defined for future validation

This approach balanced evidence-based design with practical constraints while validating the core UX pattern.
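
One way to make those success metrics concrete is to define per-stage events up front: the drop-off data the pilot never captured. The event names and track() helper below are hypothetical, a sketch of the instrumentation the next iteration would need rather than the platform's existing analytics.

```typescript
// Sketch of per-stage instrumentation so abandonment can be attributed to a stage.
type OnboardingEvent =
  | { name: "stage_entered"; stageIndex: number }
  | { name: "stage_completed"; stageIndex: number; secondsSpent: number }
  | { name: "process_paused"; stageIndex: number }
  | { name: "process_abandoned"; stageIndex: number }
  | { name: "process_completed"; totalSeconds: number };

function track(event: OnboardingEvent): void {
  console.log("[analytics]", event); // placeholder: forward to whatever service is in use
}

// Example: completion rate per cohort, with drop-off attributable to a stage.
track({ name: "stage_entered", stageIndex: 0 });
track({ name: "stage_completed", stageIndex: 0, secondsSpent: 42 });
```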


What I Learned


Designing for Scale

The 5-stage onboarding was designed to address the pilot's 47% abandonment issue by providing clear progress indicators and journey visibility.

Post-design, I learned Mass Consensus needs to work for 3-10 stage configurations across different clients. My fixed 5-stage roadmap doesn't flex.

The gap: I built a solution. They needed a system.

Post-pilot insight

As Mass Consensus expands to new clients, each organization runs different process lengths (3-10 stages depending on their decision framework).

The next iteration

The core pattern (journey map, progress dots, stage explanations) validates across all configurations. The enhancement needed is making stage count dynamic rather than fixed—allowing the same UX pattern to flex for different organizational needs.


What I'd build

• Dynamic progress system (●●○ for 3 stages, ●●●●●●○ for 7 stages; see the sketch after this list)

• Admin configuration for stage names and descriptions
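
As a rough sketch of that dynamic progress system (assumed function names, not the shipped component), the same dot pattern and step label can be rendered from a configurable stage count instead of a fixed five:

```typescript
// Progress dots and step labels driven by a per-client stage count.
function renderProgressDots(totalStages: number, completedStages: number): string {
  return Array.from({ length: totalStages }, (_, i) =>
    i < completedStages ? "●" : "○"
  ).join("");
}

function renderStepLabel(totalStages: number, currentStage: number): string {
  return `Step ${currentStage} of ${totalStages}`;
}

console.log(renderProgressDots(3, 2)); // "●●○"     (a 3-stage client)
console.log(renderProgressDots(7, 6)); // "●●●●●●○" (a 7-stage client)
console.log(renderStepLabel(5, 2));    // "Step 2 of 5"
```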


The lesson: Validate the pattern with real users first, then systematize for scale. The onboarding concept works—the next step is making it configurable.

