joystick

a game discovery system designed to reduce emotional risk for players navigating unsafe, exclusionary, or exhausting gaming spaces.

impact: designed a values-aware game discovery system that layers accessibility, representation, trigger warnings, and community safety signals onto traditional game metadata - helping players evaluate not just what a game is, but how it may feel to actually play.

my role
UX/UI Designer, UX Researcher
end-to-end systems ownership (solo)

timeline
april - june 2025

scope
user research, interface design, design systems

↦ “It's dangerous to go alone.”
- The Legend of Zelda

↦ the why

game discovery is often treated like a neutral browsing experience. but in reality, it is anything but neutral.

For many players, especially those from marginalized communities, choosing a game can involve a quiet but very real form of risk assessment.

Will this game feel welcoming?
Will its community be hostile?
Will it include accessibility support?
Will it handle identity, violence, or trauma in ways that feel manageable … or not?

Most existing platforms still prioritize popularity, performance metrics, and aggregate scores. What they tend to miss are the factors that make a game feel emotionally safe, socially tolerable, or even worth attempting in the first place.

Joystick came from a different premise: discovery systems should reduce risk, not externalize it onto the player.

the problem:
There is no centralized, trustworthy discovery platform that helps players evaluate games through the lenses of emotional safety, accessibility, representation, and community behavior.

Instead, players are forced to piece that understanding together on their own - across forums, Reddit threads, YouTube videos, Discord conversations, and review sites that often surface opinion without context.

That creates a discovery process shaped less by curiosity and more by emotional labor.

The issue is not just missing data. It is missing interpretation. Existing tools may tell players whether a game is popular, but they rarely help answer more human questions.

the hypothesis:
I believed that if a discovery system surfaced emotional and identity-aware signals alongside traditional game metadata, then players could make more informed, lower-risk decisions without having to do so much invisible pre-screening on their own.

My goal was to design a platform that supported autonomy and trust without becoming overwhelming, patronizing, or extractive.

↦ the research

understanding harm, trust, and decision fatigue in game discovery

My research showed that discovery decisions are shaped less by preference than by risk avoidance. For many players, especially those with prior experiences of harm, choosing a game involves assessing potential emotional and social risk with limited information.

So, rather than feeling empowered by choice, players often experience uncertainty, fatigue, and distrust toward existing platforms.

Key research insights:

  • 76% of online multiplayer users have experienced harassment (ADL, 2024).

  • Players want tools that feel personal and contextual, not generic or algorithmic.

  • Accessibility and safety are foundational requirements, not optional enhancements.

These insights were informed by personas, empathy mapping, and usability testing across four prototypes.

designing for players who carry risk into discovery

Persona: Jordan B. (they/them)

Jordan uses games as a way to decompress after emotionally demanding workdays. Past experiences with harassment have made them cautious, requiring extra effort to assess whether a game will feel safe, inclusive, and accessible.

↦ the insights

through a competitive analysis, I found a consistent failure mode across platforms: discovery tools surface volume without context, leaving players to infer safety, tone, and representation on their own.

  • Common Sense Media offers thoughtful guidance, but is primarily child-focused and limited in scope.

  • Metacritic aggregates reviews without moderation, emotional framing, or safety signals.

  • MobyGames provides detailed metadata but lacks usable, human-centered interpretation.

  • IGDB prioritizes developer needs over player understanding.

For many players, especially those with prior experiences of harm, choosing a game is not simply about taste. It is about trying to predict emotional cost with incomplete information.

A few of these key findings shaped the project most:

Players needed context, not just more data

Across platforms, there is no shortage of information. What is missing is usable interpretation. Players wanted tools that helped them understand emotional tone, community norms, representation, and accessibility in a way that felt readable and relevant.

Accessibility and safety were not “bonus” filters

They functioned more like baseline requirements. Players did not see these signals as niche enhancements. They saw them as foundational to whether a game was even worth considering.

Discovery tools often shifted the burden back onto the user

Competitive analysis revealed the same pattern over and over: platforms offered volume, but not clarity. Players still had to infer whether a space would be welcoming, hostile, or exhausting.

Trust had to be designed very carefully

A system like this could easily become too heavy-handed, too clinical, or too visually loud. Users responded best to designs that felt calm, affirming, and transparent without feeling overly curated or emotionally manipulative.

↦ the design decisions

joystick became much stronger once I stopped thinking about it as “better filtering” and started treating it as a trust-building system.

1. I designed for emotional safety, not just information retrieval

Traditional discovery systems optimize for search and recommendation. I wanted Joystick to also support emotional preparation and informed consent.

Trade-off: The system had to communicate more nuanced signals without making the experience feel dense or intimidating.

2. I surfaced identity-aware criteria without forcing self-disclosure

Joystick includes signals around accessibility, representation, trigger warnings, and toxicity, but does not require users to explain or declare their identity in order to benefit from those filters.

Trade-off: Less personalization upfront, but more privacy, flexibility, and emotional safety.

3. I used warm, calm visual language instead of competitive or performance-heavy aesthetics

Many gaming interfaces lean loud, intense, or overstimulating. I intentionally moved in the opposite direction to communicate trust, softness, and readability.

Trade-off: A less “traditional gaming” aesthetic, but a much stronger emotional fit for the problem space.

4. I treated contribution as care, not labor

Rather than designing around extraction or endless content generation, I explored how users could contribute feedback through affirming, values-based prompts and badges.

Trade-off: Slower contribution volume, but a stronger sense of meaning and respect.

5. I broke the system into four prototypes across the discovery journey

Instead of trying to solve everything in one flow, I tested onboarding, tutorial, browsing/filtering, and contribution as separate but connected moments.

Trade-off: More moving parts in the process, but better insight into where trust was gained or lost across the full experience.

↦ the refinement

one of the biggest tensions in this project was how to design for safety without increasing cognitive load.

At first, there was a risk of over-explaining everything - too many labels, too much guidance, too much protective language. But that would have created its own kind of friction. A platform meant to reduce emotional burden should not feel emotionally heavy to use.

Usability testing surfaced both strengths and friction points, directly shaping refinements across onboarding, navigation, and contribution flows.

So the work became less about adding signals everywhere and more about deciding where they mattered most, how they should be framed, and how much the user should be asked to process at once.

That led to a more restrained system: clearer onboarding, more skippable education, better filter visibility, more context around badges, and calmer contribution flows.

↦ the solution

i designed Joystick as a discovery system that supports safer, more values-aligned gameplay decisions from first use through long-term contribution.

The final concept was shaped through four connected prototypes, each focused on a different moment in the user journey.

prototype 1: onboarding

The onboarding flow was redesigned to reduce the emotional cost of entry.

Rather than opening with highly personal or identity-heavy questions, the experience begins with familiar topics like favorite games and genre preferences. This creates a gentler entry point while still building toward more meaningful customization over time.

I also introduced:

  • skip options to preserve user control

  • improved button size, contrast, and readability

  • a more invitational tone that felt supportive rather than interrogative

What this improves

  • lowers pressure during first-time use

  • builds trust without forcing vulnerability too early

  • makes the platform feel more approachable from the start

View the full Figma prototype here.

prototype 2: tutorial

The tutorial was designed to guide without overwhelming.

Because many users may arrive with prior experiences of harassment, exclusion, or fatigue, I wanted the instructional layer to feel lightweight, optional, and easy to revisit. Rather than treating guidance like a mandatory training sequence, I framed it as a soft orientation.

I refined:

  • badge explanations through separate, clearer screens

  • explicit “tap to continue” prompts

  • the ability to skip or revisit tutorial content

  • Sage’s role as a gentle guide rather than a dominant mascot

What this improves

  • reduces onboarding fatigue

  • gives users more control over pace and depth

  • makes the system’s values legible without overselling them

View the full Figma prototype here.

prototype 3: browsing and filtering

This part of the system focused on helping users search and explore without leaving important emotional or identity-related criteria buried.

Alongside genre and platform, users could browse with filters related to:

  • accessibility support

  • representation markers

  • trigger warnings

  • community safety signals
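The filter set above can be sketched as a small data model plus a matching rule. This is a hypothetical illustration rather than Joystick's actual schema: the `Game` fields, signal names, and `matches` helper are all assumptions made for the sketch.

```python
from dataclasses import dataclass, field

# Hypothetical record carrying the safety signals listed above.
# Field names and values are illustrative, not a real Joystick schema.
@dataclass
class Game:
    title: str
    accessibility: set = field(default_factory=set)     # e.g. {"subtitles", "remappable-controls"}
    representation: set = field(default_factory=set)    # e.g. {"lgbtq-characters"}
    trigger_warnings: set = field(default_factory=set)  # e.g. {"body-horror"}
    community_safety: str = "unknown"                   # e.g. "moderated", "mixed", "unmoderated"

def matches(game, *, needs=(), avoid_triggers=(), min_safety=None):
    """True if the game meets every required access need, carries none of
    the avoided trigger warnings, and meets an optional safety floor."""
    if not set(needs) <= game.accessibility:
        return False
    if set(avoid_triggers) & game.trigger_warnings:
        return False
    if min_safety is not None and game.community_safety != min_safety:
        return False
    return True
```

The point of the sketch is the default direction of the logic: a game is excluded whenever a declared risk signal is present, so the burden of proof sits with the catalog rather than the player.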

Key refinements included:

  • changing “All PS5 Titles” to “Filtered Results” for clarity

  • adding visible filter states and dynamic sorting

  • increasing checkbox and CTA size

  • reorganizing tag categories to make browsing feel more intuitive

What this improves

  • makes hidden risk factors more visible

  • supports both exploratory and search-driven behavior

  • reduces the need for off-platform research and guesswork

View the full Figma prototype here.

prototype 4: contribution and persuasion

Contribution was framed as a form of care, not a demand for free labor.

Rather than relying on competitive gamification, I explored values-based badges like “Access Ally” that affirm the impact of contributing useful context for other players. The goal was to encourage participation in a way that felt respectful, emotionally coherent, and aligned with the platform’s purpose.
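A values-based badge like “Access Ally” can be modeled as a simple predicate over a contributor's notes. Everything here is a hypothetical sketch: the thresholds, the `kind` and `game` fields, and the second badge name are invented for illustration, not the platform's real rules.

```python
# Hypothetical badge rules: each badge maps to a predicate over the
# contributor's notes. "Access Ally" is the badge named in the case study;
# "Context Keeper" and both thresholds are assumptions.
BADGES = {
    "Access Ally": lambda notes: sum(1 for n in notes if n["kind"] == "accessibility") >= 3,
    "Context Keeper": lambda notes: len(notes) >= 5 and all(n.get("game") for n in notes),
}

def earned_badges(notes):
    """Return the badges whose rule this contributor's notes satisfy."""
    return [name for name, rule in BADGES.items() if rule(notes)]
```

Note the design choice this mirrors from the text: badges affirm the usefulness of a contribution (accessibility notes that help other players) rather than rewarding raw volume or streaks.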

Refinements included:

  • linking feedback more clearly to specific games

  • clarifying the purpose of each badge

  • standardizing validation language across flows

  • making contribution feel contextual rather than bolted on

What this improves

  • makes participation feel meaningful instead of extractive

  • supports long-term trust and community usefulness

  • reinforces the platform’s values through action, not just copy

View the full Figma prototype here.

↦ the visual system

the visual system was designed to signal safety, clarity, and emotional legibility - not performance.

That meant:

  • lavender and teal as the emotional anchor of the palette

  • restrained supporting colors for familiarity and balance

  • accessible typography with soft but readable hierarchy

  • icon and tag systems that made meaning scannable

  • Sage (the mascot) as a quiet guide rather than an attention-grabbing focal point

I treated visual and tonal decisions as trust signals, not decoration. The interface needed to feel calm enough to lower defenses, but clear enough to support real decision-making.

↦ the impact

joystick was ultimately about reducing the invisible labor players often carry into discovery. even without a launched product, the design impact became clear in a few important ways.

For players

  • made emotional and identity-related considerations more visible during discovery

  • reduced the need to piece together safety information across fragmented sources

  • supported more informed, lower-risk gameplay decisions

For the product concept

  • tested four connected prototypes across the full discovery journey

  • identified where trust was gained or lost across onboarding, navigation, and contribution

  • clarified how emotional design can support informed consent without creating overload

Through usability testing

  • validated the importance of warmth, clarity, and gradual value communication

  • surfaced key accessibility and interaction issues that shaped final refinements

  • strengthened the case for calmer, more human-centered gaming interfaces overall

↦ the reflection

this project taught me that designing for emotional safety requires a lot of restraint.

It is not about layering on endless reassurance or trying to solve every harm through interface language. It is about making thoughtful decisions around what to surface, when to surface it, and how to preserve user autonomy throughout.

It also reminded me that inclusion is often built through small choices: opt-outs, readable defaults, plain language, clear filter states, and moments of affirmation that do not ask people to prove why they need them.

If I continued the project, I would focus next on:

  • testing with more disabled, neurodivergent, queer, and caregiving players

  • refining how community toxicity signals are gathered and displayed

  • expanding profile flexibility without increasing self-disclosure pressure

  • exploring how trust evolves over longer-term use, not just first-time onboarding

don't be a stranger . . . !

© 2025 by summer chaves
