Living the design double diamond life
Empowered user experience engagements as a UX generalist
What happens when you can ride the design “double diamond” all the way to the end? In many organizations, it’s more of an aspiration than a plan. It’s rare that you encounter organizations mature enough to place value on the entire process, and rarer still that you find yourself empowered to work on the entire process yourself. These past 3 years, I’ve seen the benefits of living the double diamond life and I’m here to talk about what it looks like. This isn’t to say things have been perfect, but I’ve been grateful to use the full spectrum of design’s capabilities at my current employer.
So what does it look like when you get your wish… and you’re not only sitting at the table, but helping to set the menu, too?
What’s the double diamond, anyway?
The design “double diamond” is a metaphor that depicts the expansion-contraction process critical to most design practices. As designers, in an ideal environment we are taking the time to explore widely before narrowing our sights on specific opportunities or solutions. Notably, the double diamond is split into four segments: two in each diamond. Each is provided a specific phase name, as identified by the Design Council in 2003:
- Discover: Observe and question to uncover the current state.
- Define: Analyze the current state to identify the greatest areas of opportunity.
Typically, at this halfway point you’ll pause to identify which of the opportunities are most important for your customers. Then you’ll define problem statements and success criteria.
- Develop: Explore multiple approaches to the problems you’re solving.
- Deliver: Prototype or beta test potential solutions as you refine your approach toward the optimal solution.
Our organization
I’ve been blessed these past few years in my design career to work at a people-focused foundation that’s genuinely interested in human-centered design on a number of fronts. I’m a member of the Human-Centered Design team within our Information Technology organization. We have (under normal circumstances) 5 full-time team members: a design leader and 4 Principals, of whom I am one. Getting a seat at the table doesn’t require (or guarantee) a huge team.
While my title these past years on the Human-Centered Design team has been “Principal UX Designer”, at the foundation we function as design leads and full-stack designer/researchers. (Yes, we have a bit of a content design problem up front. Someday…)
In our group, each principal designer acts as Design Lead for a vertical aligned to our overall Information Technology organizational priorities regarding the foundation’s employee experience; mine is Productivity & Collaboration. In contrast to my peers who focus on line of business applications, my work primarily focuses on the “glue” that connects foundation employees — and often that includes service design challenges related to products we don’t control.
Within each vertical, we have a few vendors who partner with us to help us scale by taking on distinct projects. (In addition to our employee experience purview, we also consult periodically on programmatic work, like Polio program workshops.)
I credit my predecessors for making a strong case for HCD early on. That created an environment where my initial charter on day 1 was “Learn from our employees and tell us how we can have the biggest impact on the employee experience.” But — for me, at least — it was largely uncharted territory after years at organizations where user research was chronically understaffed and overlooked. When I joined Amazon in 2014, I’d heard that the number of researchers at that 150,000+ person company was in the double digits.
Framing the work
A large portion of my team’s work has been powered by discovery: primarily qualitative research, including interviews and ethnographic research techniques, to uncover real processes and identify previously unknown issues. This discovery focus reminds me of new product teams I’ve worked on, like the Echo Look team — but it’s a rare focus when you’re not trying to discover new markets.
The broader human-centered design team doesn’t call our work “research” because our organization employs some of the world’s top scientists, who have a much different mental model for that term. Instead, we use the terms for the component phases of the research. The term “discovery” has resonated well with our partners without causing much confusion.
(Honestly, I wouldn’t have even foreseen this terminology clash if it weren’t for Mark Boulton’s insightful talk at the DesignOps Summit in 2018. My whole team has now watched the video of his talk, and if you also work with scientists of any stripe, it’s worth your time!)
My team’s UX research work typically falls into one or more of three categories:
Broad-scale qualitative research
In our context, discovery often means learning from human behavior and tool usage without the constraint of a specific product or solution.
Techniques I use often:
- Open-ended interviews regarding past lived experiences
- Ethnographic techniques including contextual inquiries, job shadows, and meeting observation sessions
- Diary studies and surveys
Past projects have taken anywhere from 4 weeks to 6 months and involved 6–32 participants, often recruited in iterative waves. Deliverables include a variety of posters, journey maps, and reports for a wide audience.
My first study in my current role, the “Employee Experience 2019” project, was a 6-month ethnographic study with 30 primary participants (interviews and contextual inquiries) and about 20 secondary participants (surveys and meeting observations) out of a potential “customer” pool of about 1,600 employees. The deliverables included a 26-page report, an executive summary presentation deck, 2 user role posters (defining 8 behavioral profiles), and 2 end-to-end journey maps. Very heavyweight for most purposes. But our goal was to identify the areas of greatest opportunity, with a secondary goal of improving the department’s understanding of our customers. Within a year, that project had resulted in the replacement of our teleconferencing software worldwide, supported a change to our VPN software, and informed the start of a multi-year knowledge management effort at the foundation. In addition, the journey maps and findings around employee onboarding are still in use today by multiple teams.
Customer understanding for a product or service
Even when a specific product or service lands in our business plan, projects often spin up without a true understanding of potential user roles, common scenarios, and the underlying jobs to be done. These insights are critical to setting meaningful success criteria for a project and to identifying priority improvements. Deliverables include an executive summary and sometimes a short report or visualization.
We do not deliver personas as the abstraction is not helpful for our small user base, but I have found that a combination of user roles and jobs to be done is a very effective way to communicate observed customer behavior in this environment.
Evaluation of an existing solution or process
In other classic cases, we know there’s a problem but we don’t have a full picture. Since the space is constrained, this takes much less time than a broad qualitative or ethnographic study.
For example, once the COVID closures began our AV team saw a spike in the number of requests for “whiteboarding” tools. We proposed a short study to investigate what folks were trying to accomplish with visual collaboration tools, and my colleague Naomi’s final deliverable included both customer needs and an evaluation of a variety of tools against those needs. This led to the foundation’s successful adoption of MURAL and widespread usage during the pandemic.
I’ve been delighted with the impact this kind of work has in this organization. My biggest success story is also my first: my initial Employee Experience report surfaced a key finding that our Skype for Business product was causing significant pain and lost time across the foundation, particularly for remote employees, because Skype was hosted “on premises” and not in the cloud. In partnership with my stakeholder director, we used the voice of our customers to build a pitch for the executive team: approve a $2 million accelerated replacement timeline for all of our global conference room hardware so that we could switch to Teams Meetings over a year early. Our proposal was accepted — and thank goodness, because we switched to Teams painlessly two weeks before the unexpected closure due to the pandemic. That preparedness is a huge win for the power of observational research.
Right technique, right time
Our colleagues see us as expert interviewers, but in truth we use a much broader variety of techniques. Interviews will always be a core part of the work, since we have such ready access to our peer customers and because of the helpful spirit of our workplace.
- Artifact review: I spend a ridiculous amount of time searching for documents on SharePoint, reviewing docs sent to me as examples, reviewing past research, etc. This can also include review of existing support tickets, tech learning curricula, video meetings, and more.
- Interviews: The cornerstone of our research work. I prefer to limit the initial interview to 30 minutes; if there’s opportunity to follow up it’s likely a different technique would yield additional insight.
- Contextual inquiries: Observation of existing work, either 1:1 or small groups (meetings). I’ve taken to calling 1:1 inquiries “job shadows” as that’s a much more accessible term.
- Diary studies: These are typically informal — I’ll spin up a survey form with a few fields so folks can self-report on an event or issue. The most helpful diary study I ran focused on meeting issues — a few dozen reports allowed me to schedule follow-up research and to reach a broader set of folks.
- In situ surveys: Inspired by the now-inappropriate airport bathroom touch surveys, we launched a series of in-room kiosks to get real-time reports about silent meeting failures — the kind that never result in an IT ticket but lead to mounting frustration.
- Participatory design: A key element of our journey mapping work is, of course, collaborative contributions or review of early sketches with key subject matter experts.
- Other quantitative evaluations: We use surveys extremely sparingly. Where we can, we leverage existing surveys that the companies or project teams are launching. It’s easier to get telemetry and usage data, though, since we are part of the IT organization.
Design beyond visuals
This is a lot of content about research, but we’re still doing design. My peers spend a great deal of time on graphical user interface design, but my team has an interesting challenge. Our foundation’s productivity and collaboration tools are often off-the-shelf, like the aforementioned Microsoft Teams — or built on top of an off-the-shelf framework like SharePoint. We can’t change products we don’t control; we can only attempt to smooth the edges.
We still work on traditional designs, and a fair number of them. We’ve designed interactive room control systems for every conference room console at the foundation; my team member Naomi created a new demand management system; and my team member Paulé and I are designing an entirely new knowledge management solution for the foundation.
But a great deal of our work also includes service design. When the Teams Meetings acceleration was approved, I immediately took on the challenge of designing our approach to a rollout that would address the customer needs we’d identified and allow employees to voluntarily adopt the new software on their own time. That involved content design, change management, evaluation of the room installations, instructional design, and more. We can’t change the Teams UI, so I needed to run usability tests to ascertain where we’d run into difficulties and design accommodations for those problems. In the middle of our double diamond, one of the key success metrics we set was voluntary adoption: we wanted to create an environment in which at least 50% of the foundation felt confident enough to switch to Teams prior to our official launch. In this case, we far exceeded our target — 80% of meetings at the foundation were Teams meetings by the time the switch was made official.
Making the work tangible
There’s nothing particularly groundbreaking about our deliverables, but it helps to talk about the specifics a bit. My team does focus on narrative content, as we are a memo-and-presentation-heavy culture.
- Project brief: 1–2 page summary of background, problem statement, timeline, proposed techniques, and deliverables. Used to generate consensus and shared understanding amongst stakeholders.
- Executive summary deck: covers techniques, top findings, and insights or recommendations in less than 30 slides. Appendix can contain details and verbatims.
- Visual artifacts: can include posters, reference cards, flow diagrams, storyboards, blueprints, or full-scale journey maps. Journey maps are of particular interest, and my early work has led to even larger-scale engagements to chart our entire end-to-end strategy process.
- Reports: For large-scale discovery work, one or more narrative reports are often generated. These can be long-form, 25+ page reports or focused 1–2 page reports targeted to a very specific audience.
- Case study: For large studies with multiple deliverables, we collate our deliverables onto a project page for later reference — or write up a 2-page narrative case study of the work.
Messaging and communication
Close relationships with key service owners and leaders help us understand what parts of our work will be most valuable, and when. In some cases, that understanding has also helped us message preliminary findings early enough to increase our impact via strategic timing.
I was surprised at how sharing findings with participants became a vector for healing and awareness: folks said they “felt really heard”, and it improved the perception of our IT services as a whole. Furthermore, folks forwarded the findings far more than we expected — a bit too far at first, ironically, but it led to future requests for our work. For future studies that go outside IT, I hope we can continue this tradition of sharing responsibly with our participants.
As designers, we often document our work for our portfolios. Why don’t we do that internally as well? I’ve led the charge for our broader Human-Centered Design team as we begin to create Project Recap pages, which include a narrative summary of a project’s goals, findings, outcomes, and artifacts. These were originally intended as reference for HCD/UX peers, but they’re also incredibly useful for onboarding other IT team peers, and I’ve found viewer numbers of these pages on SharePoint are surprisingly high.
Not reinventing the wheel — just driving faster
If you’re an experienced user experience researcher, I hope this all sounds familiar. It’s not groundbreaking to perform any of these techniques. What’s refreshing, for me, is the extended use of qualitative research to inform ongoing projects.
It’s like falling through the looking glass to work as an in-house generalist: I get to study employee behavior, make recommendations — and then flip to the other side and provide design leadership for the projects that spin out of our broader discovery work. It’s immensely satisfying. But it’s a ton of work, and sometimes the number of opportunities and places where I feel I should be directing my attention is overwhelming. At old jobs, forces kept us in “our lane” — but here, if I chase a thread, I feel confident we can have an impact.
And the greater my team’s success, the higher the demand for our services. We would never be able to scale to help every team with existing headcount caps. Instead, the foundation has invested in LUMA practitioner training both for our UX/HCD principals and for selected IT team members from other disciplines. I’ve also put together interview toolkits and taught interviewing workshops for folks running their own employee outreach. We’ll still often chime in on analysis and synthesis, but we’re seeing good early results when we put in the time to train our partners to level up their skills.
Do you have questions about this work? Share them below! Until then, may you edge ever closer to that seat at the table — even if the table has now gone virtual.
Cheryl Platz joined the Bill & Melinda Gates Foundation as a full-time Principal UX Designer in late 2018 and is proud to contribute her UX expertise to their work. She does not speak on behalf of her employer; all opinions here are her own.