
Building a Research Culture: From Emergent to Integrated
Organizational Development Case Study | EPAM Systems / Finance Client

The Challenge

A large financial services company had embedded UX researchers into multiple product teams — but research as a discipline didn't exist. In practice, "user research" meant engineers crowding into a Zoom call and firing unstructured questions at a single subject matter expert. There was no methodology, no recruitment strategy, no consistency, and no framework for turning what was gathered into anything actionable. Product features were being designed and shipped based on these ad hoc sessions, with no actual users in the room. Research existed in name only. The organization didn't yet know what it was missing.

My Role

As the embedded UX researcher, I recognized early that the core problem wasn't a research problem. It was an organizational one. The team lacked the knowledge, structures, and culture to make research a meaningful part of how they worked. I took on the role of change agent, working to shift mindsets, build shared processes, and create the conditions for research to become sustainable.

 

What I Did

Rather than waiting to be assigned work, I took initiative early by offering the product team a presentation on how research could integrate into their existing cadence, where it would have the most impact, what methods I would use, and what they could expect from working with me. I walked them through a concrete vision: testing each major new design with real users, collaborating closely with the designer to ensure prototypes were built to capture what mattered most, and using the System Usability Scale to generate consistent, comparable data across studies. The team responded immediately. The product owner began deferring to me on research decisions, and shortly after assigned me to oversee the work of the third-party designer.
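The System Usability Scale mentioned above follows a fixed scoring rule, which is what makes its scores comparable across studies: each of the ten 1-5 Likert responses is adjusted (odd-numbered items contribute score - 1, even-numbered items contribute 5 - score), and the adjusted sum is multiplied by 2.5 to give a 0-100 score. As a minimal illustration of that standard calculation (not the team's actual tooling):

```python
def sus_score(responses):
    """Compute a System Usability Scale score (0-100) from ten
    Likert responses (each 1-5). Odd-numbered items are positively
    worded (score - 1); even-numbered items are negatively worded
    (5 - score). The adjusted sum is scaled by 2.5."""
    if len(responses) != 10 or any(not 1 <= r <= 5 for r in responses):
        raise ValueError("SUS requires ten responses in the range 1-5")
    adjusted = [
        (r - 1) if i % 2 == 0 else (5 - r)  # items 1, 3, ... sit at even indices
        for i, r in enumerate(responses)
    ]
    return sum(adjusted) * 2.5

# Example: a fairly positive response pattern
print(sus_score([4, 2, 5, 1, 4, 2, 5, 1, 4, 2]))  # 85.0
```

Because every participant answers the same ten items on the same scale, scores from different studies and different prototypes can be placed on a single 0-100 axis.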

From there I reviewed the designer's work before anything went in front of users. I checked for spelling errors, tested buttons and action steps, and ensured each prototype was structured around usable, testable scenarios. By the time users saw a prototype, it was research-ready, and findings were clean, actionable, and tied to real design decisions.

One of the most telling moments came early on. When the product team sat in on user sessions, they would jump in with unplanned questions, a habit that fragmented the research and introduced bias. I gently but firmly redirected this behavior, asking team members to share their questions with me in advance so I could integrate them into the study protocol. Every user would then receive the same questions, in the same order, in the same context. It was a small intervention with a significant organizational message: research has a methodology, and that methodology produces more reliable insights than spontaneous curiosity.

As sessions continued, something shifted. The product owner began to be genuinely surprised by what users revealed: responses to features he hadn't anticipated, needs he hadn't considered. His team watched this happen in real time, session after session. Research stopped being something that happened alongside product development and started being something the team relied on. That shift, from skepticism to dependence on evidence, is the marker of a culture change, not just a process change.

Results

  • Team UX research maturity advanced from Emergent (Stage 3) to Integrated (Stage 5) on the Nielsen Norman Group UX Maturity Model

  • Improved prototype quality and fewer revision cycles significantly reduced session cancellations due to prototype error, accelerating the research timeline and reducing wasted recruitment and scheduling costs

  • Research contributed to the identification and shipping of 15 new product features across two product teams in the first six months

  • Team members, including the business manager, began consulting me proactively rather than waiting to be assigned research

  • I was invited to EPAM leadership meetings to present research findings and recommendations, a seat at the table that reflected the organizational credibility research had earned

  • I built a close working alliance with the business manager, who used research findings as evidence that the new system was not serving its users: employees with 25 years of experience in a legacy platform who were now being asked to do the same tasks in twice as many clicks

  • I formally recommended to leadership that the Pega platform be reconsidered based on user data. The investment had gone too far to reverse, but the recommendation was heard, documented, and taken seriously

What This Demonstrates

At the center of this project were employees who had done their jobs the same way for 25 years and were now being asked to adapt to a system that made everything harder. My role was to make sure that experience was visible — to the product team, to the business manager, and eventually to leadership. The research gave everyone in the room a shared language for what wasn't working, and a reason to take it seriously.

Designing a Shared Research Practice: Alignment Across Teams
Organizational Development Case Study | EPAM Systems / Finance Client

The Challenge

A team of UX researchers was operating without shared standards, and it was showing. One researcher ran a 17-week study that consumed significant resources and produced inconclusive results. Another completed studies in two weeks, said yes to every request, and was stretched too thin to produce thorough work. Management didn't know what to expect from the team. Product owners couldn't plan around research. And researchers had no shared framework for holding boundaries or communicating realistic timelines.

The problem wasn't effort or skill, but the absence of a common operating model.

My Role

Management asked the research team to align on methods, timelines, and what product teams could realistically expect from us. I took the lead in facilitating that process, bringing the team together to build consensus around a sustainable, consistent approach to research delivery.

What I Did

I convened a series of peer working sessions with the research team, and at times a manager, to work through the core questions together: How many users did we actually need per study? How long did each research phase realistically take? What was a sustainable cadence that wouldn't burn anyone out? These weren't easy conversations, as researchers had been working in very different ways and some of those patterns were deeply ingrained. My job was to facilitate honest peer discussion about best practices without making anyone feel judged for how they had been working.

Out of those sessions I proposed a middle ground: a consistent three-week research cadence, six to eight users per study, with clear phase-by-phase timelines that accounted for recruitment, study design, facilitation, synthesis, and reporting. To make this tangible I built a detailed Mural board that mapped out exactly how long each step took, and how discovery research compared to usability studies in scope and duration. It gave everyone a concrete reference point that a conversation alone couldn't provide.

Results

  • Management adjusted their expectations based on the timeline, replacing vague productivity pressure with a realistic understanding of what quality research actually requires

  • Product owners gained a clear picture of what to expect from researchers and when, allowing them to plan research into their product cycles rather than around them

  • Fellow researchers adopted the timeline framework and used it to hold boundaries with their own product teams, saying no more confidently and managing workload more sustainably

  • A consistent three-week research cadence was established across 8 product teams, replacing a wide spectrum of ad hoc approaches with a shared standard

 

What This Demonstrates

Research teams, like any team, need shared structures to function well. What I built here wasn't just a timeline but a common language for how we worked, what we needed, and what we could commit to. Facilitating that kind of alignment across peers, in a room where people had different habits and different pressures, required as much listening as it did designing. The Mural board made the argument visual; the conversations made it stick.

Building Organizational Capacity in a Humanitarian Crisis Context
Organizational Development Case Study | Mercy Corps / Turkey

The Challenge

Mercy Corps Turkey had scaled up rapidly following a large influx of US government funding, operating a $200 million bread program that fed half of Aleppo by purchasing flour from Turkish vendors, trucking it across the border, and distributing fresh bread through Syrian bakeries to residents and refugees. The Gaziantep office had grown quickly to meet the demand, and the staff reflected that urgency: roughly half Turkish, half Syrian, with expat managers from the US, UK, France, and the Arab world. Turkish and Syrian staff did not speak each other's languages. Management communicated in English, which many staff did not speak fluently. Racial tension was present, particularly among Turkish staff who were new to humanitarian work and brought strong cultural assumptions about gender roles and professional hierarchy. Syrian staff, by contrast, were highly experienced humanitarian professionals but were navigating a new country and a new organizational culture.

The Australian head of HR had been brought in to manage the chaos, was overwhelmed, spoke no languages other than English, and was dealing with an active sexual harassment allegation when I arrived. He was open to any ideas I had.

My Role

I was brought in as a Research Coordinator and Program Assistant, but the work quickly became organizational. With fluency in Turkish and working knowledge of Arabic, I became a cultural bridge across the office, a facilitator of difficult conversations, and ultimately the designer of two major organizational programs: a Gender Equity and Code of Conduct training for 300 staff, and a multilingual language program for 200 employees.

What I Did

My first task was to meet individually with each party involved in the sexual harassment allegation, listening carefully and communicating in Turkish. What emerged was a combination of genuine gender bias and significant misunderstanding. Through careful facilitation I helped both individuals work through the situation, and both were ultimately promoted and able to continue working together effectively.

From there I turned to the Gender Equity and Code of Conduct training that Mercy Corps HQ in Portland had developed and asked me to adapt and deliver to 300 staff. The material assumed a level of Western liberal cultural fluency that didn't exist in this context, where strict gender norms were deeply embedded and concepts of power and sexual exploitation were framed through explicit UN examples that weren't appropriate for this professional setting or audience. I translated the material into Turkish, adapted it to the local cultural context as best I could with the resources available, and made real-time judgment calls about what to use, what to modify, and what to set aside entirely.

The first delivery was in English with interpreters scattered through the room to assist individual tables. It became clear that the material wasn't landing, and management responded by commissioning a second delivery in Turkish. That feedback loop, recognizing a gap and adapting, was itself an organizational learning moment.

One of the most memorable sessions was with the organization's drivers and chauffeurs. When asked to challenge the stereotype that women can't drive, the room unanimously agreed that women can't drive. Rather than shutting the conversation down, I worked through it with them: do some women drive? Yes. Was there a time when you didn't know how to drive? What did it take to learn? By the end of the session the conversation had shifted enough that the organization made the decision to train some of the office cleaners, who were women, to become drivers, a small but concrete organizational change that came directly from that work.

To address the deeper communication gap across the office I proposed hiring language teachers to offer Turkish, Arabic, and English classes to staff, creating both a practical tool for cross-cultural communication and a signal that the organization valued the effort of learning each other's languages. I posted hiring ads on local platforms in Turkish and English, leveraged existing interpreter relationships, and designed an audition process in which each applicant delivered a fifteen-minute guest lesson before being hired. The program served 200 staff members and helped shift the culture of the office from one of frustrated miscommunication to one where learning across difference was actively supported and rewarded.


Building Capacity and Culture at a Living History Museum
Organizational Development Case Study | Old Aurora Colony Museum


The Challenge

The Old Aurora Colony Museum ran a daily living history education program for hundreds of fourth and fifth graders, teaching hands-on crafts like candle making, bread baking, and woodworking. The program had real potential, but the organizational conditions were making it hard to deliver consistently. The team was navigating unclear expectations and communication gaps that were affecting morale, supplies were running out mid-program because there was no central tracking system, and the information needed to maintain inventory wasn't being shared completely.

 

My Role

I came in as Education Program Coordinator, responsible for daily programming delivery, team management, and operational oversight. In practice the work quickly became as much about the people and systems as it was about the program itself.

What I Did

My approach with the teaching staff was quiet and direct. Rather than formal feedback sessions I found moments to pull people aside one-on-one during the day, offering practical, specific coaching in the moment: suggesting shorter wicks to make candles wider, or encouraging a gentler tone with the kids. These small interventions were low-stakes for the individual but cumulatively shifted how the team showed up, building confidence and improving the quality of what students experienced.

At the end of one particularly difficult session I facilitated a debrief conversation that helped surface some of the communication dynamics the team had been navigating. The conversation was reflective and constructive, and led to a meaningful shift in how the team worked together going forward.

On the operational side I identified that supplies were running out because there was no central tracking system and because inventory information wasn't being shared completely across the team. I built a shared supply tracking document that both the program manager and I could update, making the inventory visible and creating a reliable handoff process. It gave us a shared reference point and reduced the friction that had been building around supply shortages.

Results

  • Volunteers and staff reported feeling more supported and confident in their teaching roles following one-on-one coaching

  • A facilitated debrief conversation helped surface communication dynamics that had been affecting the team, leading to a constructive shift in how people worked together

  • A supply tracking system was built from scratch, reducing mid-program shortages and creating a reliable operational handoff process

  • Daily programming was delivered consistently to groups of 60-70 students, maintaining quality and engagement throughout

What This Demonstrates

Good organizational development work isn't always large-scale or formal. Sometimes it's a quiet word with a nervous volunteer, an honest conversation that surfaces what a team has been navigating, or a simple spreadsheet that makes everyone's job easier. At Aurora Colony I worked at every level of the organization simultaneously, coaching individuals, managing up, and building the operational infrastructure the program needed to run well. The tools were modest, but the impact on the people and the program was real.
