Aug - Dec 2024
Senior Product Designer
Ozette’s AI-driven platform started as a closed system, granting exclusive data-modification rights to internal data scientists. Because Ozette’s analysis algorithms are proprietary, partners were given only limited visibility into them, preserving the underlying IP.
Hence, external partners could only view snapshots of their clinical data during scheduled meetings. As partnerships multiplied, one-sided collaboration and inflexible workflows led to major scalability issues.
We believed that giving partners a timely, guided way to inspect and comment on their data would alleviate resource strain and bolster trust. By carefully controlling what partners could see and how they could provide input, we aimed to improve collaboration while safeguarding Ozette’s proprietary methods.
To shape our approach, I conducted deep-dive sessions with four data scientists managing separate clinical partner projects.
The objective was to dig deep into their day-to-day frustrations, explore inefficiencies, and identify the key obstacles preventing smoother collaboration.
User research made one thing clear: partners wanted more involvement in the data review process, but full autonomy wasn’t an option.
The challenge wasn’t just about faster feedback loops—it was about designing a system where partners could collaborate with data scientists without introducing inefficiencies or compromising the integrity of the analysis. Initially, we framed our ideation around a structured feedback model: how might we create a way for partners to indicate what needed to change without disrupting existing workflows?
This led to our first proposed solution: Flagging.
Flagging was designed as a low-friction way for partners to request changes without needing full data access. Instead of waiting for scheduled calls to provide feedback, partners could highlight specific data points, add context, and provide revision requests directly within the platform.
This introduced asynchronous collaboration, allowing scientists to address flagged issues on their own timeline.
After presenting the first iteration to data scientists and partners, these were our key findings:
Partners still lacked autonomy
They could request changes, but they still relied on data scientists to execute them. This created a bottleneck rather than solving it.
Review cycles were still slow
Even though Flagging introduced asynchronous feedback, partners still had to wait for scientists to review and apply their requests.
Workload shifted, not reduced
Instead of improving efficiency, Flagging created long task queues for data scientists, meaning feedback piled up rather than being addressed in real time.
Recognizing that Flagging wasn’t enough, we reframed our strategy around active collaboration instead of structured feedback. Instead of relying on data scientists to interpret and execute changes, we asked:
"How might we give partners more direct control while maintaining trust, security, and data integrity?"
This shift led to Edit Mode, a more robust approach that enabled partners to make real-time modifications within guardrails, reducing delays and ensuring smoother collaboration.
Over the next few weeks, I explored and tested a wide range of flows with the data scientists on the team. I developed user-flow diagrams for both scientists and partners, detailing how they would navigate from data inspection to flag resolution.
This ensured the interface naturally guided them through the review, flag, revise cycle without requiring heavy training.
#1
How can we give partners a simple way to switch between “view only” and “edit” states for their data plots—without overloading them with unnecessary functionality or revealing Ozette’s proprietary algorithms?
#2
How do we handle simultaneous editing on complex immunological data plots so that two different users can make real-time changes without overwriting each other’s work—or requiring endless version management?
#3
How do we empower partners to finalize and download results after making edits—without always needing a data scientist to confirm every step?
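A common pattern for the simultaneous-editing question (#2) is optimistic concurrency control: each edit carries the version of the plot it was based on, and the server rejects stale writes instead of silently overwriting another user’s work. A minimal sketch of the idea, with all names hypothetical and no claim about Ozette’s actual implementation:

```python
class StaleEditError(Exception):
    """Raised when an edit was based on an outdated version of the plot."""

class GatePlot:
    """Toy model of a shared, editable data plot (illustrative only)."""

    def __init__(self):
        self.version = 0
        self.gates = {}

    def apply_edit(self, base_version, gate_id, new_bounds):
        # Reject the edit if someone else changed the plot since this
        # editor last loaded it, rather than overwriting their work.
        if base_version != self.version:
            raise StaleEditError(
                f"plot is at v{self.version}, edit was based on v{base_version}"
            )
        self.gates[gate_id] = new_bounds
        self.version += 1
        return self.version

plot = GatePlot()
plot.apply_edit(0, "CD4+", (0.2, 0.8))     # first editor succeeds
try:
    plot.apply_edit(0, "CD4+", (0.1, 0.9)) # second editor used stale v0
except StaleEditError:
    pass  # client reloads the latest state and re-applies the edit
```

The rejected editor simply refreshes and retries, which avoids both silent overwrites and heavyweight version management.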
We launched the "Edit Mode" feature under controlled feature flags and rolled it out gradually to gather customer feedback and refine the user experience.
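Gradual rollouts like this are often implemented by deterministically bucketing accounts against a rollout percentage, so the same account always gets the same answer as the percentage grows. A minimal sketch, with the function name and bucketing logic purely illustrative (not Ozette’s actual flagging system):

```python
import hashlib

def edit_mode_enabled(account_id: str, rollout_pct: int) -> bool:
    """Return True if this account falls inside the rollout percentage.

    Hashing the account ID gives a stable bucket in [0, 100), so raising
    rollout_pct only ever adds accounts, never flips existing ones off.
    """
    digest = hashlib.sha256(account_id.encode()).hexdigest()
    bucket = int(digest, 16) % 100
    return bucket < rollout_pct

# At 0% no account sees Edit Mode; at 100% every account does.
assert not edit_mode_enabled("partner-42", 0)
assert edit_mode_enabled("partner-42", 100)
```

Because the bucketing is deterministic, feedback gathered from early-access accounts stays consistent across the whole rollout window.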
Workflow integration
Enabling partners to edit final gates gave the data science team clearer insight into approved gate states. These approved states serve as valuable training data for algorithm improvements and deepen our understanding of customer needs.
Reduced communication overhead
By empowering partners with direct editing capabilities, we observed a reduction in communication between Ozette's data science leads and customers regarding gate approvals.