Thrown into the deep end
I joined Styra as part of a two-person design team, hoping to learn more about design from my manager and grow the team alongside her.
Unfortunately, there were layoffs within months.
I then became the only designer at the company and was added to all the existing projects, one of them being the Policy Builder.
But first, what is Styra?
Styra is used primarily for authorization management. Basically, it builds tools that help companies control who can do what inside their complex tech environments.
Let's imagine how Styra would be useful at Netflix.
Microservices
Software deployments
User access rules
Duolingo for Rego?
Styra’s product is called DAS (Declarative Authorization Service), which lets companies write clear, flexible rules (aka policies) in Rego, a declarative language similar to SQL.
But learning a new language is hard. There’s no Duolingo for it.
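To give a sense of the learning curve, here is a minimal, illustrative Rego rule. This is a sketch written for this case study, not an actual Styra or customer policy; the package name and input fields are hypothetical. It denies requests by default and only allows engineers to read deployment configuration.

    package deployments.authz

    # Deny everything unless a rule below explicitly allows it.
    default allow = false

    # Allow engineers to read deployment configuration.
    allow {
        input.method == "GET"
        input.user.team == "engineering"
        input.resource.type == "deployment-config"
    }

Even a toy rule like this assumes you know Rego's package, rule, and input conventions, which is exactly the barrier the Policy Builder set out to lower.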
Policy Builder
The Policy Builder was a new product feature aimed at providing a low-code approach to policy authoring for occasional policy writers or those with a less technical background.
Jumping in mid-stream
This project was an interesting challenge because the feature was already in development and most design decisions had already been made. I had to quickly get up to speed with where the project stood.
The team brought me into the project because they wanted design eyes on it as a final polish before releasing it to our customers.
Demonstrating the value of UX
My goal was to demonstrate the value of design.
So apart from conducting a heuristic audit of the product to flag potential friction points, I also proposed testing the feature internally with folks across the company. There was just one problem.
My hypothesis on the lack of user participation
Historically, the product team had a difficult time getting users - internal and external - to test out the feature. Why was it so hard?
After speaking to a few coworkers, I came up with a hypothesis centered on how we collected feedback.
With previous releases, we provided a link to the testing environment and asked folks to drop feedback in a Slack channel. There were two key issues with this approach: without any structure, usability issues went unacknowledged and assumptions went unchallenged.
Internal usability feedback
I came up with a tutorial-like study that guided users through a series of tasks, with open-ended questions targeting the friction points flagged in the earlier audit.
Increasing user engagement by 800%
We received responses from 18 colleagues, which was a huge uptick from the 1-2 responses the product team usually received. It was a successful format that allowed everyone at the company to familiarize themselves with the feature.
Additionally, because of the structured format, we were able to systematically collect insights and analyze them quantitatively.
So how did I convince the team that we are not our users?
This comment by Jeff shows that the research study shifted the team's initial beliefs and revealed hidden biases.
"I am very confused by this finding, I thought it was very intuitive and a delighter"
Next steps
The structured usability testing - even with internal users - uncovered confusing product behaviors, which we addressed before releasing the feature to production. It also sparked a constructive dialogue around the product experience by exposing deeper, more systemic problems, which we marked for future consideration.
The team really liked the internal usability study format: Sales wanted to use it with their customers, and the product team wanted to run similar studies for future releases.