Video Rating Design
Design By Accretion
GolfPass is rapidly emerging as a leading streaming platform for exclusive golf entertainment content. Since its launch, the platform has experienced exponential growth, with the total number of minutes streamed increasing more than 100-fold. To date, users have consumed nearly 100 million minutes, equivalent to over 69,000 days of continuous viewing, across all platforms featuring GolfPass programming.
With such impressive growth, a critical need has surfaced: to maintain and elevate user engagement levels. As the platform's content library expands and attracts more viewers, it becomes essential to keep users engaged with the content to sustain momentum and unlock further business growth opportunities. This means not only delivering high-quality, compelling content but also creating interactive features that foster a more personalized and immersive viewing experience.
The Challenge
Enhancing Engagement Through Content Rating
Before this project, GolfPass users couldn’t rate content, which limited engagement and personalized recommendations, affecting the user experience. Our goal was to create a foundational feature to support future platform improvements.
To validate its potential, we conducted a proof-of-concept survey and two additional surveys, including a Kano analysis, to explore user motivations around rating content. The research was crucial for estimating the project’s ROI and setting benchmarks to measure its impact post-launch.
My Role
I led the research for this project; my role included designing the methodology and ensuring all objectives were met. I worked with another UX Researcher, a Sr. Product Director, and a Software Engineer to integrate the embedded survey into the platform.
The Kick-Off
Proving A Concept
The goal of this research was simple: figure out if users would actually use a video rating system before we invested in design and production.
To test it, I kicked off the project with a survey that simulated a real rating system on the platform. This let us see how many users who watched or visited a video actually interacted with the ratings. Giving users a chance to engage with the feature gave us clear insights into how it would perform in the real world, and provided a visual reference to shape the design of the official rating system mockups.
Study Details
Tool: Qualtrics
Duration: The embedded survey was live for 2 weeks
Study Type: Embedded feedback survey
Building an Embedded Survey
I worked with another researcher and a software engineer to embed a Qualtrics survey into the platform, requiring coding expertise and HTML access. I secured the Qualtrics URL and designed the rating options to match GolfPass’ branding.
We also needed a question to go along with the rating options. After a brainstorming session considering the platform’s tone and user audience, I presented several options to stakeholders. The chosen question, “What did you think of this video?” was seen as both engaging and aligned with the stakeholders’ goals.
Embedded Survey Layout
Original design without the embedded survey
Preview of how the embedded survey would look
Although Qualtrics offered limited visual customization, the rating options were functional enough to gather the data we needed. Working within those design constraints, our primary objective was to keep the rating options clear, easy to use, and capable of capturing valuable user feedback.
WHERE RESEARCH MEETS SOFTWARE ENGINEERING
Once the visual previews were approved, our focus shifted to editing the webpage's HTML to integrate the embedded survey seamlessly.
I reconvened with the software engineer to strategize the execution phase of this project. Together, we developed a detailed plan for embedding the survey into the platform. This involved not only ensuring the survey was correctly implemented but also rigorously testing its functionality to confirm that it would operate smoothly within the existing webpage structure. The testing phase was crucial to identify and address any potential issues before the survey went live.
After dedicating several days to testing and fine-tuning the embedded survey, we reached a stage where the rating options successfully "mimicked" a real design. This process involved meticulous adjustments to ensure the rating system not only functioned properly but also aligned visually with the platform's existing user interface.
With the survey perfected, we embedded it across all video pages on the platform. This strategic placement allowed us to gather data not just on user interaction with the rating system but also on which specific video content or pages were garnering the most engagement. This additional layer of data collection provided insights into user preferences and content popularity.
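Because we needed to know which video each rating came from, each page's embed had to tag responses with its own context. Qualtrics can read query-string parameters appended to a survey URL into embedded data fields, which is one common way to do this. The sketch below is illustrative only: the survey URL and the parameter names (`videoId`, `pageUrl`) are hypothetical, not the actual values used in the project.

```python
from urllib.parse import urlencode

# Hypothetical survey URL; the real one is not disclosed in this case study.
SURVEY_URL = "https://example.qualtrics.com/jfe/form/SV_XXXXXXXX"

def survey_embed(video_id: str, page_url: str) -> str:
    """Return an <iframe> tag that loads the survey and tags the
    response with the video/page it came from (Qualtrics can map
    query-string parameters into embedded data fields)."""
    query = urlencode({"videoId": video_id, "pageUrl": page_url})
    return (
        f'<iframe src="{SURVEY_URL}?{query}" '
        'title="What did you think of this video?" '
        'width="100%" height="120" frameborder="0"></iframe>'
    )

print(survey_embed("abc123", "https://www.golfpass.com/watch/abc123"))
```

Generating the snippet per page means the same survey instrument can be reused across every video while still attributing each click to a specific piece of content.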
Once everything was in place, the survey was launched and remained active for two weeks, continuously capturing user interactions and providing us with critical data for the next steps in the project.
The Results
Users Love Sharing Their FEEDBACK
After two weeks of continuous data collection, it was time to analyze the results.
Our initial assumption was that an interaction rate of 5% or higher with the rating options would indicate sufficient user engagement. This level of engagement would provide the evidence needed to justify moving forward with the design and implementation of a full-fledged rating system on the platform.
The data confirmed our hypothesis. The interaction rate met our threshold exactly, with 5% of users engaging with the rating options.
Number of Total Impressions (how many users saw the rating options): 21,596
Number of Total Clicks (users who clicked to rate a video): 1,080
Thumbs-Up vs Thumbs-Down Comparison: 95% thumbs-up vs 5% thumbs-down
This result not only validated our belief that users were engaged enough to provide the necessary data, paving the way for the team to proceed confidently with the next stages of the project; it also revealed an additional insight about user behavior. Users were far more inclined to provide positive feedback than negative, with thumbs-ups making up 95% of all clicks. The rating system effectively captured user interactions while also highlighting a clear tendency toward favorable opinions.
Uncovering Users’ Motivations & Preferences For Rating
Now that our initial research study had proven that users were engaged enough to interact with a rating system, the next challenge was designing a system that aligns with both user needs and business objectives while maintaining aesthetic coherence.
To address this, I designed a survey aimed at uncovering key insights into user motivations and preferences. The survey focused on several crucial areas: what motivates the GolfPass users to rate a video, what they expect to happen after they rate, and their preferences regarding how to share their opinions.
Study Details
Tool: UserZoom
Quota: 100 participants (Golfers/GolfPass Users)
Study Type: Unmoderated Qualitative Survey
survey results
This survey focused on user behavior across streaming platforms in general, aiming to gain insights at a broader scale rather than being limited to golf-related content.
The findings were highly informative. Participants reported that they consume content on streaming platforms almost daily. They also indicated that they actively rate the content they watch, frequently check ratings before selecting a video, and, notably, they rate videos primarily to ensure that those videos are automatically saved as favorites for easy rewatching.
How often do you watch content in a video streaming platform?
Moreover, participants identified their top three motivations for rating content: helping other viewers with similar tastes, sharing their opinions, and receiving personalized content recommendations based on their ratings.
How often do you…
What most often motivates you to rate content? (select up to 3)
The results suggest that content rating is not just about expressing an opinion; it is also about assisting others and, most importantly, managing expectations. Users expect that by rating content, the platform will generate a curated list of recommendations tailored to their viewing behavior and interests. This underscores the importance of content ratings as a tool for enhancing the user experience and personalizing content delivery.
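To make the rating-to-recommendation expectation concrete, here is a deliberately naive neighborhood-model sketch: score each unseen video by how many "taste neighbors" (users whose thumbs-ups overlap with yours) liked it. All names and data are invented for illustration; the case study does not describe GolfPass's actual recommendation algorithm.

```python
def recommend(user: str, likes: dict[str, set[str]], top_n: int = 3) -> list[str]:
    """Rank videos the user hasn't liked yet, weighted by how many
    thumbs-ups they share with each other user (a toy neighborhood model)."""
    scores: dict[str, int] = {}
    for other, vids in likes.items():
        if other == user:
            continue
        overlap = len(likes[user] & vids)   # shared thumbs-ups = similarity
        if overlap == 0:
            continue
        for video in vids - likes[user]:    # only videos the user hasn't rated up
            scores[video] = scores.get(video, 0) + overlap
    # highest score first; ties broken alphabetically for determinism
    return sorted(scores, key=lambda v: (-scores[v], v))[:top_n]

# Invented example data: user -> set of thumbs-upped video ids.
likes = {
    "ana":  {"v1", "v2"},
    "ben":  {"v1", "v3"},
    "cora": {"v2", "v4"},
}
print(recommend("ana", likes))  # ['v3', 'v4']
```

Even this toy version shows why the growing ratings database matters: with no ratings there are no overlaps, and the recommendation list is empty.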
Customer-Focused Research: Kano Analysis
Having established that users are actively engaged in rating content, we now have the foundation to build a substantial database that can effectively recommend similar content to them. The next step is to assess how crucial this feature is to users and determine the priority it should be given to enhance overall user satisfaction.
To tackle this challenge, I employed a Kano analysis to gauge customer emotional responses to the “suggested content” feature. This approach allowed us to measure and explore how this feature resonates with users and its potential impact on their experience.
KANO Analysis findings
The results of the Kano survey revealed that the “suggested content” feature is highly attractive to participants. In Kano analysis, attractive features are associated with the Excitement Attribute line, indicating that they have the potential to delight users and elevate our platform above the competition.
Attractive features are those that bring satisfaction and excitement when present, but their absence doesn’t cause dissatisfaction since users don’t initially expect them to be part of the product. This insight underscored the value of the “suggested content” feature as a differentiator that can significantly enhance user engagement and satisfaction, setting our platform apart in a competitive market.
If your favorite streaming platform had a “recommended for you” section that recommends content that is tailored to you based on your ratings, how would you feel?
Study Details
Tool: UserZoom
Quota: 100 participants (golfers)
Study Type: Unmoderated Kano Survey
If your favorite streaming platform did NOT have a “recommended for you” section that recommends content that is tailored to you based on your ratings, how would you feel?
How IMPORTANT is it for you that your favorite streaming platform has a “recommended for you” section that recommends content that is tailored to you based on your ratings?
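For readers unfamiliar with the method: each respondent's answers to the functional ("had the feature") and dysfunctional ("did NOT have it") questions are crossed through the standard Kano evaluation table to yield a category, and the "Attractive" result reported above is the category that dominated across respondents. A sketch of that lookup, following the conventional table rather than any project-specific variant:

```python
# Standard Kano answer scale, from most to least favorable.
ANSWERS = ["like", "expect", "neutral", "live with", "dislike"]

def kano_category(functional: str, dysfunctional: str) -> str:
    """Classify one respondent via the conventional Kano evaluation table."""
    f = ANSWERS.index(functional)
    d = ANSWERS.index(dysfunctional)
    if (f, d) in ((0, 0), (4, 4)):
        return "Questionable"   # contradictory answers
    if f == 4 or d == 0:
        return "Reverse"        # respondent prefers NOT having the feature
    if f == 0 and d == 4:
        return "Performance"    # more is better, less is worse
    if f == 0:
        return "Attractive"     # delights when present, tolerated when absent
    if d == 4:
        return "Must-be"        # expected; absence causes dissatisfaction
    return "Indifferent"

# A respondent who would "like" having suggested content but is
# "neutral" about its absence lands in the Attractive category.
print(kano_category("like", "neutral"))   # Attractive
```

This is also why Attractive features behave as described above: they score "like" on the functional question while absence draws only a neutral or tolerant response, not dissatisfaction.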
FINAL DESIGNS
My role in this project concluded after I successfully delivered all the research findings, which informed the design phase. Once the designers began creating the mockups, my direct involvement ended. The final design, shaped by the research insights, was approved and developed in Q2 of 2024.
Since then, users have consistently engaged with the rating system, steadily growing our database. This growing dataset is crucial as it will enable us to offer personalized content recommendations. The increased user engagement on the GolfPass platform has driven a 61% rise in content consumption.
The recommendation feature is currently in the design and testing phase, marking the next step in enhancing our platform's user experience.
Rating system final design
Favorites page final design