YES! WEBSITE
Enhanced usability for better event promotion and engagement
For the School of Design grad show
Overview
As a member of the website committee for the annual grad showcase at George Brown College, I contributed to redesigning the YES! website to improve usability and promote the event more effectively.
As part of a multidisciplinary team, I collaborated with fellow designers with a range of skills. The project began with extensive user research and ended with a successful deployment. The final result made it easy for students, faculty, and industry professionals to access information about the graduating students and the event.
The redesigned website provided a streamlined user experience, driving better event visibility and engagement, and ultimately increasing visitor participation and satisfaction.
Role
UX/UI Design
UX Research
QA
Tools
Figma
FigJam
Notion
Team
4 Designers
1 Developer
Duration
3 months

THE PROBLEM
Despite its visual appeal, the previous year's website faced significant usability challenges, leading to confusion for users navigating event details and affecting their overall experience.
SOLUTION OVERVIEW
Website overhaul driven by a usability audit:
UI designed to increase user engagement
Simplified access to event information
Enhanced search & filtering system
Easier navigation from main to inner pages
PROCESS
DISCOVERY
Kicking off the project with primary research and effective teamwork.
Time was tight, so the website committee split into three subgroups, each using a different research method to maximize efficiency. The methods included user interviews, competitive analysis, and usability testing. I led the usability testing group.
“Who would be visiting our website, and what would be their needs and primary actions?”
Before conducting user interviews and usability testing, our first step was to define the core users of our website. This was crucial, as it allowed us to identify and engage individuals from these user groups for our primary research and insight analysis.
USER INTERVIEWS
Semi-structured 30-minute interviews with the key user groups.
At the initial stage of our research, the user interview team spoke with 5 participants from the key user groups to learn their opinions of the YES! website. We organized the insights into three main categories: Information & Design Clarity, Content, and Navigation & Usability.
Summarizing our findings into insights to better understand user needs.
Since participants gave fairly in-depth feedback, we summarized it to give ourselves a clear starting point. The summarized feedback helped us see the current problems from a higher level.
Enhancing Clarity & Navigation
Streamlining Filtering System
Maintaining Consistent Elements
USABILITY TESTING
Preparing for usability testing sessions, starting with task planning.
Having planned and facilitated usability tests before, I was familiar with the NN Group’s recommendations. We conducted remote moderated sessions with 5 participants, each completing 5 tasks. Participants were given a realistic scenario based on the persona they best matched, and each test was approximately 45 minutes long.
Task #1
Scrolling through the landing page
Task #2
Finding when/where the event takes place
Task #3
Finding work based on type and program
Task #4
Looking for student info page
Task #5
Finding work done by specific student
What does our user demographic look like?
During the pre-test interview, we asked our participants basic demographic questions, since they were a good representation of visitors to our website in general. Here's what we learned about our user demographic:
Average age of participants was 27.8 years old
2 out of 5 participants have attended YES! in person
Average tech proficiency of participants was 94%
Gathering quantitative insights through measurements based on ISO standards.
Through the testing sessions, we were able to measure the website's usability in terms of effectiveness, efficiency, and satisfaction. The numerical data we collected and analyzed would later be used to compare against and validate our future design choices.
Completion Rate
Tasks 1, 2, and 4 scored 100%
Tasks 3 and 5 each had critical errors, bringing them down to an 80% completion rate
Error Rate
Measurement N/A for Task 1, as it does not involve an end goal
Tasks 3 and 5 scored lowest in terms of effectiveness
Time on Task
Average time spent per task was 106.2 sec, which equals 1.77 min
Total time completing the tasks was 531 sec, which is 8.85 min
Satisfaction Rate
Overall satisfaction rate was 3.55 out of 5, or 71%
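As a rough illustration of how these ISO-style metrics come together, the sketch below recomputes completion rate, average time on task, and satisfaction from per-task results. The data shape and the sample numbers are placeholders for illustration, not our raw session logs.

```typescript
// Minimal sketch of the ISO 9241-11 style metrics we reported:
// effectiveness (completion rate), efficiency (time on task), and satisfaction.
// The sample data below is hypothetical, not actual session data.

interface TaskResult {
  task: number;          // task number (1–5)
  completed: boolean;    // did the participant reach the end goal?
  timeSec: number;       // time spent on the task, in seconds
  satisfaction: number;  // post-task rating on a 1–5 scale
}

const results: TaskResult[] = [
  { task: 1, completed: true,  timeSec: 80,  satisfaction: 4 },
  { task: 2, completed: true,  timeSec: 95,  satisfaction: 4 },
  { task: 3, completed: false, timeSec: 150, satisfaction: 3 },
  { task: 4, completed: true,  timeSec: 70,  satisfaction: 4 },
  { task: 5, completed: false, timeSec: 136, satisfaction: 3 },
];

// Effectiveness: share of tasks completed without critical errors.
const completionRate =
  results.filter(r => r.completed).length / results.length;

// Efficiency: average time spent per task.
const avgTimeOnTaskSec =
  results.reduce((sum, r) => sum + r.timeSec, 0) / results.length;

// Satisfaction: average rating, expressed as a fraction of the 5-point scale.
const avgSatisfaction =
  results.reduce((sum, r) => sum + r.satisfaction, 0) / results.length;

console.log(`Completion rate: ${(completionRate * 100).toFixed(0)}%`);
console.log(`Avg time on task: ${avgTimeOnTaskSec.toFixed(1)} sec (${(avgTimeOnTaskSec / 60).toFixed(2)} min)`);
console.log(`Satisfaction: ${avgSatisfaction.toFixed(2)} / 5 (${((avgSatisfaction / 5) * 100).toFixed(0)}%)`);
```

This is the same arithmetic behind the figures above: 106.2 seconds per task converts to roughly 1.77 minutes, and a 3.55/5 average rating to 71%.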
Qualitative insights were gathered based on post-task and post-test questions.
While the quantitative data provided accurate scores, understanding the "why" behind the numbers was crucial. After each task, we asked participants follow-up questions to gather their opinions and any additional feedback. Once all 5 tasks were completed, we concluded with 5 post-test interview questions.

Insights & Analysis
Feedback from the tests was documented and analyzed in FigJam to identify common themes
Common themes included distracting animations, long scrolls, glitches/responsive issues, complicated layout, cohesive design, etc.
Sentiment Score Chart
Measures positive and negative feedback, providing an overview of areas for improvement
Overall user sentiment score was derived by quantifying feedback from post-test interviews on key themes
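The actual scoring rubric lived in our FigJam board, but conceptually the sentiment score comes down to tallying positive and negative mentions per theme. The sketch below shows one plausible way to derive it; the theme names and counts are placeholders, not our real tallies.

```typescript
// Rough sketch of deriving a per-theme sentiment score by tallying
// positive vs. negative mentions from post-test interview feedback.
// Theme names and mention counts are illustrative placeholders.

interface ThemeFeedback {
  theme: string;
  positive: number; // number of positive mentions
  negative: number; // number of negative mentions
}

const themes: ThemeFeedback[] = [
  { theme: "Animations",    positive: 1, negative: 4 },
  { theme: "Layout",        positive: 2, negative: 3 },
  { theme: "Visual design", positive: 4, negative: 1 },
];

// Score each theme from -1 (all negative) to +1 (all positive),
// which makes the weakest areas easy to spot at a glance.
for (const t of themes) {
  const total = t.positive + t.negative;
  const score = total === 0 ? 0 : (t.positive - t.negative) / total;
  console.log(`${t.theme}: ${score.toFixed(2)}`);
}
```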

Suggestions
Suggestions were also grouped for quick understanding of user needs and wants
Feedback was diverse, covering internal homepage navigation, layout adjustments to reduce scrolling, and improved responsiveness
DEFINE
Setting goals to make the website more user-friendly and effective in promoting the event.
The primary goal was to revamp the website into a more user-friendly version, but it was equally important to consider the value it would provide beyond direct interaction. For many users, the website could be their first impression of the school and the event, so its presentation could have a major impact on whether they decided to attend. With that in mind, we set 3 main goals.
Improve Usability
Address the feedback from users about usability issues from the previous website and implement changes to elevate the user experience.
Increase User Engagement
Design the website to encourage user interaction and engagement via enhanced features, added animation, etc.
Enhance Event Promotion
Ensure that key information about the event is easy to find and well-promoted, helping to boost participation as well as website traffic.
Tangible goals that can be measured with accuracy.
To ensure the revamped website achieves its objectives, we established clear and measurable success metrics. With usability testing for the newly designed website still underway, these goals will help evaluate the effectiveness of the design once testing is complete.
While we don’t have previous KPIs to compare to, we plan to use usability testing scores to compare the old and new websites based on the following quantitative and qualitative goals:
IDEATE
SKETCHING
The committee lead had already established this year's branding at the start, so we jumped straight into sketching ideas for the teaser page. The teaser page was a preview version of the website, launched to the public to build anticipation for the event and the full version of the website.
Our sketches were pretty messy, so we had to draw one final good copy to keep a record of our ideas 😂
WIREFRAMING
Once the teaser page went into development, we immediately started designing the full website. Each committee member was given a major section of the website to redesign. I was in charge of improving the Work pages (main page and inner page), which served as a gallery of graduate work.
While working on the Work page, I was also tasked with the Events page after a committee member left to assist another team that needed urgent help. Knowing many users would access the site on mobile devices, I designed the layout with responsiveness in mind to ensure it was mobile-friendly.
DESIGN
Creating a component library that streamlines the design and development process.
Given the time constraint, we decided against establishing an entire design system, which we worried would take too long. Instead, I created a component library containing the elements we needed to build hi-fi prototypes.
DEPLOY
Problems kept surfacing — even after the website launched.
Some issues surfaced after the launch, which was a challenge as the website was already receiving visitors. One bug involved a font that rendered completely differently on mobile devices than on desktop. With our developer's expertise and continuous back-and-forth checks to confirm each fix, we were able to resolve the issues and keep the live site running smoothly.
The "fixed" bin for resolved issues
All our hard work paid off, and it was time to showcase the improvements we implemented.
The site-wide changes to the website included:
Moving the navigation bar from the left-hand side to a top-center position for better content focus and easier navigation
Keeping the countdown visible on all pages
Making the search bar more prominent by increasing contrast and adding "Search" text beside the icon
Below are the before and after shots of the Work page. We kept the grid layout to focus on visuals, but made key changes such as replacing dropdown filters with tab filters for better visibility. The dropdown required too much scrolling, so tab filters made searching easier. We also added tags for program and discipline, which work with the filters to display relevant student work when clicked.
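To make the tab-filter and tag behaviour concrete, here is a minimal sketch of the filtering logic as I would describe it to a developer. The data shape, field names, and sample entries are assumptions for illustration, not the production code.

```typescript
// Minimal sketch of the Work page filtering: tab filters and on-card tags
// both narrow the gallery by program and/or discipline.
// Field names and sample entries are hypothetical.

interface StudentWork {
  student: string;
  program: string;     // e.g. "Graphic Design"
  discipline: string;  // e.g. "Branding"
}

const gallery: StudentWork[] = [
  { student: "A. Lee",   program: "Graphic Design",     discipline: "Branding" },
  { student: "J. Patel", program: "Interaction Design", discipline: "UX/UI" },
];

// Selecting a tab, or clicking a tag on a card, sets the active filters.
function filterWork(
  works: StudentWork[],
  filters: { program?: string; discipline?: string }
): StudentWork[] {
  return works.filter(w =>
    (!filters.program || w.program === filters.program) &&
    (!filters.discipline || w.discipline === filters.discipline)
  );
}

// Clicking the "UX/UI" tag behaves the same as choosing that tab filter.
console.log(filterWork(gallery, { discipline: "UX/UI" }));
```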
Below are the before and after shots of the Events page. The previous layout spread events across the entire page, making it hard to navigate and search for specific events, and the static design lacked excitement. In the improved version, we introduced expandable/collapsible rows to give users more control over the information they wanted to view. We also added images from last year's event to build anticipation and make the page more dynamic.
Before
After
Phase 2 introduced a significant number of enhancements compared to Phase 1.
Work Page
Events Page