UX/UI Design Case Studies

Australian Government Clean Energy Regulator

UI Redesign

Designing a more intuitive experience for users


The Back Story

The Australian Government Clean Energy Regulator (AGCER) is the government body responsible for administering legislation to reduce carbon emissions and increase the use of clean energy. Everything they do is connected to educating the general public about carbon abatement and measuring, managing, reducing or offsetting Australia's carbon emissions. 

The AGCER administers a range of schemes to support its objectives. Some schemes provide a framework for reporting company information about greenhouse emissions, and energy production and consumption, while others provide incentives for businesses, landowners and individuals to adopt new practices and technologies which reduce emissions.


The Task

I worked with two other user experience designers during the research and problem definition phase. We were tasked with reviewing the AGCER website for usability issues and providing recommendations for a better user experience. After defining key problem areas, I worked alone to design a new user interface, then circled back with my team members to gather feedback and iterate on my design.

 

First Impressions

The first thing we did was review the site, both individually and as a team, to familiarize ourselves with the AGCER: who they were, their goals and strategy, how they structured their content, and how we felt while conducting this exploration. After taking notes on our observations, we made some initial assessments. However, before we could act on anything, we needed to answer questions like:

1. Who would visit a site like this?
2. What kind of information are they looking for?
3. How easily can they navigate the site?
4. Can they find the information and, where applicable, easily act on it?


The Case for Julie Green

 

Proto Persona of Julie Green

 

To initially define a target user, we formed a hypothesis based on our assumptions. Our proto persona was Julie Green, a savvy herb farmer looking to take advantage of any incentive programs the AGCER had to offer, so she could introduce more sustainable farming practices and reduce her energy costs.

Why would Julie visit the AGCER site?

Exploratory: 

    • To see who the AGCER is and what they do

    • To confirm she is adhering to government policy in her farming practice

    • To find any incentive programs for carbon abatement

Task Execution:

    • Once satisfied with the information she finds, she explores further, completing the interactive questionnaire to determine whether she is eligible to participate in the “Emissions Reduction Fund” scheme

    • Registers to start the application process

With Julie’s journey in mind, we illustrated a typical user path, which would later help us outline our Research Plan.

 

User Path

 

Initial user testing would provide invaluable insights and directly impact how we’d start to design the user experience: would users take the typical path to complete tasks, and if not, what was their thinking process? 

We wanted to see how users would behave while completing tasks, observing their actions and emotions, documenting them in detail, and establishing measurable indicators to compare against. We annotated each page by redlining. To assess usability and accessibility issues, we conducted a heuristic evaluation, recording any issues and noting recommendations.

Stage 1 User Testing

After outlining the objectives of our research plan, we conducted five usability tests. Our testers ranged in age from 28 to 42. All were computer-savvy and all had experience navigating different government sites. As we needed to test the responsiveness of the current design, we also had one tester use the site on his mobile device.

 

Using a Miro Board to document observations

 
 

We documented the time it took for each user to complete tasks, so we could measure results and assess if our recommendations were justifiable.

The Challenge

Users brought to our attention many features of the current site that we hadn’t initially noticed ourselves, e.g. the number of links on each page. Users weren’t sure whether these links opened new pages with entirely new content or similar pages covering the same content, resulting in needless duplication and frustration over the extra time spent exploring them. At this point, it became clear that measuring the time taken to complete each task was an excellent indicator of success.

This valuable exercise highlighted some key pain points and usability issues, mainly:

  1. Users felt overwhelmed when navigating the site, due to the vast amount of content available on each page

  2. The verbiage on the site wasn’t consistent, adding to the confusion

  3. Incentive schemes weren’t easy to discover and some users (although having completed all tasks successfully) grew impatient

  4. The design was dry and the site was boring

  5. The way the information was organized wasn’t optimal, e.g. news articles and updates were given precedence over other important information, like accessing and applying for incentive programs and reviewing reports.

  6. Responsiveness on mobile devices was not ideal

After these insightful findings, we quickly updated our initial annotations and took detailed notes to help us stay on track. Given our time constraints, this initial stage of user testing helped us prioritize which aspects of the site to focus on first, based on the user path of our persona, Julie.

A Much Needed Facelift

The IA (Information Architecture) of the entire site was then reviewed using the card sorting method, which would also address the needless duplication of content on the current site. Content was first defined, then grouped, and finally restructured. Ideas were sketched and iterated on. We then developed a refreshed sitemap, mindful of the user path at every turn.

 

New sitemap developed after card sorting exercise

 

How did we address some of the pain points of our users?

  1. Users feeling overwhelmed when navigating the site due to the vast amount of content on each page.
    To address this, the content was arranged into cards on each applicable page, so users could quickly scan for relevant information, and then click on the “More” button if they chose to explore further.

  2. The verbiage on the site not being consistent, adding to the confusion.
    We used consistent verbiage and terms throughout, to help users navigate content easily.

  3. Incentive schemes weren’t easy to discover and some users (although having completed all tasks successfully) grew impatient.
    We featured these schemes as cards on the homepage, so users could easily scan and explore further if they chose to.

  4. The design was dry and the site was boring.
    We introduced a soothing green accent colour into the overall colour scheme but ensured that we still followed the AGCER brand. The colour green is associated with the environment, life, renewal, nature, and energy.

  5. The way the information was organized wasn’t optimal, e.g. news articles and updates were given precedence over other important information, like accessing and applying for incentive programs and reviewing reports.
    We gave prominence to what users felt was more important information, like incentive programs, on the homepage, and moved less important information, e.g. news and articles, to the bottom. We also used our new green colour to compartmentalize information on the site. The homepage would now feature a quick video introducing the AGCER, informing users about who they are and what they do.

  6. Responsiveness on mobile devices was not ideal.
    We ensured the website was responsive on all devices by organizing information into easily digestible cards (a sketch of this card pattern follows). The mobile menu interface was designed so users could easily tap and find content.
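
As an illustration of the card pattern referenced above, here is a minimal sketch of how the scheme cards might be built, assuming a React front end written in TypeScript; the component, prop and class names are hypothetical rather than taken from the actual build.

    import React from "react";

    // Hypothetical shape of an incentive-scheme summary.
    interface SchemeSummary {
      title: string;
      summary: string;
      url: string;
    }

    // Renders scheme summaries as scannable cards, each with a "More"
    // link. A CSS grid with auto-fit columns lets the same markup
    // reflow into a single column on narrow (mobile) screens.
    export function SchemeCardGrid({ schemes }: { schemes: SchemeSummary[] }) {
      return (
        <div
          style={{
            display: "grid",
            gridTemplateColumns: "repeat(auto-fit, minmax(280px, 1fr))",
            gap: "1rem",
          }}
        >
          {schemes.map((scheme) => (
            <article key={scheme.url} className="scheme-card">
              <h3>{scheme.title}</h3>
              <p>{scheme.summary}</p>
              <a href={scheme.url} role="button">More</a>
            </article>
          ))}
        </div>
      );
    }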

Setting the Tone and Structure

Once all usability issues were documented, and possible solutions were ideated, moodboards were developed. These moodboards – with inspirational imagery, patterns, fonts, typography and colours – would help define a theme. 

Three separate moodboards were created: 

Section 1: UI Samples and Inspiration (to define a general theme)
Section 2: UI Patterns (to help us determine which designs and interactions could work)
Section 3: Government Agency Inspiration (a more refined approach to how we saw the design coming together)

One of three moodboards developed for inspiration

More testing was required at this point, but before doing so, I needed to design wireframes for the relevant pages starting with the navigation on both desktop and mobile devices to ensure responsiveness and to set the tone for all other pages. The moodboards provided some of the inspiration when designing the wireframes. Interactions were then incorporated into the wireframe, so they could be tested and validated when the time came. A clickable prototype was then developed. I tried to be mindful of keeping interactions simple, yet delightful and functional.

Wireframes

Stage 2 User Testing

A modal was introduced to guide users back on the right path

Five-second usability tests were conducted on the homepage for both desktop and mobile. Comments were documented and iterated on. 

I was truly thankful for this stage of the process, as users pointed out key areas needing improvement right away, which, if iterated on, would impact the user experience immediately. This allowed me to use my time more efficiently, addressing user pain points and testing for validation before any further work was done. For example, users felt that the new navigation on mobile took up far more real estate than was needed. After iterating and retesting later, users commented that they really liked the new mobile navigation and found it easy to use, taking less time to complete tasks!

Another key insight was that some users would visit the site at least once before applying for any programs. They would review the site first, retrieve any information they needed, compile any documentation required for the application process, and then revisit the site to apply. Knowing this, I introduced an intuitive modal callout prompting first-time users to complete the interactive questionnaire, validating their eligibility before going through the entire application process. If users discovered they were ineligible, or had applied for the wrong incentive scheme, only after completing the application process, they would understandably be frustrated! Returning users who had completed the questionnaire successfully could skip straight to the application process. This proved to be a game-changer!
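
A minimal sketch of that first-visit logic, assuming eligibility is remembered in the browser's localStorage; the storage key, element id and paths are hypothetical.

    // Remember whether this visitor has already passed the questionnaire.
    // The key name, element id and paths below are illustrative only.
    const ELIGIBILITY_KEY = "questionnaire-complete";

    function shouldShowEligibilityModal(): boolean {
      // Returning users who passed the questionnaire skip the prompt
      // and go straight to the application process.
      return localStorage.getItem(ELIGIBILITY_KEY) !== "true";
    }

    // Called by the questionnaire's submit handler (wiring omitted).
    function onQuestionnaireComplete(eligible: boolean): void {
      if (eligible) {
        localStorage.setItem(ELIGIBILITY_KEY, "true");
        window.location.assign("/apply"); // proceed to the application
      } else {
        // Steer ineligible users toward other schemes before they
        // invest time in an application they cannot complete.
        window.location.assign("/schemes");
      }
    }

    // On page load, prompt first-time visitors with the modal.
    if (shouldShowEligibilityModal()) {
      (document.getElementById("eligibility-modal") as HTMLDialogElement | null)?.showModal();
    }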

Consistency is Key

At this point, I was ready to create a style tile, which would serve as an important branding guideline for the website. A UI style guide was then developed using the style tile as a reference. It was important to develop this prior to building out any further pages, as it would help in maintaining consistency throughout.

I found myself constantly moving back and forth between the prototypes and this style guide, iterating and updating as was necessary. After additional iterations were made to align with user feedback, a high-fidelity prototype of the main homepage was developed for both desktop and mobile. Supporting pages for desktop, tablet and mobile devices were built out to ensure that the user path could be completed and interacted with on all devices.

Stage 3 User Testing

A Usability Testing Plan helped me to ensure objectives were being met when testing this iterated, high-fidelity prototype, as well as to measure KPIs in the form of time spent completing tasks. Team members also provided valuable feedback at this point, which was considered and then iterated on before Stage 3 testing was initiated.

A total of ten participants were recruited to test both the desktop and mobile versions of the website, to ensure all objectives were met, as well as to document and iterate on the prototype if needed.

The Results

KPIs were measured by documenting the time it took users to complete each task on the original website, and again after the redesign (on desktop and mobile devices).

My team and I were thrilled to find that, after applying the new UI design, users on average took 138 seconds (just over 2 minutes) less time to navigate the site and complete tasks.
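
For illustration, computing that KPI amounts to averaging the per-task time saved; the sketch below shows one way to do it, with placeholder timings chosen purely for illustration (they are not the study's raw measurements).

    // Before/after completion time for a single task, in seconds.
    interface TaskTiming {
      task: string;
      beforeSeconds: number; // original website
      afterSeconds: number;  // redesigned website
    }

    // Average time saved per task across all measurements.
    function averageTimeSaved(timings: TaskTiming[]): number {
      const totalSaved = timings.reduce(
        (sum, t) => sum + (t.beforeSeconds - t.afterSeconds),
        0,
      );
      return totalSaved / timings.length;
    }

    // Placeholder data for illustration only.
    const timings: TaskTiming[] = [
      { task: "Find an incentive scheme", beforeSeconds: 300, afterSeconds: 150 },
      { task: "Check eligibility", beforeSeconds: 240, afterSeconds: 114 },
    ];
    console.log(`Average time saved: ${averageTimeSaved(timings)} seconds`);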

 

KPIs showing a drastic reduction in the time users took to complete tasks

 

Final Thoughts

While analyzing the original website, I came to realize how easy it is to overlook important aspects of a design that can negatively impact the user experience, e.g. if the site is not as accessible, legible or easy to navigate as it could be. Sometimes small but impactful changes or updates can affect the user experience in a big way.

Truly understanding user pain points, and the reasons why users come to the website in the first place, helped me determine the complexity (or simplicity) of the design. Ultimately, simplicity (in terms of the information architecture and visual design) was the best solution for creating a better experience, especially in the case of a government website, where content can be very heavy and overwhelming.

I’ve learned that user testing before, during and after the design process, as well as constant iteration, is crucial if we are to design a better experience in every way for our users.

AGCER website - BEFORE REDESIGN

 

AGCER website - AFTER REDESIGN