Overview

Date: September 2023
Timeline: 4 weeks
Role: Lead Product Designer and Researcher

Background

Bonnet enhances sustainable transport by offering a unified app for easy access to numerous EV charging networks across Europe, catering to both individual drivers and B2B partners. This case study addresses the need to improve Bonnet's rewards system.

Problems to solve
  1. Improving customer retention and satisfaction.
  2. Enhancing the app's rewards system.
  3. Differentiating Bonnet in a competitive market.
  4. Designing within iOS and Android technical constraints.

The goal

Boost loyalty, stand out in a competitive market, and positively impact key metrics like churn rate and customer satisfaction.

Bonnet's mission is to fight climate change by accelerating the uptake of sustainable transport. They’re doing this by making EV charging accessible to all - not just those with a private driveway. Using innovative technology, they are connecting EV drivers to the best charging networks available and turning every charge point into an accessible and simple charger that any driver can use.


Bonnet connects 25+ of the best EV charging networks in Europe, with more being added continuously. This means one app, one account, and one payment, all centralised through the Bonnet mobile app.

Bonnet has B2B partners and tens of thousands of EV driver customers who trust them to create the best EV charging experience possible, so that's exactly what I'm helping them to achieve. Bonnet has iOS, Android, and web (B2B) apps.

With plan subscriptions (Starter, Standard, and Plus) rolling out in July, we needed to understand how customers navigate our account types and plans, and the user journey they would expect to take to sign up and find relevant information.

Let’s start with the problem – why focus on rewards to drive growth?

This started with a wider question: how do we inspire loyal customers and brand advocates? We approached this problem by holding workshops, focusing on retaining and growing subscription customers after a downward trend in active subscribers. To summarise the key problems we addressed in the workshops:

  • There’s a lot of choice in the market: what makes Bonnet stand out from the rest?
  • The Bonnet reward system isn’t designed to reward ongoing loyalty.
  • “I want the most cost-effective charging service” (in our users’ words).
  • How do we encourage new users to sign up?
  • How do we inspire subscribers?

Collaborative workshops (from whiteboard to Figjam)

Some examples of the workshop sessions below:

The outcome of the workshops

Improving rewards was identified as a high-impact, low-effort direction. The low effort was supported by already having a reward system in place, which made it easier from a technical perspective, as there was existing back-end infrastructure to build on. We also had complete control over this aspect of the app and don’t rely on any third-party services to process rewards. It was deemed high impact because it would reduce charging costs for users: high cost is the most recurring negative theme in our NPS surveys (from the detractor group, scores between 0-6) and in user interviews.


So what does this look like?

We introduced weekly charging rewards. These rewards need to be differentiated from regular rewards: they refresh weekly, relate to charging only, and are only available to subscribing customers. ‘Other rewards’ refer to account rewards, which disappear once a user has completed them. Charging rewards contribute to points; once a user reaches 8 points, they receive a monetary reward in the form of a charging credit.
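As a rough sketch, the points-to-credit mechanic described above could be modelled like this (class and method names are illustrative assumptions, not Bonnet's actual implementation):

```python
from dataclasses import dataclass

POINTS_THRESHOLD = 8  # from the case study: 8 points earns one charging credit


@dataclass
class SubscriberRewards:
    """Tracks charging-reward points for a subscribing user (illustrative only)."""
    points: int = 0
    credits_awarded: int = 0

    def complete_charging_reward(self, points_earned: int) -> None:
        """Add points from a completed weekly charging reward.

        Whenever the balance reaches the threshold, it converts into a
        charging credit (the monetary reward) and the remainder carries over.
        """
        self.points += points_earned
        while self.points >= POINTS_THRESHOLD:
            self.points -= POINTS_THRESHOLD
            self.credits_awarded += 1
```

For example, two rewards worth 5 points each would leave a user with one charging credit and 2 points carried over toward the next one.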

Success metrics

We established success metrics with all stakeholders and used these as benchmarks to determine the success of the new features:

  • Reduce subscription churn by at least 10% from previous months
  • Grow the number of reward points given away through new rewards by 7% WoW for the first 8 weeks
  • Continue to monitor NPS and customer success surveys, and see scores increase
  • Continue to improve Customer Effort Score (CES)
  • Continue to improve Customer Satisfaction Score (CSAT)
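To make the first two benchmarks concrete, here is a minimal sketch of how they might be checked against reported numbers (function names, inputs, and thresholds are hypothetical, not Bonnet's analytics code):

```python
def churn_rate(cancelled: int, subscribers_at_start: int) -> float:
    """Monthly subscription churn as a fraction of subscribers at month start."""
    return cancelled / subscribers_at_start


def meets_churn_target(current: float, previous: float, reduction: float = 0.10) -> bool:
    """True if churn dropped by at least `reduction` (10%) vs the previous month."""
    return current <= previous * (1 - reduction)


def wow_growth(points_this_week: int, points_last_week: int) -> float:
    """Week-over-week growth in reward points given away (target: >= 0.07)."""
    return (points_this_week - points_last_week) / points_last_week
```

So a drop in monthly churn from 5% to 4.5% would just meet the 10% reduction target, and 107 points given away after 100 the week before would meet the 7% WoW target.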

Starting to look at visual research

At the beginning of the design process, I look at what we currently have in-app and within our design system. I also do competitor research and source inspiration through various resources.

User research

Although we had a tight timeline, we were able to use existing data and surveys to understand how our users currently engaged with existing rewards.

Looking at quantitative data, we could see trends. We discovered that fluctuations aligned with the releases of customer surveys. This was insightful, as it indicated that users are familiar with this repetitive behaviour and would consistently visit the same screens when informed of a new way to earn reward points.

We also documented all current data points, such as the total number of reward-point claims in-app, so we could compare the impact post-launch.

Screenshot from Google Analytics

The discovery in the design process

Using user journey maps, we can identify all relevant touch points and get a more holistic perspective of the changes required.

Discovery continued

After we’ve identified all journey touch-points, we move into Figma to look at how we can solve the design and user problem. We combine research and inspiration to start brainstorming ideas. Feedback and reviews are part of this step, combining Design, Product, and Engineering. UX laws and best practices are front of mind throughout this process.

Some screenshots of various design directions created during the discovery stage

Emerging challenges

As I began to explore different design directions, the challenges became clearer:

  • Differentiating between weekly rewards and account rewards.
  • Giving rewards its own branding so it is consistent across the app wherever we promote rewards. For example, creating components that scale across the app and are easily placed.
  • Working within technical constraints. This flow within the app used a lot of legacy code and flows, so I worked closely with the team to understand technical limitations.
  • Understanding what differences we might face between iOS and Android.
  • Designing different states/views to cater to users who have access to the rewards and users who do not.
  • Deciding how and where to promote the new charging rewards to users not currently on a subscription.

Refining the design options

Taking the learnings from the discovery stage, I moved to the ‘Flow’ stage. A few key milestones of this step in the process include:

  • Tidying up the Figma file so there are clear options (usually no more than three)
  • If A/B tests are included, defining which ones
  • Working with Engineering to understand which option is more feasible from a technical perspective; this is done through hosting workshop reviews
  • Identifying any aspects of the journey that have been overlooked
  • Scoring each option based on the time and resources needed to complete it
  • Agreeing on an option across all teams, so the design team can proceed to handoff documentation

Moving to handoff documentation

You guessed it: this step involves taking the agreed option and documenting each screen, flow, and new component in a very detailed way. The more detail I can provide, the easier it is for anyone in the development team to pick up these tickets and understand them with ease. I always record a Loom video when I hand off a Figma file to the Engineering team; if it’s a really complex project that touches many parts of the app, I may record separate videos. Below is an example of a series of prototypes I provide to help the team. An example of the handoff can be provided on request.

Example of handoff page in Figma

What happens after handoff?

  • We help QA the designs and flows in staging and provide feedback where necessary
  • Once the staging build has been reviewed, and in most cases updated, the new feature or update is released to the app stores
  • We continue to analyse results from any experiments (90-day period)
  • We closely watch NPS, Customer Effort Scores, and Customer Satisfaction Scores (reported monthly) to spot any releases that might negatively impact the user experience
  • We work with Product and contribute to retro sessions, where we present findings and updates on the progress of the new features and how they measure against their success metrics (each project will have different timelines depending on what’s being measured)
  • One crucial last step is to document new components and screens in our global design system in Figma. We update the components for iOS and Android once everything has gone live. This way, we can ensure what’s being added to the design system is 100% accurate to what users see in the live app.

Results and takeaways

Overall, we are still measuring the success metrics outlined in this case study. However, at first glance we are seeing improvements in our NPS, CES, and CSAT scores. These improved scores are accompanied by specific positive feedback on the changes to the reward experience and general UX improvements.

Some key takeaways and areas for improvement from this project were:

  • The old legacy design system (spanning many screens) came as an afterthought due to time constraints. While we did agree on a phased approach to updating older areas of the app, it should have been scoped out earlier.
  • It was my first time working with Lottie files across iOS and Android, and I quickly realised that the same solution struggled to scale across both platforms in the same way. Next time, I would scope these challenges upfront.
