
How I Drove Growth at Eva through CRO Leadership and Data-Driven Design

UX/UI Design · Conversion Rate Optimisation · Ecommerce Strategy

Overview

Eva designs furniture that's made for living. What started with an award-winning mattress-in-a-box has grown into a range of high-quality furniture, thoughtfully designed in Melbourne and delivered right across Australia. My journey at Eva began in its early stages of growth.

I started at Eva as a Digital Designer, the third member of a growing team. Like many roles at hyper-growth startups, mine involved wearing multiple hats: digital design, user research, front-end development, and even email strategy. After multiple rebrands, photoshoots, and subsequent UI makeovers, we settled on the current look and feel of the brand, and I was able to narrow my focus to UX/UI and leading our design and tech projects. The key focus was growth through A/B testing, and with this I helped kick off Eva's Conversion Rate Optimisation program, leading a team of three to successfully run it over the course of 2.5 years and contributing to the overall performance and revenue growth of the Eva brand.

My Role

Lead Digital Designer

- UX/UI Design

- Research & Analysis

- CRO Strategy & Implementation

Team

Digital Designer

Full Stack Dev

Copywriter

Duration

2.5 years: CRO activity

How did I get so involved in the CRO and A/B testing side of things?

I had to learn the hard way. To be totally frank, Eva's first rebrand and site relaunch was quite volatile. A key challenge we faced was a drop in conversion rate, which on later reflection was largely due to the adjustment users had to make to the new look and feel of the brand, and the UX updates that came with it. With sheer determination we rode the wave, got through the teething issues, and eventually got things back on track. The exercise showed us the risks that come with rebrands, and the adjustment period users need to relearn and trust a site.

Off the back of this experience, a new brand ethos was introduced: 'data-driven decisions'. For our next rebrand, we took a very measured approach. Having survived the trenches of a rebrand and site relaunch once, we decided to drip-feed the new look and feel, and to stick with our current Shopify theme to reduce changes to the user experience. But the key activity that made the second launch a success was that we began to experiment with Google Optimize (which was free, and still existed at the time, haha).

One of the first tests I ran was on the wording of our homepage banner, which highlighted our flagship product, the Eva mattress. With the banner image and headline text unchanged, I focused on testing variations of the call-to-action: 'View the mattress' vs 'Shop the mattress'. The test showed a higher purchase conversion rate for users served the 'Shop the mattress' variation. Our first attempt at A/B testing was a huge success.
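A result like this can be sanity-checked with a standard two-proportion z-test. The sketch below uses only Python's standard library; the visitor and conversion counts are made-up illustrative numbers, not Eva's actual data.

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no real difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical counts: 'View the mattress' (control) vs 'Shop the mattress'
z, p = two_proportion_z_test(conv_a=300, n_a=10_000,
                             conv_b=360, n_b=10_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # a winner at 95% confidence if p < 0.05
```

With these example numbers (3.0% vs 3.6% conversion), the difference clears the 95% threshold; smaller lifts or smaller samples often will not, which foreshadows the significance problems discussed later.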

What we gained from such a simple test was thousands of dollars in additional revenue. From here on in, the company was convinced, I was sold, and we were all in! After some research, and with the confidence of key stakeholders, I had Eva invest in Visual Website Optimizer (VWO), an advanced A/B testing tool, and with it I led a CRO strategy that would help Eva grow!

Strategy

Eva's strategy was centered on growth, particularly from a digital perspective. In my role as a data-driven designer, every new idea and feature affecting high-traffic pages or new products underwent thorough scrutiny aligned with company goals.

This approach entailed ongoing analysis utilising Google Analytics and Shopify for quantitative insights, and Heatmaps, Video Recordings, and surveys for qualitative perspectives as primary data sources. We closely monitored metrics like purchase conversion rate, add-to-cart rate, and overall page engagement, considering them pivotal for fueling Eva's revenue growth. Guided by a keen focus on these key numbers, our team devised strategies to improve these metrics and optimize the Eva website.

While experimentation was a key aspect, not everything was subject to it. For instance, during a new product launch, we ensured a solid foundation for information hierarchy, adhering to our brand and product page flow. Critical elements such as product images, purchase information, and add-to-cart functionality were placed prominently above the fold. Following this, we incorporated unique selling propositions, lifestyle imagery, technical specifications, and FAQs. The subsequent focus was on elevating the page for enhanced conversion.

Process

The process for conversion rate optimisation I developed at Eva involved the following steps: Research, Ideation, Implementation, Analysis, and Learning & Review. The two main tools I utilised were Airtable for organising ideas and results, and Visual Website Optimizer (VWO) for A/B testing. Below is a sample of how I collected ideas, prioritised them, and monitored the results. The key aspects of the table were the hypothesis, success metrics, and prioritisation method. The prioritisation methodology I chose was a modified version of ICE, which I called VICE: V = Velocity, I = Importance, C = Confidence, and E = Ease. Each idea was scored 1-10 in each category, and with some handy Airtable configurations we were able to automatically rank the highest-scoring ideas from top to bottom. From there it was a matter of implementing tests and cycling through review, analysis, and ideation.
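The VICE ranking can be reproduced outside Airtable in a few lines. This is a minimal sketch assuming each idea's score is simply the sum of its four 1-10 ratings (the actual Airtable formula may have weighted or averaged the categories differently); the idea names and scores are illustrative only.

```python
def vice_score(idea):
    """Sum of Velocity, Importance, Confidence, and Ease (each rated 1-10)."""
    return idea["V"] + idea["I"] + idea["C"] + idea["E"]

# Hypothetical backlog entries, scored by the team
ideas = [
    {"name": "Sticky add-to-cart button", "V": 8, "I": 9, "C": 7, "E": 6},
    {"name": "FAQ accordion redesign",    "V": 4, "I": 5, "C": 6, "E": 8},
    {"name": "Homepage CTA wording",      "V": 9, "I": 7, "C": 8, "E": 10},
]

# Highest-scoring ideas float to the top, mirroring the filtered Airtable view
for idea in sorted(ideas, key=vice_score, reverse=True):
    print(f"{vice_score(idea):>2}  {idea['name']}")
```

The point of the scoring is less the arithmetic and more that it forces every idea through the same four questions before it earns a test slot.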

We managed to implement an average of 3-4 tests a month, depending on the complexity of each test.

What I learnt

Not all tests are winners

Not all tests will lead to a winning result. If I were to put a rough ratio on wins versus losses, roughly 1 in 4 tests were winners. A misconception I often faced was that a losing test was wasted effort; it was common for people with a limited understanding of A/B testing to assume a failed test was bad. What I learnt, and eventually began to teach, was that the activity of testing in and of itself, regardless of wins and losses, is always good. Why? Put simply, A/B testing provides the opportunity not only to learn and grow, but also to prevent costly mistakes.

It is better to test a new idea than to launch it and assume it's having a positive impact. If you can't measure it, you're essentially shooting at targets in the dark. The occasional wins were always great and looked impressive, but I learnt that they should not overshadow the tests that weren't successful, as those were equally important to sustaining and protecting Eva's growth.

Statistical Significance can slow you down

In the first six months of our CRO strategy, we began to encounter tests that required more time to reach a statistically significant result. (We set our significance threshold at 95%, meaning we would only call a winner when there was less than a 5% chance the observed difference was due to random variation alone.) This meant running tests that took up to two months, which in some cases would ultimately end up inconclusive.

The problem we faced was that seeking a statistically significant result on every test drastically slowed down our testing frequency, and we were forced to wait long periods before running new tests. To combat this, I implemented a strategy that looked at 'probability over a period of time': if the probability of success was above 80% or thereabouts over a two-week period, with a solid number of test users accumulated, we were comfortable rolling the variation out.

The thinking was that if a test was having a generally positive impact and not showing any significant drops, then we should feel confident rolling it out. Of course, the aim was always to strive towards statistically significant results. But ultimately we wanted to run tests, and run them frequently, and this approach helped us move efficiently and at speed.
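One common way to compute a 'probability of success' like this is a Bayesian probability-to-beat-control estimate. The Monte Carlo sketch below assumes uniform Beta(1, 1) priors and made-up two-week traffic counts; it illustrates the 80% rollout rule, and is not VWO's exact calculation.

```python
import random

def prob_variant_beats_control(conv_a, n_a, conv_b, n_b,
                               draws=100_000, seed=0):
    """Estimate P(variant rate > control rate) by sampling Beta posteriors."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(draws):
        # Beta(1 + conversions, 1 + non-conversions) posterior for each arm
        rate_a = rng.betavariate(1 + conv_a, 1 + n_a - conv_a)
        rate_b = rng.betavariate(1 + conv_b, 1 + n_b - conv_b)
        wins += rate_b > rate_a
    return wins / draws

# Hypothetical two weeks of traffic: not yet stat sig, but trending well
prob = prob_variant_beats_control(conv_a=300, n_a=10_000,
                                  conv_b=330, n_b=10_000)
print(f"P(variant beats control) = {prob:.2f}")
if prob >= 0.80:
    print("Above the 80% threshold: comfortable rolling the variation out")
```

With these numbers the lift is too small to clear a 95% significance test in two weeks, yet the probability of the variant being better sits comfortably above 80%, which is exactly the situation the rollout rule was designed for.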

Keep it lean, and take everyone on the journey

We kept it lean at Eva, and from a CRO perspective this helped us move at lightning speed. It was me on CRO strategy, analysis, and design, along with an absolute gun of a developer who helped implement new features and tests. I acknowledge there are pros and cons to this approach, but in my experience, fewer cooks in the kitchen, combined with trust and autonomy, led to focus and quicker results!

In terms of taking everyone on the journey: on a monthly basis I provided Slack updates on our notable test results. They were quick, they were dirty, but they did the job. They helped everyone across marketing, support, and product development get a snapshot of what we were achieving, and they added to the culture of testing and experimentation.

Below is an example of one of the quick updates I'd share on Slack on a monthly basis to keep the team informed. A few screenshots on a Google slide was enough to tell the story.

The Impact of CRO

Eva's most significant transformation stemmed from adopting a data-driven approach, yielding invaluable insights across the website. The journey involved both successes and challenges in our testing, sparking a wealth of ideas that led to well-thought-out design updates. At its core, knowledge paired with incremental improvement emerged as the ultimate gain, with A/B testing becoming our guiding light and preventing us from navigating blindly.

Through conversion rate optimisation, I achieved a 15-20% improvement in overall purchase conversion rate. Along with this, we sustained a healthy add-to-cart rate and strengthened our design and development initiatives for positive impact, consistently helping the Eva brand grow. Below is an example from the numerous A/B tests that yielded a positive result.

A/B Test Case Studies
