Partner name: MightyHive
Partner Website URL: https://www.mightyhive.com
MightyHive has a dedicated team of CRO specialists, Developers, Web Analysts, UI/UX and Creative Designers, and Account Managers to identify opportunities and develop advanced roadmaps for optimization and personalization experiments while following best practices. MH helps clients define and quantify their business goals, create data-driven experiments to maximize ROI across their web and mobile properties, and up-level internal teams to become more sophisticated digital marketers.
Client name: Intice
Client website: www.intice.com
Home Page URL: www.intice.com (case study end-client auto-dealership: www.feldmannimports.com)
Intice provides a full suite of lead conversion and marketing automation tools for automotive dealerships. It is dedicated to helping dealerships convert more leads, increase showroom traffic, and close more deals.
Section 1: Client Overview and Challenge
Intice is a marketing technology vendor offering a suite of tools that help auto dealerships increase website engagement, dealer visitation, and conversions. Intice engaged MightyHive to test design variants of its ‘Leadmaker’ engagement tool in order to maximize the rate at which anonymous website visitors convert into leads, share their information, and commit to visiting the dealership. MightyHive’s CRO team reviewed the product architecture and its design elements, then identified multiple variables for testing based on their projected performance impact. After implementing the experiment on a single dealer site and testing for 18 days across 5,239 experiment sessions, the winning variant produced a 132% increase in click conversions overall and a 183% increase on tablet and desktop devices. Following this successful initial test, Intice plans to scale the Leadmaker experiment to over 8 dealerships across the US and has engaged MightyHive to design similar tests for its Trademaker and Dealmaker products.
MightyHive spearheaded the effort to improve the conversion rate of site visitors to leads within a dealership’s overall customer life cycle. Through this A/B/n test we optimized the first step of the customer journey, and we now look forward to optimizing the Trademaker and Dealmaker engagement tools. Our scope covered testing colors and fonts while staying on-brand, and we obtained approval from the client’s legal team because the variants offered a digital copy of the $25 Visa Reward card. On the technical side, the Optimize deployment in a custom CMS (content management system) needed to remain consistently compatible with every dealership site built on that CMS.
Section 2: Testing Strategy & Plan
1. Identify problems based on business, marketing or conversion goals: Defined business objectives, website goals, Key Performance Indicators (KPIs), and Target Metrics.
2. Develop hypothesis from data to solve these problems.
3. Create and test page variants based on the above hypothesis: Prioritize proposed action items using a scoring system. While prioritizing, we weighed ease of implementation against each element’s potential impact, favoring changes likely to produce a large conversion lift over those offering only minimal increments.
4. Understand and interpret experiment results: Analyze post-test results and map them against the primary goal. We expressed the reliability of our test results as statistical confidence, running the experiment until it reached at least 95% confidence and held that significance level for several days.
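The scoring system used in step 3 is not spelled out in the text; a common approach for this kind of prioritization is a PIE-style score (Potential, Importance, Ease). The sketch below illustrates the idea only; the element names and scores are hypothetical, not the values actually used in the experiment:

```python
# Hedged sketch of a PIE-style priority score (Potential, Importance, Ease).
# Element names and ratings below are hypothetical illustrations.

def pie_score(potential, importance, ease):
    """Average the three 1-10 ratings into a single priority score."""
    return round((potential + importance + ease) / 3, 1)

candidates = {
    "CTA button text": pie_score(potential=9, importance=9, ease=9),
    "CTA button color": pie_score(potential=8, importance=9, ease=9),
    "Pop-up headline font": pie_score(potential=5, importance=6, ease=7),
}

# Rank high-impact, easy-to-implement changes first.
ranked = sorted(candidates.items(), key=lambda kv: kv[1], reverse=True)
for element, score in ranked:
    print(f"{element}: {score}")
```

Scoring each candidate on the same scale makes the trade-off between effort and expected lift explicit before any variant is built.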
Test Approach and Plan:
1. Performed research and learned about Feldmann Imports’ users
2. Identified the problem
3. Created a hypothesis for solving the identified problem
4. Implemented Optimize using Google Tag Manager and connected the Google Analytics view. Created an experiment
5. Familiarized ourselves with the layout and style of the website
6. Performed priority scoring of the possible changes to the Leadmaker engagement pop-up
7. Created and proofread the variants: Changed the call-to-action text and button background color, and designed elements so that visitors’ attention remains focused and guided toward the call to action
8. Set the objectives and targeting rules along with experiment description
9. Allocated the proper traffic to the variants
10. Executed the experiment
11. Monitored and evaluated the performance of the variants
12. Ended the experiment once statistical significance was achieved and had remained stable for a couple of days, ensuring the reliability of the result
13. Took action on the new information gained from the results: implement the winning variant on the websites of other dealerships
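The stopping rule in step 12 can be pictured as a simple check: end the experiment only after confidence has held at or above 95% for several consecutive days. A minimal sketch (the daily confidence readings are hypothetical):

```python
# Hedged sketch of the stopping rule in step 12: end the experiment only
# when confidence has held at >= 95% for `required_days` consecutive days.
# The daily confidence readings below are hypothetical.

def should_stop(daily_confidence, threshold=0.95, required_days=3):
    """Return True if the last `required_days` readings all meet the threshold."""
    if len(daily_confidence) < required_days:
        return False
    return all(c >= threshold for c in daily_confidence[-required_days:])

readings = [0.81, 0.88, 0.93, 0.96, 0.95, 0.97]
print(should_stop(readings))      # last three days all at or above 95%
print(should_stop(readings[:4]))  # only one day at or above 95% so far
```

Requiring the threshold to hold for multiple days guards against calling a winner on a short-lived spike.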
Section 3: Optimize Container & Test Implementation
We deployed the Google Optimize snippet using Google Tag Manager by creating a new tag of the ‘Google Optimize’ tag type and setting the ‘Google Analytics Settings’ variable to ensure consistency between the Tag Manager and Analytics tags. We triggered the tag on ‘All Pages - Pageview’, then submitted the changes and published the container.
We worked on the Leadmaker tool pop-up on the homepage of www.feldmannimports.com and used the EDIT ELEMENT feature within the Optimize editor. The business goal was to create variants combining button color and button text, and we finalized four variants, including the original.
The home page looked like the screenshot below, with the variants appearing in the pop-up during the experiment:
Below are the four variants:
Objectives: We used the landing page reached on the GET OFFER button click to create the experiment objective, which helped us understand the performance of the variants. We used the custom objective type ‘Pageviews’ with a ‘Page’ RegEx rule matching the destination page URL, as highlighted in the screenshot below. The experiment was given a clear description so that anyone with access to the account could easily understand the experiment and follow its details. This objective ultimately relates to the visitor-to-lead form goal completion. Here’s the screenshot of the experiment objective:
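The exact RegEx rule lives in the screenshot rather than the text; conceptually, a ‘Pageviews’ objective with a ‘Page’ RegEx rule simply counts pageviews whose URL path matches a pattern. A sketch with a hypothetical destination path (the real rule matched the landing page reached from the GET OFFER button):

```python
import re

# Hedged sketch of a 'Pageviews' custom objective with a 'Page' RegEx rule.
# The destination path pattern below is hypothetical -- the actual rule
# matched the landing page reached from the GET OFFER button.
DESTINATION_PATTERN = re.compile(r"/get-offer/?(\?.*)?$")

def counts_as_conversion(page_path):
    """True if the viewed page matches the objective's RegEx rule."""
    return bool(DESTINATION_PATTERN.search(page_path))

print(counts_as_conversion("/get-offer"))            # matches
print(counts_as_conversion("/get-offer?vin=12345"))  # query string still matches
print(counts_as_conversion("/inventory/new"))        # does not match
```

Anchoring the pattern to the end of the path keeps unrelated pages that merely contain similar text out of the conversion count.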
Targeting: As the goal was to increase clicks on the GET OFFER button, we targeted every desktop, mobile, and tablet user who visited www.feldmannimports.com, maximizing the scope of the experiment at all times of day without any pause. We ran the experiment across multiple weekends and for more than two weeks to ensure we collected all kinds of traffic. We allocated 25% of the audience to each of the variants from the very beginning so that every variant received an equal share of incoming traffic.
Section 4: Results & Business Insight
Performance by variant:
1. The variant ‘Green - Claim Your Reward Card’ outperformed the control, achieving a 96% probability to beat baseline, and outperformed every other variant with a 92% ‘probability to be best’. The winning variant’s performance remained statistically significant for more than 3 days.
2. The winning variant’s conversion rate shows the least overlap with the other variants across both the middle 95% and middle 50% ranges. Hence its probabilities rise, making it the best performer of all.
3. The winning variant also shows an improvement of up to 279% over the original. This figure comes from Bayesian inference models, which take into account the accuracy of each variant’s performance over time. Google Optimize can model the many factors that impact test results and can deliver results more quickly even in low-traffic experiments, since it does not require minimum sample sizes and can draw on other aspects of the experiment data.
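Optimize’s ‘probability to beat baseline’ comes from Bayesian inference over the observed conversions. A minimal sketch of the idea using Beta posteriors and Monte Carlo sampling; the conversion counts below are hypothetical illustrations, not the experiment’s actual data:

```python
import random

# Hedged sketch of Bayesian A/B evaluation, as used conceptually by Optimize.
# Conversion counts are hypothetical, not the real experiment data.
# With a Beta(1, 1) prior, the posterior for a conversion rate is
# Beta(1 + conversions, 1 + non-conversions).
random.seed(0)
N_SAMPLES = 100_000

def posterior_samples(conversions, sessions):
    """Draw Monte Carlo samples from the Beta posterior of the conversion rate."""
    return [random.betavariate(1 + conversions, 1 + sessions - conversions)
            for _ in range(N_SAMPLES)]

baseline = posterior_samples(conversions=60, sessions=1300)   # hypothetical
variant = posterior_samples(conversions=110, sessions=1300)   # hypothetical

# Probability to beat baseline: fraction of samples where the variant wins.
p_beat = sum(v > b for v, b in zip(variant, baseline)) / N_SAMPLES
print(f"Probability to beat baseline: {p_beat:.1%}")
```

Because the posterior sharpens as sessions accumulate, this probability naturally reflects both the size of the lift and the amount of evidence behind it.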
The report below comes from the Optimize experiment reports in Google Analytics. The winning variant more than doubled the click-through rate over the 18-day span, an increase of 132% overall.
All Users (GA report): The winning variant also increased Search Form completions (lead form completions) from 1.02% to 1.71% and increased the overall Goal Conversion Rate from 38.93% to 41.11%.
All Users Site Usage (GA report): The winning variant decreased the bounce rate from 15.47% to 13.21%.
Tablet and Desktop Users (GA report): The winning variant increased Search Form completions (lead form completions) from 1.36% to 2.57%.
New Users (GA report): The winning variant increased Search Form completions (lead form completions) from 2.41% to 3.65%.
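The relative uplifts behind these reported rates can be verified directly; for instance, the all-users lead-form lift from 1.02% to 1.71% is roughly a 68% relative increase. Using the figures from this section:

```python
# Relative uplifts computed from the GA figures reported in this section.
def relative_uplift(before, after):
    """Percentage change from `before` to `after`."""
    return (after - before) / before * 100

print(f"All users, lead form:       {relative_uplift(1.02, 1.71):+.1f}%")
print(f"Tablet/desktop, lead form:  {relative_uplift(1.36, 2.57):+.1f}%")
print(f"New users, lead form:       {relative_uplift(2.41, 3.65):+.1f}%")
print(f"Bounce rate:                {relative_uplift(15.47, 13.21):+.1f}%")
```

The negative figure for bounce rate is the desirable direction: fewer visitors left the site without interacting.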
This experiment helped Intice finalize a more intuitive design update for its Leadmaker conversion product, which will help dealerships gain more leads in a shorter amount of time and will ultimately help optimize the customer journey. The results for Feldmann Imports will act as a catalyst for implementing a similar design on other dealership websites as well. With Optimize, not only can we increase the overall conversion rate, but we can now also continuously improve upon those results and prove it with a level of detail that simply was not possible previously. Next steps:
1. Implement the changes from the winning variant across all dealerships to grow lead conversions across the US.
2. Increase the count of strategic Optimize experiments on dealership websites to minimize CRO and UI/UX bottlenecks.
3. Run multiple experiments simultaneously (without interfering with each other in functionality or performance) on various dealership websites, so that we can find the best variants across web touch points and implement the results (wherever possible) on other dealerships’ sites. This will increase the overall lead count, improve the quality of converted leads, and bring more shoppers into dealership showrooms.