Turning Gut Feel Into Data: How Exit-Intent Prompts Drove a 45–50% Conversion Lift

Overview

If a user is about to leave your website, could showing them a call to action, such as a popup, keep them there?

This was a question our client and our team shared: could a prompt not only keep users active, but ideally put them on the path to conversion?

The idea was fairly simple, but we needed a way to determine whether a user was about to leave the website and, once detected, run the experiment.

There would be a control group, but we also only wanted to serve the experiment to eligible users, i.e. those who looked like they were going to leave, and this is where some complexity and nuance came into play.

Detecting when users are “likely” leaving

What are the possible signals a user might make if they intend to leave the website?

Our team landed on the following key behaviours:

  • When a user's mouse leaves the browser's viewport, e.g. moving towards the browser toolbar or tabs

  • If a user is inactive for ‘x’ seconds on a mobile view, e.g. no touch, scroll, click, or key interactions

While there are other scenarios and behaviours that could be considered, we felt the above would be suitable for the experiment we wanted to run.
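As a rough sketch of the idea (not the production tag), the two signals map onto a couple of native listeners; the 15-second inactivity window below is an illustrative value, not the one used in the experiment:

// Desktop signal: the cursor leaves the viewport, e.g. heading for the tabs or toolbar.
document.addEventListener('mouseleave', function () {
  console.log('possible exit intent: mouseleave');
});

// Mobile signal: no interaction for a while; any activity resets the countdown.
var INACTIVITY_MS = 15000; // assumed window, tune per experiment
var inactivityTimer;
function resetInactivityTimer() {
  clearTimeout(inactivityTimer);
  inactivityTimer = setTimeout(function () {
    console.log('possible exit intent: inactivity timeout');
  }, INACTIVITY_MS);
}
['touchstart', 'scroll', 'click', 'keydown'].forEach(function (name) {
  document.addEventListener(name, resetInactivityTimer, true);
});
resetInactivityTimer();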

Putting it into practice was a little more involved. While we could use our experiment platform to inject a variation with JavaScript to monitor this behaviour, it would mean the experiment would be served to users regardless of their eligibility. We wanted these actions to occur first and only then trigger the experiment, serving either the control or the popup.

Leveraging Tag Management

Ideally we would consult and work with a client's development team to include custom code logic; however, this is not always practical, especially for experiments that are not necessarily intended for the long term.

Luckily, the wonderful world of Tag Management offers a way forward. Of course, any time custom code is added, it should be done with transparency, and by people who have both the front-end and tag management experience to deliver it while minimizing potential risks.

Using Google Tag Manager and a Custom HTML (CHTML) tag, we injected JavaScript functionality for all users, limited to fire only on the home page. This script monitors for the targeted user behaviour and, if a user becomes eligible, adds a hidden <div> tag to the page.

The addition of the <div> tag is used by the experiment platform as the trigger to run the experiment. The logic in the experiment platform is therefore pretty basic, simply using inbuilt options, in this case the platform's support for Single Page Applications, i.e. reacting to content changes. The popup logic and code (HTML, CSS, JS) were also packaged and delivered by the variation.
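As an illustration only, not the client's actual variation (the markup, copy, styling, and the /quote link are placeholders), that variation payload might look roughly like this:

// Runs inside the experiment variation once the platform detects the
// hidden .experiment_eligible_home_popup element added by the GTM tag.
(function () {
  function track(context, label) {
    var payload = { event: 'experiment_custom', event_context: context };
    if (label) { payload.event_label = label; }
    window.dataLayer = window.dataLayer || [];
    window.dataLayer.push(payload);
  }

  var overlay = document.createElement('div');
  overlay.style.cssText =
    'position:fixed;top:0;left:0;right:0;bottom:0;background:rgba(0,0,0,.5);' +
    'display:flex;align-items:center;justify-content:center;z-index:9999;';
  overlay.innerHTML =
    '<div style="background:#fff;padding:24px;max-width:360px;border-radius:8px;">' +
    '  <button id="exp-popup-close" style="float:right;">&times;</button>' +
    '  <p>Before you go, get a quick quote in under a minute.</p>' +
    '  <a id="exp-popup-cta" href="/quote">Get a quote</a>' +
    '</div>';
  document.body.appendChild(overlay);
  track('popup;show');

  // Clicking the dimmed background or the close icon dismisses the popup.
  overlay.addEventListener('click', function (e) {
    if (e.target === overlay) {
      document.body.removeChild(overlay);
      track('popup;close', 'overlay close');
    }
  });
  document.getElementById('exp-popup-close').addEventListener('click', function () {
    document.body.removeChild(overlay);
    track('popup;close', 'icon close');
  });
  document.getElementById('exp-popup-cta').addEventListener('click', function () {
    track('popup;click', 'get a quote');
  });
})();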

Experiment Setup:

GTM CHTML Tag:

Tag Name: CHTML - Utility - Experiment - User Leave Popup
Trigger: Page - DOM Ready

What does it do:

  • Checks to see if a user has already had the experiment and excludes them if they have

  • Limits the experiment to running once per page

  • Sets up native event listeners for:

    • Mouseleave

    • Touchstart

    • Scroll

    • Click

    • Keydown

  • Uses a timer, reset by user activity, to detect inactivity

  • Adds a hidden <div> with the specific class “experiment_eligible_home_popup” to the <body> if a user becomes eligible, for detection by the experiment platform

  • Pushes to the dataLayer if a user becomes eligible, for reporting and debugging purposes.

Code:

View on gist.github.com
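If the gist is unavailable, a minimal sketch of such a tag might look like the following. The storage key, the use of localStorage for the “already had the experiment” check, and the 15-second inactivity window are assumptions for illustration; the class name and dataLayer values match those described above.

<script>
  (function () {
    var STORAGE_KEY = 'experiment_home_popup_seen';         // assumed storage key
    var ELIGIBLE_CLASS = 'experiment_eligible_home_popup';  // class the experiment platform looks for
    var INACTIVITY_MS = 15000;                               // assumed inactivity window
    var fired = false;                                       // limit to once per page

    // Exclude users who have already had the experiment.
    try {
      if (window.localStorage && localStorage.getItem(STORAGE_KEY)) { return; }
    } catch (e) { /* storage unavailable; continue without the exclusion */ }

    function markEligible(reason) {
      if (fired) { return; }
      fired = true;

      // Hidden <div> the experiment platform uses as its activation signal.
      var flag = document.createElement('div');
      flag.className = ELIGIBLE_CLASS;
      flag.style.display = 'none';
      document.body.appendChild(flag);

      // dataLayer push for reporting and debugging.
      window.dataLayer = window.dataLayer || [];
      window.dataLayer.push({
        event: 'experiment_custom',
        event_context: 'home popup;eligible;' + reason
      });

      try { localStorage.setItem(STORAGE_KEY, '1'); } catch (e) {}
    }

    // Desktop signal: the cursor leaves the viewport (toolbar / tabs).
    document.addEventListener('mouseleave', function () { markEligible('mouseleave'); });

    // Mobile signal: no touch, scroll, click or key interaction for a while
    // (a production tag might additionally restrict this to mobile viewports).
    var inactivityTimer;
    function resetInactivityTimer() {
      clearTimeout(inactivityTimer);
      inactivityTimer = setTimeout(function () { markEligible('timeout'); }, INACTIVITY_MS);
    }
    var activityEvents = ['touchstart', 'scroll', 'click', 'keydown'];
    for (var i = 0; i < activityEvents.length; i++) {
      document.addEventListener(activityEvents[i], resetInactivityTimer, true);
    }
    resetInactivityTimer();
  })();
</script>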

Tracking Support

Using Google Analytics, both via the custom dataLayer and inbuilt triggers, we were able to place additional monitoring on the experiment. Due to its more complex setup, we wanted a way to check that the experience was working as intended.

These were the supporting GA events we captured:

  • Event: experiment_custom

    • event_context: home popup;eligible;timeout

    • event_context: home popup;eligible;mouseleave

    • event_context: popup;show

    • event_context: popup;close | event_label: overlay close / icon close

    • event_context: popup;click | event_label: get a quote

As you can see, we concatenated strings together and tracked them under a fairly generic event name. As the purpose of these events was to inform, we didn't want to use up additional custom dimensions or create a series of new events; rather, we wanted something we could reuse for other experiments in the future.
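During QA, these pushes can be checked straight from the browser console (assuming GTM's standard window.dataLayer array), for example:

// List every experiment push made so far on the page.
window.dataLayer
  .filter(function (e) { return e.event === 'experiment_custom'; })
  .map(function (e) { return e.event_context + (e.event_label ? ' | ' + e.event_label : ''); });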

Experiment Design

Outcome

Displaying the popup based on the leave conditions had a significant positive impact. The conversion rate improvement over the control was about 45–50%, based on the different goals we had set up. The experiment had a very high confidence score, and after the trial period we moved the variation to serve all users given the success.

The provided CHTML tag/script (with some minor modifications) could be adapted for your own experiments and/or analytics tracking. If you have any questions or issues, reach out to me on LinkedIn.

Takeaways

  • Intercepting intent at the moment of abandonment can give a last, relevant nudge instead of interrupting engaged users.

    • Showing the popup at leave moments lifted conversion by 45–50% versus not showing a prompt

  • Use a combination of platforms and tools to achieve your goals; leverage what each tool is great at and you can create some very interesting experiments.

  • Add additional tracking insights to your experiments; they can help debug your setup and give you ideas on how to shape your experiment's design and approach.
