Implementing an experiment

In this guide, we will look at how to integrate Sevensix into your custom JavaScript frontend. Before proceeding, make sure you have signed up and have your project ID ready. If you're new to Sevensix, start by diving into the Sevensix method for working with CRO.

Before implementing an A/B test in your React application, ensure that you have installed sevensix-react. For installation instructions, refer to the installation guide.

1. Implementing an A/B experiment

Sevensix streamlines the setup of A/B tests with a developer-centric approach. Use the AbExperiment component to wrap both your control and experimental variations.

1a. Inserting the wrapper component

When you create an experiment in the Sevensix dashboard, a code snippet like the one below is generated for you, ready to copy and paste straight into your project.

TypeScript

import { AbExperiment } from 'sevensix-react'

export default function AbTest() {
  return (
    <AbExperiment
      projectSlug="example-project" // The project slug
      expId="ABX-10" // The experiment ID
      Original={() => <>Original code ...</>} // The original code goes here
      Variant={() => <>Variant code ...</>} // The new variant goes here
    />
  )
}

The <AbExperiment /> component automatically renders the original and variant code in a 50/50 split. The Sevensix dashboard tracks the performance of each variant and provides you with the results.

Internally, the component uses the React useEffect hook to set a cookie on the initial render and dispatch an event that tracks exposure.
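
You do not need to implement any of this yourself, but as an illustration of the mechanism, here is a minimal sketch of how such a wrapper could work. It is not the library's actual source: the component name, cookie format, and tracking endpoint below are assumptions made for illustration only.

TypeScript

import { useEffect, useState, type ComponentType } from 'react'

type Variant = 'A' | 'B'

type Props = {
  projectSlug: string
  expId: string
  Original: ComponentType
  Variant: ComponentType
}

// Read an existing assignment from a cookie, or assign one at random (50/50).
// The cookie name and format are illustrative, not the library's real format.
function getOrAssignVariant(expId: string): Variant {
  const match = document.cookie.match(new RegExp(`ab_${expId}=(A|B)`))
  if (match) return match[1] as Variant
  const assigned: Variant = Math.random() < 0.5 ? 'A' : 'B'
  document.cookie = `ab_${expId}=${assigned}; path=/; max-age=2592000` // ~30 days
  return assigned
}

export function AbExperimentSketch({ projectSlug, expId, Original, Variant }: Props) {
  const [variant, setVariant] = useState<Variant | null>(null)

  useEffect(() => {
    const assigned = getOrAssignVariant(expId)
    setVariant(assigned)
    // Hypothetical exposure event; the real endpoint and payload will differ.
    fetch('https://api.example.com/events', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ type: 'exposure', projectSlug, expId, variant: assigned }),
    })
  }, [projectSlug, expId])

  // Render nothing until an assignment exists to avoid flashing the wrong variant.
  if (variant === null) return null
  return variant === 'A' ? <Original /> : <Variant />
}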

1b. Tracking the conversion

Sevensix streamlines conversion tracking by providing an easy-to-use function that you call at the point of conversion, such as a form submission or another specific user action. Call this function in both your original and variant code so that performance is measured for both variants.

TypeScript

import { conversion } from "sevensix-react";

// Handle conversion where you want to track it (e.g. form submit).
// Remember to call this function in both the original and variant code.
conversion({ experimentId: "ABX-10", projectId: "example-project" });
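
For example, the call could sit inside a form's submit handler. The form component below is illustrative; only the conversion import comes from sevensix-react.

TypeScript

import { type FormEvent } from 'react'
import { conversion } from 'sevensix-react'

export default function SignupForm() {
  function handleSubmit(event: FormEvent<HTMLFormElement>) {
    event.preventDefault()
    // Track the conversion for this experiment, then run your own submit logic.
    conversion({ experimentId: 'ABX-10', projectId: 'example-project' })
    // ...submit the form data here
  }

  return (
    <form onSubmit={handleSubmit}>
      <button type="submit">Sign up</button>
    </form>
  )
}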

2. Verifying an experiment

Once the experiment is configured, it must be verified to confirm it is implemented correctly. This involves integrating the experiment into your application and then validating it through the experiment setup workflow on the dashboard.

Technically, the experiment is verified once four event types have been registered: exposure A, conversion A, exposure B, and conversion B. After that, the experiment can be started.

3. Starting the experiment

Once setup is complete, you can activate the experiment from the dashboard, which starts tracking each variant's performance. The experiment will continue to collect data until it reaches the predetermined number of sessions.

Please note that, for efficiency, the application code does not check the experiment's status: it will keep rendering the original and variant components in an equal 50/50 split even after the experiment has concluded in the dashboard. You must therefore manually remove the experiment from your code once it has concluded.
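
For example, assuming the variant won, the component from step 1a would be reduced to just the winning code, with the <AbExperiment /> wrapper removed:

TypeScript

// After the experiment has concluded and (in this sketch) the variant won,
// drop the <AbExperiment /> wrapper and keep only the winning code.
export default function AbTest() {
  return <>Variant code ...</>
}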