Guide experiments allow you to compare up to two guides to determine which performs better against a specified metric. You can conduct experiments with different activation options, display methods, and platforms (for example, mobile) within the same application.
Note: This feature is in opt-in open beta and will be gradually added to Pendo subscriptions.
Prerequisites
To use guide experiments, you must meet the following requirements:
- Pendo Guide Creator or Content Editor user role.
- Pro or Enterprise Pendo subscription.
Set up an experiment
Select guides
A guide can be added to only one experiment at a time, and only previously created guides can be added to an experiment. We recommend fully building out and testing a guide before using it in an experiment.
Guides can’t be edited while the experiment is active. After it’s added, the guide inherits the settings of the experiment, including:
- The guide status is set to Draft, even if it was originally set to Public.
- The guide segment is set to Everyone until changed in the experiment.
- The guide schedule is replaced with the experiment duration.
While the experiment status is set to Draft, all guide content, activation, and localization can still be edited. After the experiment status is set to Active, the guide can't be edited.
You can create a new experiment from the experiments page or from the details page for a guide you want to add to the experiment.
From the Experiments page
- Go to Guides > Experiments and select Create experiment.
- Select the first guide you want to compare from the guides list, then select Add to experiment.
- Select the second guide you want to compare from the guides list, then select Create experiment.
From the guide details page
- Go to the details page for the guide that you want to add to the experiment.
- Select Create A/B test.
- Select the second guide you want to compare from the guides list, then select Create A/B test.
Configure experiment settings
After you’ve selected your two guides to compare in this experiment, you're prompted to configure the experiment settings.
To do so, from the experiment details page, select Edit, then set the following.
Select the key metric and attribution window
The Comparison metric is the conversion event that the user completes after viewing one of the guides. This is the criterion for determining which guide in the experiment performed better. For example, if you're comparing two abandoned cart messages aimed at users who didn't complete their purchase, the conversion event might be a click on the purchase button. The comparison metric can be a Page, Feature, or Track Event.
Next, set the Attribution window. This is the defined period of time during which the guide view is counted toward the conversion goal. The maximum Attribution window you can set is 14 days.
Note: The attribution window starts based on the last time the user saw the guide. If repeat views are enabled, or an embedded guide is selected, then the attribution window resets based on the last time the guide was viewed.
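The attribution logic described above can be sketched as follows. This is illustrative only: Pendo performs this calculation internally, and the function name and signature here are assumptions for the sketch.

```python
from datetime import datetime, timedelta

def conversion_counts(last_view: datetime, conversion_time: datetime,
                      window_days: int = 14) -> bool:
    """Return True if the conversion happened within the attribution
    window after the visitor's *last* view of the guide.

    Because the window is anchored to the last view, repeat views
    (or views of an embedded guide) effectively reset it.
    """
    window_end = last_view + timedelta(days=window_days)
    return last_view <= conversion_time <= window_end
```

For example, a conversion 10 days after the last view counts toward a 14-day window, while one 20 days later does not.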
Set the experiment duration
The Duration is how long the experiment runs after it's started. After the duration window ends, all guides in the experiment are automatically set to Disabled and the experiment completes. The maximum duration is three months.
Set the segment and distribution
Select a Segment that includes the desired audience for this experiment. For more information about segments, including steps to create one, see Segments.
For each guide, set the Distribution percentage. This is the percentage of time that each guide is shown to visitors included in the segment for this experiment. If you're unsure what percentage to set, we recommend keeping the default 50/50 distribution.
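The distribution setting behaves like a weighted random split between the two guides. The sketch below is a simplified illustration of that idea, not Pendo's actual assignment logic, and the function name is an assumption.

```python
import random

def assign_guide(weight_a=0.5, rng=None):
    """Return "A" with probability weight_a, otherwise "B".

    Illustrative only: sketches the weighted split that the
    Distribution percentage controls. Pendo performs the real
    assignment when a visitor qualifies for the experiment.
    """
    rng = rng or random.Random()
    return "A" if rng.random() < weight_a else "B"
```

With the default 50/50 distribution, roughly half of the qualifying visitors see each guide, which keeps the comparison balanced.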
Confirm settings
After configuring these settings, select Save to confirm and close the configuration window.
Conduct the experiment
Activate the experiment
When you're ready to start the experiment, switch the experiment status to Active. Doing so automatically sets both guide statuses to Public, publishing them to the segment set for this experiment.
Note: Guides in the experiment can't be edited while the experiment is Active. If you need to edit guides in an active experiment, you must manually switch the experiment status from Active to Completed.
Complete the experiment
The experiment automatically moves to the Completed status when the experiment duration expires or when you manually switch the experiment status from Active to Completed. When the experiment moves to Completed, all guides in the experiment are automatically set to Disabled.
While in the Completed status, data may still need time to finish processing before it's ready. This accounts for users who saw a guide on the last day of the experiment duration but still have time to complete the conversion event within the experiment's attribution window.
Guides in the experiment are now “released” from the experiment and can be edited and added to new experiments.
Determine the winning guide
After an experiment is set to Active, data starts to populate as visitors view the guides. We recommend waiting until the end of the duration before drawing any conclusions based on the data.
This data shows the rate at which users are performing the conversion event within the attribution window after viewing either Guide A or Guide B. The guide with the higher conversion rate is the recommended winner, though you can manually choose the winning guide.
The conversion rate for each guide is calculated by dividing the number of visitors who saw the guide and successfully completed the conversion event by the total number of visitors who saw the guide.
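As a sketch of that arithmetic (Pendo computes this for you; the function below only illustrates the conventional completions-over-total-viewers calculation):

```python
def conversion_rate(converted, total_viewers):
    """Conversion rate: visitors who saw the guide and completed the
    conversion event, divided by all visitors who saw the guide.

    Illustrative only; guards against a zero-viewer edge case.
    """
    if total_viewers == 0:
        return 0.0
    return converted / total_viewers
```

For example, if 200 visitors saw Guide A and 30 of them completed the conversion event within the attribution window, Guide A's conversion rate is 15%.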
Promote the winning guide
After the experiment ends, you may choose to promote the winning guide. This automatically publishes the guide to the selected segment, and unpublishes the other guide.
- Select Promote next to the guide you wish to publish.
- In the pop-up that appears, confirm whether you want to publish the guide to the remainder of the originally selected segment or choose a new segment.
- Select Promote to confirm the segment, publish the selected guide, and unpublish the unselected guide.
Manage experiments
View all experiments in the experiments list
To view a list of all experiments, go to Guides > Experiments.
Delete an experiment
- Go to Guides > Experiments.
- From the list of experiments, scroll to or search for the experiment you want to delete.
- Select the more menu next to the status dropdown, then select Delete experiment.