# Testing

Crobox gives you multiple ways to test your campaigns — from A/B testing with control groups to testing different messages based on persuasion principles. This guide explains the available testing options, when to use them, and how to set them up step by step.

#### Why Test Campaigns?

Testing allows you to:

* **Prove campaign effectiveness** before scaling
* **Compare messaging approaches** using persuasion tactics
* **Optimize continuously** based on real-world performance and your audience

You can implement testing at three levels of Crobox campaign setup. Find the details for each below.

***

## A/B Test with a Control Group

**📍Where to set up**: `Setup` tab → “Performance Tracking” section

#### What It Does

Splits your traffic into:

* **Crobox group**: Sees your campaign
* **Control group**: Sees no campaign

This helps you measure the **true impact** of a campaign by comparing exposed vs. non-exposed users.

#### How to Set It Up

1. Go to your campaign’s **Setup** tab
2. Ensure **“Enable Performance Tracking”** is turned ON
3. Adjust the **Traffic Ratio** slider (e.g., 50% exposed, 50% control)
4. Save the campaign draft

{% hint style="success" %}
Best for validating whether your campaign is having a meaningful impact vs. showing nothing at all
{% endhint %}
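Conceptually, the Traffic Ratio works like a random coin flip per visitor. The sketch below illustrates that idea only; it is an assumption about how such a split behaves statistically, not Crobox's actual assignment code.

```python
import random

def assign_group(traffic_ratio=0.5, rng=random.random):
    """Illustrative control-group split (assumed logic, not Crobox's code).

    traffic_ratio is the fraction of visitors exposed to the campaign;
    the remainder form the control group that sees no campaign.
    """
    return "crobox" if rng() < traffic_ratio else "control"

# With a 50/50 ratio, roughly half of visitors land in each group.
random.seed(42)
counts = {"crobox": 0, "control": 0}
for _ in range(10_000):
    counts[assign_group(0.5)] += 1
```

Because assignment is random, the two groups are comparable on everything except campaign exposure, which is what lets you attribute any performance difference to the campaign itself.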

***

## Linked Campaign Testing (Advanced Settings)

For advanced testing, you can compare two or more campaigns directly by linking them and aligning their priority. This method supports both standard A/B testing and A/B/C testing with a control group.

**📍Where to set up**: `Setup` tab → “Advanced”

### **Option 1: A/B Test Without a Control Group**

***Linked Campaigns, Performance Tracking Off***

You can run a head-to-head comparison between two (or more) campaigns in the same placeholder without using a control group. To implement:

1. Turn off Performance Tracking in each campaign’s Setup tab
2. Set both campaigns to the same Campaign Priority (copy Campaign A priority and paste into Campaign B within the advanced section)
3. Activate the toggle for "**Link this campaign performance to other campaigns**" and select the campaign(s)
4. Ensure campaigns target the same placeholder
5. When published, users will see either Campaign A or Campaign B based on Crobox’s internal rotation logic

{% hint style="warning" %}
This method doesn't show you a true control group baseline — it only compares the two active campaigns
{% endhint %}

{% hint style="danger" %}
Campaigns must target the same placeholder and have the same priority set in advanced settings
{% endhint %}
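With Performance Tracking off and equal priorities, every visitor sees one of the linked campaigns. A minimal sketch of even rotation between equal-priority campaigns, assuming a uniform random pick (Crobox's internal rotation logic may differ):

```python
import random

def pick_campaign(linked_campaigns, rng=random.choice):
    """Sketch of even rotation between equal-priority linked campaigns.

    Illustrative assumption only: a uniform random choice per visitor,
    which is not necessarily how Crobox rotates campaigns internally.
    """
    return rng(linked_campaigns)

# Over many visitors, exposure divides roughly evenly between A and B.
random.seed(7)
shown = [pick_campaign(["Campaign A", "Campaign B"]) for _ in range(1_000)]
```

Note that every visitor is exposed to *some* campaign here, which is why this option gives you a comparison between strategies but no "no campaign" baseline.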

### **Option 2: A/B/C Test With a Control Group**

***Linked Campaigns, Performance Tracking On***

This setup lets you compare:

* Campaign A
* Campaign B (linked to A)
* A control group with no campaign

To implement:

1. Turn on Performance Tracking in both campaigns
2. Set both campaigns to the same Campaign Priority (copy Campaign A priority and paste into Campaign B within the advanced section)
3. In Campaign A, activate toggle **“Link this campaign to another for performance tracking”** and select Campaign B
4. Ensure campaigns target the same placeholder
5. When published, users will see either Campaign A, Campaign B, or no campaign based on Crobox’s internal rotation logic

With this setup:

* Users are randomly split into three groups: one sees Campaign A, one sees Campaign B, and one sees nothing (control)
* Results will include clear control group data for performance benchmarking
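The three-way split above can be pictured as two stages: the traffic ratio first decides exposed vs. control, and exposed visitors then rotate between the linked campaigns. This is an illustrative model only; the exact split proportions and mechanics are Crobox internals.

```python
import random

def assign_abc(traffic_ratio=2 / 3, rng=random):
    """Illustrative A/B/C split (assumed logic, not Crobox's code).

    Stage 1: the traffic ratio decides exposed vs. control.
    Stage 2: exposed visitors rotate evenly between the linked campaigns.
    """
    if rng.random() >= traffic_ratio:
        return "control"  # sees no campaign
    return rng.choice(["Campaign A", "Campaign B"])

# With a 2/3 exposure ratio, visitors split roughly 1/3 : 1/3 : 1/3.
random.seed(11)
groups = [assign_abc() for _ in range(9_000)]
```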

***

## Testing Content Variants with Persuasion Messaging

**📍Where to set up**: `Content` tab → “Personalize Content”

#### What It Does

Test **multiple message variations** within the same campaign. Ideal for experimenting with **persuasion tactics** like urgency, social proof, scarcity, or authority.

#### How to Set It Up

1. Go to your campaign’s **Content** tab
2. Click **“Personalize Content”**
3. Click **Add a new message**
4. Give each variant a clear name (e.g., “Urgency - Low Stock”, “Social Proof - Best Seller”)
5. Add your content for each message
6. *(Optional)* Apply filters to target specific segments for each message
7. Localize each variant as needed
8. *(Optional but recommended in pilot phase)* Make sure **Performance Tracking** is enabled in the Setup tab to measure impact

#### Example Persuasion Principle Variants

| Persuasion Principle | Example Message                                                    | Filters Applied                         |
| -------------------- | ------------------------------------------------------------------ | --------------------------------------- |
| **Urgency**          | “Hurry! Only a few left in stock”                                  | Low stock threshold                     |
| **Social Proof**     | “Trending”                                                         | Top 30% bought - last month (/category) |
| **Scarcity**         | “Limited Edition – While Supplies Last”                            | Product ID / Product Tag                |
| **Scarcity**         | "Members receive an extra 15% off!"                                | Visitor Segment                         |
| **Authority**        | “Expert-recommended pick”                                          | Product ID / Product Tag                |
| **Luxury trait**     | "Iconic"                                                           | Product ID / Product Tag                |
| **Reasons Why**      | "100% secure payments with: Visa, MasterCard, PayPal, XXX"         | Page Type                               |
| **Endowment Effect** | "Almost there! You're just clicks away from owning your new gear!" | Page Type                               |

{% hint style="info" %}
Use Content Tags to group variants for better reporting across campaigns.
{% endhint %}
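The filter column above can be read as "show this message when the product context matches the condition". The sketch below illustrates that matching idea; the field names (`stock`, `is_top_seller`), the stock threshold, and the fallback variant are all hypothetical, not Crobox's data model.

```python
# Hedged sketch of per-message filters, in the spirit of the table above.
VARIANTS = [
    {"name": "Urgency - Low Stock",
     "text": "Hurry! Only a few left in stock",
     "matches": lambda p: p.get("stock", 0) < 5},        # assumed threshold
    {"name": "Social Proof - Best Seller",
     "text": "Trending",
     "matches": lambda p: p.get("is_top_seller", False)},
    # Catch-all so every product gets some message.
    {"name": "Fallback", "text": "", "matches": lambda p: True},
]

def select_message(product):
    """Return the first variant whose filter matches the product context."""
    return next(v for v in VARIANTS if v["matches"](product))
```

For example, `select_message({"stock": 3})` picks the urgency variant, while a well-stocked best seller falls through to the social-proof variant.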

#### Why Use This

* Fastest way to experiment without duplicating campaigns
* Clear analytics for each message
* Ideal for **iterating on messaging** based on shopper psychology

***

## Choosing the Right Test Method

| Goal                                       | Method                   | Setup Area       |
| ------------------------------------------ | ------------------------ | ---------------- |
| Test campaign vs. no campaign              | A/B with Control Group   | Setup tab        |
| Compare two campaign strategies or formats | Linked Campaign A/B Test | Setup (Advanced) |
| Test copy, messages, persuasion tactics    | Content Variants         | Content tab      |

### Best Practices for Testing

* Use **50/50 traffic splits** for faster results
* **Test one variable at a time** (message, design, or placement)
* Let tests run long enough to be reliable (depending on traffic)
* Review results in **Analytics** and iterate
* Tag variants clearly for cleaner reporting
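"Long enough to be reliable" can be made concrete with the standard two-proportion sample-size approximation (95% confidence, 80% power). The function and figures below are generic statistics, not Crobox guidance; use them only as a rough sense of scale.

```python
import math

def min_sample_per_group(base_rate, relative_lift,
                         alpha_z=1.96, power_z=0.84):
    """Rough per-group sample size to detect a relative lift in conversion
    rate, using the standard two-proportion approximation (illustrative)."""
    p1 = base_rate
    p2 = base_rate * (1 + relative_lift)
    p_bar = (p1 + p2) / 2
    numerator = (alpha_z * math.sqrt(2 * p_bar * (1 - p_bar))
                 + power_z * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p2 - p1) ** 2)

# Detecting a 10% relative lift on a 3% conversion rate takes tens of
# thousands of visitors per group -- small lifts need long-running tests.
n = min_sample_per_group(0.03, 0.10)
```

This is why a 50/50 split helps: it gets both groups to the required sample size in the least total time.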
