Testing

Learn how to test Crobox Campaigns using control groups, linked campaigns, and persuasion-based content variants to optimize campaign performance and messaging effectiveness.

Crobox gives you multiple ways to test your campaigns — from A/B testing with control groups to testing different messages based on persuasion principles. This guide explains the available testing options, when to use them, and how to set them up step by step.

Why Test Campaigns?

Testing allows you to:

  • Prove campaign effectiveness before scaling

  • Compare messaging approaches using persuasion tactics

  • Optimize continuously based on real-world performance and your audience

You can implement testing at three levels in the Crobox campaign setup; the details for each are below.


A/B Test with a Control Group

📍Where to set up: Setup tab → “Performance Tracking” section

What It Does

Splits your traffic into:

  • Crobox group: Sees your campaign

  • Control group: Sees no campaign

This helps you measure the true impact of a campaign by comparing exposed vs. non-exposed users.

How to Set It Up

  1. Go to your campaign’s Setup tab

  2. Ensure “Enable Performance Tracking” is turned ON

  3. Adjust the Traffic Ratio slider (e.g., 50% exposed, 50% control)

  4. Save the campaign draft

Best for validating whether your campaign has a meaningful impact compared to showing nothing at all.
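Crobox handles the split internally, but the effect of the Traffic Ratio slider can be pictured as deterministic hash-based bucketing: hashing a stable visitor ID onto a number between 0 and 1 and comparing it to the exposed ratio. This is an illustrative sketch, not Crobox's actual implementation; the `visitor_id` values and function name are hypothetical.

```python
import hashlib

def assign_group(visitor_id: str, exposed_ratio: float = 0.5) -> str:
    """Bucket a visitor into the exposed ("crobox") or control group.

    Hashing the visitor ID makes the assignment sticky: the same
    visitor lands in the same group on every page view.
    """
    digest = hashlib.sha256(visitor_id.encode()).digest()
    bucket = int.from_bytes(digest[:8], "big") / 2**64  # float in [0, 1)
    return "crobox" if bucket < exposed_ratio else "control"

# With a 50/50 ratio, roughly half of a large visitor population is exposed
groups = [assign_group(f"visitor-{i}") for i in range(10_000)]
print(f"exposed share: {groups.count('crobox') / len(groups):.2%}")
```

The key property for a valid A/B test is that assignment is random with respect to behavior but stable per visitor, so each person has one consistent experience.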


Linked Campaign Testing (Advanced Settings)

For more advanced setups, you can compare two or more campaigns directly by linking them and aligning their priority. This method supports both standard A/B testing and A/B/C testing with a control group.

📍Where to set up: Setup tab → “Advanced”

Option 1: A/B Test Without a Control Group

Linked Campaigns, Performance Tracking Off

You can run a head-to-head comparison between two (or more) campaigns in the same placeholder without using a control group. To implement:

  1. Turn off Performance Tracking in each campaign’s Setup tab

  2. Set both campaigns to the same Campaign Priority (copy Campaign A priority and paste into Campaign B within the advanced section)

  3. Activate the toggle for "Link this campaign performance to other campaigns", and select the campaign(s)

  4. Ensure campaigns target the same placeholder

  5. When published, users will see either Campaign A or Campaign B based on Crobox’s internal rotation logic

This method doesn't show you a true control group baseline — it only compares the two active campaigns

Campaigns must target the same placeholder, and have the same priority set in advanced settings
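The rotation logic for linked, equal-priority campaigns is internal to Crobox, but its effect can be pictured as hashing each visitor onto one of the linked campaigns so that traffic divides evenly and each visitor consistently sees the same campaign. A hypothetical sketch:

```python
import hashlib

def pick_campaign(visitor_id: str, linked_campaigns: list[str]) -> str:
    """Rotate visitors evenly across equal-priority linked campaigns."""
    digest = hashlib.sha256(visitor_id.encode()).digest()
    index = int.from_bytes(digest[:8], "big") % len(linked_campaigns)
    return linked_campaigns[index]

# Each visitor is pinned to one campaign; the population splits ~50/50
shown = [pick_campaign(f"visitor-{i}", ["Campaign A", "Campaign B"]) for i in range(10_000)]
print(f"Campaign A share: {shown.count('Campaign A') / len(shown):.2%}")
```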

Option 2: A/B/C Test With a Control Group

Linked Campaigns, Performance Tracking On

This setup lets you compare:

  • Campaign A

  • Campaign B (linked to A)

  • A control group with no campaign

To implement:

  1. Turn on Performance Tracking in both campaigns

  2. Set both campaigns to the same Campaign Priority (copy Campaign A priority and paste into Campaign B within the advanced section)

  3. In Campaign A, activate toggle “Link this campaign to another for performance tracking” and select Campaign B

  4. Ensure campaigns target the same placeholder

  5. When published, users will see Campaign A, Campaign B, or no campaign, based on Crobox’s internal rotation logic

With this setup:

  • Users are randomly split into three groups: one sees Campaign A, one sees Campaign B, and one sees nothing (control)

  • Results will include clear control group data for performance benchmarking
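Once results come in, the control group gives you a baseline for benchmarking: each campaign's performance can be expressed as relative uplift over the control. A simple illustration (the CTR figures below are made up):

```python
def relative_uplift(group_ctr: dict[str, float], baseline: str = "control") -> dict[str, float]:
    """Relative CTR uplift of each campaign versus the control group."""
    base = group_ctr[baseline]
    return {name: (ctr - base) / base for name, ctr in group_ctr.items() if name != baseline}

# Hypothetical results: control converts at 5.0% with no campaign shown
uplift = relative_uplift({"Campaign A": 0.060, "Campaign B": 0.054, "control": 0.050})
print(uplift)  # Campaign A: +20%, Campaign B: +8% relative to control
```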


Testing Content Variants with Persuasion Messaging

📍Where to set up: Content tab → “Personalize Content”

What It Does

Test multiple message variations within the same campaign. Ideal for experimenting with persuasion tactics like urgency, social proof, scarcity, or authority.

How to Set It Up

  1. Go to your campaign’s Content tab

  2. Click “Personalize Content”

  3. Click Add a new message

  4. Give each variant a clear name (e.g., “Urgency - Low Stock”, “Social Proof - Best Seller”)

  5. Add your content for each message

  6. (Optional) Apply filters to target specific segments for each message

  7. Localize each variant as needed

  8. (Optional but recommended in pilot phase) Make sure Performance Tracking is enabled in the Setup tab to measure impact

Example Persuasion Principle Variants

| Persuasion Principle | Example Message | Filters Applied |
| --- | --- | --- |
| Urgency | “Hurry! Only a few left in stock” | Stock < 1 |
| Social Proof | “Trending” | Top 30% bought - last month (/category) |
| Scarcity | “Limited Edition – While Supplies Last” | Product ID / Product Tag |
| Scarcity | “Members receive an extra 15% off!” | Visitor Segment |
| Authority | “Expert-recommended pick” | Product ID / Product Tag |
| Luxury trait | “Iconic” | Product ID / Product Tag |
| Reasons Why | “100% secure payments with: Visa, MasterCard, PayPal, XXX” | Page Type |
| Endowment Effect | “Almost there! You're just clicks away from owning your new gear!” | Page Type |
Use Content Tags to group variants for better reporting across campaigns.

Why Use This

  • Fastest way to experiment without duplicating campaigns

  • Clear analytics for each message

  • Ideal for iterating on messaging based on shopper psychology


Choosing the Right Test Method

| Goal | Method | Setup Area |
| --- | --- | --- |
| Test campaign vs. no campaign | A/B with Control Group | Setup tab |
| Compare two campaign strategies or formats | Linked Campaign A/B Test | Setup (Advanced) |
| Test copy, messages, persuasion tactics | Content Variants | Content tab |

Best Practices for Testing

  • Use 50/50 traffic splits for faster results

  • Test one variable at a time (message, design, or placement)

  • Let tests run long enough to reach statistically reliable results (duration depends on your traffic)

  • Review results in Analytics and iterate

  • Tag variants clearly for cleaner reporting
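"Long enough to be reliable" can be sanity-checked with a standard two-proportion z-test on conversions per group. This is a generic statistics sketch, not a Crobox feature, and the conversion numbers below are made up:

```python
from statistics import NormalDist

def two_proportion_p_value(successes_a: int, n_a: int, successes_b: int, n_b: int) -> float:
    """Two-sided z-test: how likely is a rate gap this large by chance alone?"""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = (pooled * (1 - pooled) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_a - p_b) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Hypothetical: 5.0% vs 4.2% conversion with 10,000 visitors per variant
p = two_proportion_p_value(500, 10_000, 420, 10_000)
print(f"p-value: {p:.4f}")  # below 0.05 suggests the difference is likely real
```

If the p-value stays above your threshold, the test simply has not collected enough traffic yet to call a winner.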
