Description: This tool provides a detailed overview of the performance of comparison tests (A/B tests or split tests) created within the system, helping you identify the most effective version.
What is it for? (Practical examples)
Use it to make data-driven decisions and optimize the user experience on your site:
- Winner identification: Discover unequivocally which version of an element (e.g., a headline, an image, a button) achieves the best results in terms of clicks or attention time.
- Effectiveness monitoring: Keep track of the aggregated performance of all active or historical tests, monitoring the total number of interactions generated.
- Conversion optimization: Compare engagement metrics (reading time, clicks) to maximize the effectiveness of business-critical content.
Main features
1. Test management and display
The main screen shows a list of all tests performed or ongoing. Each row provides a quick summary of performance:
- Title: The name assigned to the test.
- Elements: The number of versions (A, B, C, etc.) that were compared in that test.
- Total clicks: The sum of clicks received by all elements in the test.
- Total views: The sum of views received by all elements in the test.
You can delete the data for a specific test directly from the grid if it is no longer needed for historical analysis.
2. Detailed report analysis
By selecting a test from the list, you access the detailed report comparing the performance of each individual element (version) that participated in the test.
Basic metrics per element
For each tested version, the core interaction metrics are displayed:
| Icon | Metric | Description |
|---|---|---|
| Eye | Number of views | How many times the element was shown to users. |
| Pointer | Clicks per view | The average number of clicks received per page view. |
| Reader | Average reading time | The average time (in seconds) that the user's attention (tracked via the mouse pointer) remained on the element. |
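The per-element metrics above can be derived from raw counters. A minimal sketch, assuming hypothetical field names (`views`, `clicks`, `total_reading_seconds`) rather than the tool's actual data model:

```python
# Hypothetical sketch: deriving the basic per-element metrics from raw counts.
# Field names are illustrative, not the reporting tool's real schema.

def clicks_per_view(clicks: int, views: int) -> float:
    """Average clicks received per page view."""
    return clicks / views if views else 0.0

def average_reading_time(total_reading_seconds: float, views: int) -> float:
    """Average seconds of attention per view."""
    return total_reading_seconds / views if views else 0.0

element_a = {"views": 1200, "clicks": 96, "total_reading_seconds": 8400.0}
print(clicks_per_view(element_a["clicks"], element_a["views"]))          # 0.08
print(average_reading_time(element_a["total_reading_seconds"],
                           element_a["views"]))                          # 7.0
```

The zero-view guard matters in practice: a version that was never shown should report 0 rather than raise a division error.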
Effectiveness Analysis (Percentages)
This section compares each element's performance against the test totals, highlighting in green (success) the version that achieved the best result for each metric.
| Metric | Description | Success indicator |
|---|---|---|
| Clicks | Percentage of clicks relative to total views. | The element with the highest click percentage. |
| Reading time | Percentage of the total reading time spent focused on the element. | The element that held attention the longest. |
| Number of readers | Percentage of people who focused on the element. | The element that attracted attention from the most users. |
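The effectiveness analysis described above amounts to computing each element's share of the test total and flagging the highest share per metric. A minimal sketch under assumed input names (the data structure is illustrative, not the tool's actual model):

```python
# Hypothetical sketch of the effectiveness analysis: each element's share of
# the test totals, with the best performer per metric flagged as the winner.

elements = {
    "A": {"clicks": 96,  "reading_seconds": 8400.0, "readers": 900},
    "B": {"clicks": 144, "reading_seconds": 6000.0, "readers": 750},
}

def share(metric: str) -> dict[str, float]:
    """Each element's percentage of the test total for one metric."""
    total = sum(e[metric] for e in elements.values())
    return {name: 100.0 * e[metric] / total for name, e in elements.items()}

for metric in ("clicks", "reading_seconds", "readers"):
    percentages = share(metric)
    winner = max(percentages, key=percentages.get)  # shown in green in the UI
    print(metric, percentages, "winner:", winner)
```

Note that the winner can differ per metric, as in this sample: B wins on clicks while A wins on reading time and readers, which is exactly why the report highlights each metric independently.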
Web Tracking (Tracked Events)
If the test is configured to track specific events (e.g., "Scroll up to 50%", "Mouse over", "Video opening"), this section shows the number and percentage of such events for each element.
This helps you understand whether an element not only generates clicks but also triggers the specific interactions that contribute to the final goal.
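The tracked-event figures can be read as a rate per element. A minimal sketch, assuming illustrative event names and an events-per-views percentage (the tool may define the percentage against a different denominator):

```python
# Hypothetical sketch: per-element tracked-event counts and percentages.
# Event names and the events-per-views rate are illustrative assumptions.

elements = {
    "A": {"views": 1200, "events": {"Scroll up to 50%": 480, "Mouse over": 300}},
    "B": {"views": 1000, "events": {"Scroll up to 50%": 520, "Mouse over": 450}},
}

for name, data in elements.items():
    for event, count in data["events"].items():
        rate = 100.0 * count / data["views"]
        print(f"{name}: {event}: {count} events ({rate:.1f}% of views)")
```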
Automatic integrations
This module is tightly integrated with the system’s Visual Builder. Data are collected automatically whenever a user views or interacts with elements configured for an A/B Test. No manual setup is required to start data collection; just create the test via the visual building interface.