Add a Link Experiment - Read Mode

Activation, Retention, Revert Rate

WE1.2.9, FY24-25

Author: Irene Florez
Affiliation: Product Analytics, Wikimedia Foundation
Published: 06/21/2025

This page in a nutshell

This analysis finds mixed results for Add a Link in Read Mode on the mobile web platform: a 1.7% decrease in constructive activation, a 5.1% decrease in constructive retention, and a 1.4% decrease in revert rate, all relative to the control group.

  • Constructive activation: mobile web newcomers in the treatment group were slightly less likely to make a first constructive article edit (-1.7% compared to the control); this difference is not statistically significant.
  • Constructive retention: treatment-group newcomers were less likely to be retained (-5.1% compared to the control).
  • Revert rate: treatment-group newcomers' edits were slightly less likely to be reverted (-1.4% compared to the control).

Introduction and Basics

Current full-page editing experiences require too much context, patience, and trial and error for many newcomers to contribute constructively. Can structured tasks have a greater impact on newcomer activation and retention if we surface them while newcomers are reading articles? Structured tasks are suggested edits broken down into step-by-step workflows with simple steps that make sense to newcomers, are easy to do on mobile devices, and may be assisted by machine learning. "Add a Link" recommendations exist on articles that are underlinked; some of these may be lower-view-count, shorter, and/or less developed articles. We look at the impact of giving newcomers on the pilot wikis (Spanish, French, Persian, Indonesian, Portuguese, Egyptian Arabic) a way to act on "Add a Link" suggestions while they read an article with an available suggestion. Note: Spanish Wikipedia joined at the start of the experiment but turned the feature off on April 6th.

We build on a previous 2021 study involving ten Wikipedias (Arabic, Bengali, Czech, Vietnamese, Russian, French, Polish, Romanian, Persian, Hungarian), which found that newcomers who received the Add a Link structured task were 11.7% more likely to make a first article edit than newcomers in the control group.

We conducted a parallel-group, randomized controlled experiment in which new accounts were assigned 1:1 to a control group or a treatment group; outcome measures were assessed at the end of the experiment via edit data. To be in the experiment, participants needed to visit at least one page eligible for an Add a Link structured task. Constructive activation is our primary metric; revert rate and constructive retention are secondary metrics. For all three metrics we analyzed the full dataset and report the findings below. Finally, as mobile is the focus of the 1.2 key result (KR), we focus on mobile results.

Findings Summary

The mobile treatment group had a 33.5% constructive activation rate compared to 34.1% in the mobile control group. These findings do not confirm the original Constructive Activation (article) hypothesis, though the difference is not statistically significant. Mobile web editors in the treatment group had a 4.1% retention rate versus 4.3% in the mobile control group, a 5.1% relative decrease; this does not confirm the original Constructive Retention (article) hypothesis. As for revert rates, mobile web treatment group editors experienced a 1.4% lower revert rate (23.2%) than mobile control group editors (23.6%), which confirms the original revert rate hypothesis.

The observed results indicate that the intervention produced a small decline in constructive activation, but the effect was limited and not statistically significant. Because the results show a negative trend while also being inconclusive, rather than rolling the feature out more widely the team plans to iterate on this work and reassess with another experiment at a later time.

Figure 1: Add a Link highlighting in read mode. An edit suggestion is prompted when the reader taps the yellow tag that surfaces in read mode (article view).

Constructive Activation (Article): A newcomer makes at least one edit to an article in the main namespace on a mobile device within 24 hours of registration, and that edit is not reverted within 48 hours of publication. In this notebook we use the variable is_const_activated_article:

is_const_activated_article = (num_article_edits_24hrs - num_article_reverts_24hrs) > 0

Constructive Retention (Article): If we increase constructive activation but that does not flow into retained users, the impact of this work will be limited, so we look to ensure that mobile web newcomer retention remains stable or improves. In this notebook we use the variable is_const_retained_article:

is_const_retained_article = is_const_activated_article & ((num_article_edits_2w - num_article_reverts_2w) > 0)

Revert rate: The proportion of namespace 0 (article) edits that were reverted within 48 hours, out of all such edits made, for users who edited on mobile. This is by definition 0% for users who made no edits, and we exclude those users from the revert rate analysis. We use edit tags to identify edits and reverts, and a revert must occur within 48 hours of the edit. In this notebook we use the variable prop_rev_article_edits:

prop_rev_article_edits = (num_article_reverts_24hrs + num_article_reverts_2w) / num_total_article_edits if num_total_article_edits > 0; otherwise it is set to 0

Essentially, we take the average of this per-user revert rate across all users in the specified group.
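Taken together, these definitions can be derived directly from per-user edit and revert counts. The sketch below is a minimal dplyr illustration using made-up counts; the column names mirror the notebook variables above, but the data frame itself is illustrative rather than the notebook's actual derivation.

library(dplyr)

# Illustrative per-user counts; the notebook derives these from revision event data.
users <- tibble::tibble(
  user_id                   = 1:4,
  num_article_edits_24hrs   = c(2, 0, 1, 3),
  num_article_reverts_24hrs = c(0, 0, 1, 1),
  num_article_edits_2w      = c(5, 0, 1, 4),
  num_article_reverts_2w    = c(1, 0, 1, 1),
  num_total_article_edits   = c(7, 0, 2, 7)   # all article edits in the analysis window
)

users <- users %>%
  mutate(
    # Constructive activation: at least one non-reverted article edit within 24 hours
    is_const_activated_article = (num_article_edits_24hrs - num_article_reverts_24hrs) > 0,
    # Constructive retention: activated, plus a non-reverted article edit in the 2-week window
    is_const_retained_article = is_const_activated_article &
      ((num_article_edits_2w - num_article_reverts_2w) > 0),
    # Per-user revert proportion; defined as 0 for users with no article edits
    prop_rev_article_edits = if_else(
      num_total_article_edits > 0,
      (num_article_reverts_24hrs + num_article_reverts_2w) / num_total_article_edits,
      0
    )
  )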

Detailed Findings: Activation

Hypothesis: For new logged-in account holders (accounts <24 hours old) on the wikis in this experiment, if we introduce the “Add a Link” structured task in Wikipedia articles, then we expect the percentage of new account holders who constructively activate on mobile web to increase by 10% compared to the control group, because the tool guides editors step by step as they begin contributing.

Constructive Activation (Article) is this experiment’s primary metric.

Takeaway

The findings from analyzing the full dataset do not confirm the original hypothesis. The mobile treatment group had a 33.5% constructive activation rate compared to 34.1% in the mobile control group. This difference is not statistically significant.
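As a note on reading the comparison tables that follow: PercentagePointChange is the absolute difference between the two rates, while PercentChange is that difference relative to the control rate. A quick sketch using the rounded mobile activation rates from the table below (the notebook's -1.7% is computed from the unrounded underlying rates, so the rounded inputs here land slightly off at about -1.8%):

control_pct   <- 34.1   # mobile control constructive activation rate (rounded)
treatment_pct <- 33.5   # mobile treatment constructive activation rate (rounded)

pp_change  <- treatment_pct - control_pct                        # -0.6 percentage points
pct_change <- (treatment_pct - control_pct) / control_pct * 100  # about -1.8% from rounded inputs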

Show the code
display_html(as.character(t9))
Summary - Constructive Activation (Article) Aggregations by Platform
platform Group Count Percent
desktop Control 1825 40.5
desktop treatment 1797 38.7
mobile Control 1359 34.1
mobile treatment 1326 33.5
Show the code
display_html(as.character(t12))
Summary Comparison: Constr. Mobile Web Activation (Article)
Control<->Treatment %Changes
Metric Value
Mobile Control Percent 34.1
Mobile Treatment Percent 33.5
AbsoluteChange -0.6
PercentagePointChange -0.6
PercentChange -1.7
Show the code
constructive_activation_article_namespace_mobile

Detailed Findings: Retention

Hypothesis: Mobile web users who receive the Add a Link structured task will have a retention rate at least 3% higher than that of mobile web users who do not.

If we increase constructive activation, but that doesn’t flow into retained users, then the impact of this work will be limited. Thus we look to ensure newcomer Constructive Retention (article) remains stable or improves.

Retention is a secondary metric we track in this experiment.

Takeaway

Mobile web editors in the treatment group had a 4.1% retention rate compared to 4.3% in the mobile control group, a 5.1% relative decrease. The findings do not confirm the original hypothesis.

Show the code
display_html(as.character(aggr_tbl_retention_platform_render))
Summary - Constructive Retention (Article) by Platform
platform Group Count Percent
desktop Control 311 6.9
desktop treatment 315 6.8
mobile Control 172 4.3
mobile treatment 162 4.1
Show the code
display_html(as.character(aggr_tbl_retention_platform_render_comp_render))
Summary Comparison: Constructive Mobile Web Retention (Article)
Control<->Treatment %Changes
Metric Value
Mobile Control Percent 4.3
Mobile Treatment Percent 4.1
AbsoluteChange -0.2
PercentagePointChange -0.2
PercentChange -5.1
Show the code
constructive_retention_article_namespace_mobile

Detailed Findings: Revert Rate

Hypothesis: “Add a Link” Structured Task mobile web participants will not experience a higher edit revert rate than mobile web editors in the control group.
Revert Rate is a secondary metric we track in this experiment.

When it comes to reverts, we again focus on the article namespace because that is where Add a Link asks newcomers to edit. It also does not make sense to measure reverts for users who make no edits, so this analysis is limited to users who made at least one edit in that namespace in the first two weeks after registration.

While revert rate is our best measurement of the quality of edits, it is important to note that conversations with communities have indicated that revert rate may not be a fully accurate proxy for the quality of Add a Link edits:

  • Add a Link edits frequently add multiple links to an article. When most of the links are an improvement and some are not, patrollers may not go through the process of reverting the whole edit, and may instead let a partially good edit stay.
  • In those same situations, patrollers may manually remove some of the links while keeping others. These “partially reverted” edits are not detected as reverts by our analysis.

Lastly, edits made specifically via “Add a Link” surfaced in read view have a lower revert rate than the overall rates presented here; see the Turnilo “newcomer task: read view suggestion” edits and reverts charts.

Takeaway

Mobile web treatment group editors experienced a 1.4% lower revert rate (23.2%) than mobile control group editors (23.6%). The findings confirm the original hypothesis.

Note: These figures reflect the revert rates for all article edits made by users in each group on the respective wiki rather than only edits made through the “Add a Link” structured task.
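The group-level rates in the table below are means of the per-user prop_rev_article_edits values, restricted to users with at least one article edit. A minimal sketch with an illustrative per-user data frame (the group labels and counts are made up):

library(dplyr)

# Illustrative per-user rows; prop_rev_article_edits as defined in the metrics section.
rr_users <- tibble::tibble(
  platform                = "mobile",
  Group                   = c("Control", "Control", "treatment", "treatment"),
  num_total_article_edits = c(3, 0, 5, 2),
  prop_rev_article_edits  = c(1/3, 0, 1/5, 1/2)
)

rr_summary <- rr_users %>%
  filter(num_total_article_edits > 0) %>%        # exclude users with no article edits
  group_by(platform, Group) %>%
  summarise(
    n       = n(),
    Percent = round(100 * mean(prop_rev_article_edits), 1),   # mean per-user revert rate
    .groups = "drop"
  )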

Show the code
display_html(as.character(avg_rr_by_platform_render))
Revert Rate per Group - mean, by Platform
platform Group n Prop_Rev_Article_Edits Percent
desktop Control 2348 0.19 18.8
desktop treatment 2351 0.19 19.5
mobile Control 1791 0.24 23.6
mobile treatment 1735 0.23 23.2
Show the code
display_html(as.character(avg_rr_platform_lift_render))
Summary Comparison: Revert Rate Mobile Web
Control<->Treatment %Changes
Metric Value
Mobile Control Percent 23.6
Mobile Treatment Percent 23.2
AbsoluteChange -0.3
PercentagePointChange -0.3
PercentChange -1.4
Show the code
rr_mobile

Methodology

We conducted a parallel-group, randomized controlled experiment in which new accounts on the pilot wikis were assigned 1:1 to a control group (accounts could read articles where Add a Link is available as normal, but with no highlighting and no link suggestions) or a treatment group (accounts could read those same kinds of articles, with an Add a Link structured task highlighted in yellow somewhere within the article) for roughly three months; outcome measures were assessed at the end of the experiment via edit data. Because we knew we could not get suggestions in front of all users, we assessed the potential impact of Add a Link in Read Mode by focusing only on users who might see suggestions in this experiment.

Because constructive activation is the primary experiment metric, we analyzed the full constructive activation dataset and modeled the relationships. Using generalized linear models (GLMs) with a binomial family (logistic regression) for inference, we compared the outcomes of those who received the feature (treatment) with those who did not (control), modeling how the treatment/control and desktop/mobile factors influenced the odds of constructive activation.
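A minimal sketch of this kind of model on simulated data; the simulated frame, variable names, and model formula are illustrative, and the notebook's actual specification may differ:

set.seed(42)

# Simulated per-user frame standing in for the experiment data.
n <- 1000
sim_users <- data.frame(
  group    = sample(c("control", "treatment"), n, replace = TRUE),
  platform = sample(c("desktop", "mobile"), n, replace = TRUE)
)
sim_users$is_const_activated_article <- rbinom(n, 1, 0.35)

# Binomial GLM (logistic regression): how experiment group and platform,
# including their interaction, relate to the odds of constructive activation.
activation_model <- glm(
  is_const_activated_article ~ group * platform,
  family = binomial(link = "logit"),
  data   = sim_users
)

summary(activation_model)    # coefficients on the log-odds scale
exp(coef(activation_model))  # odds ratios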

  • Cues: This experiment involved a deliberately selected non-intrusive user experience.
  • Target audience: Welcome Survey data suggests that many newly registered users create accounts with a specific edit or article creation in mind. They may be less likely to respond to suggestions not matching their initial intention.
  • Analysis was based on intent-to-treat and included all users assigned to the treatment arm, regardless of whether they actually saw or interacted with the feature, biasing results toward the null.
  • Technical constraints, such as the feature residing below the fold or in a collapsed (not yet expanded) article section, may have prevented some assigned users from encountering the treatment. While we offered read-mode link suggestions, we could not guarantee that mobile editors in the treatment group saw them.
  • One-sided noncompliance (users assigned to treatment choosing not to engage) further dilutes the estimated effect.
  • Allocation vs. Interaction Point: We applied the same exposure-aware analytical framework used in previous Add a Link experiments to ensure consistency and enable direct comparisons across studies.
  • To be in the experiment, participants needed to visit at least one page eligible for an Add a Link structured task. This trigger-based, parallel-group, between-subjects approach mitigated dilution and exposure-imbalance bias.

During the experiment period, 20% of pilot wiki new accounts visited at least one page where a structured task was available. Of that 20%, we narrowed the analysis to accounts meeting all of the following criteria (a minimal filtering sketch follows the list):

  • users created an account themselves on mobile web or desktop,
  • account names & ids are not on the known users to exclude list for each wiki (test accounts, etc.),
  • variant assignment was derived from the newcomer's numerical user ID and was recorded at account creation and when they viewed a page where a structured task was available,
  • the view of a page where a structured task was available occurred within the first 24 hours after the account was created,
  • the account viewed 500 or fewer task eligible pages during the experiment period (this threshold was used to exclude extreme outliers).
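A minimal dplyr sketch of these filters; the column names are illustrative stand-ins for the notebook's fields, and the variant-assignment check is omitted for brevity:

library(dplyr)

excluded_user_ids <- c(101L)   # illustrative per-wiki exclusion list (test accounts, etc.)

# Illustrative enrolled accounts.
enrolled <- tibble::tibble(
  user_id                      = c(100L, 101L, 102L, 103L),
  is_self_created              = c(TRUE, TRUE, TRUE, FALSE),
  registration_ts              = as.POSIXct("2025-03-10 12:00:00", tz = "UTC"),
  first_task_page_view_ts      = as.POSIXct("2025-03-10 12:00:00", tz = "UTC") +
    c(3600, 3600, 90000, 3600),                              # seconds after registration
  num_task_eligible_page_views = c(12, 3, 2, 800)
)

eligible <- enrolled %>%
  filter(
    is_self_created,                                         # self-created, non-app registration
    !user_id %in% excluded_user_ids,                         # not on the known-exclusions list
    first_task_page_view_ts <= registration_ts + 24 * 3600,  # eligible page view within 24 hours
    num_task_eligible_page_views <= 500                      # drop extreme outliers
  )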

Note: For this experiment, there are two ways for participants to encounter the “Add a Link” task:

  • when viewing an article that has an “Add a Link” suggestion available.
  • via the Homepage

For this experiment, we include editors who visit the homepage, as that is the default experience and it will not be removed.

Analysis in this experiment was carried out using edit and revert data from the pilot wikis for users in the experiment, focusing on edits to articles in the main namespace made on any device. Mobile and desktop platform splits are based on the platform where the user first registered an account (the reg_on_mobile field in the user dataset). This method follows previous practice for similar analyses. Because we do not expect many users to switch platforms, we did not filter for edits where array_contains(revision_tags, 'mobile web edit') here; in future Growth analyses we will filter for mobile web edit tagged edits specifically, to match the queries now used by the Editing team and others.
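A small sketch of both approaches; apart from reg_on_mobile and revision_tags, the data frames and field names are illustrative:

library(dplyr)

# Platform split used in this analysis: where the account was registered.
users <- tibble::tibble(user_id = 1:3, reg_on_mobile = c(TRUE, FALSE, TRUE)) %>%
  mutate(platform = if_else(reg_on_mobile, "mobile", "desktop"))

# Edit-level alternative mentioned for future analyses: classify each edit
# by its revision tags rather than by the registration platform.
edits <- tibble::tibble(
  user_id       = c(1, 2),
  revision_tags = list(c("newcomer task", "mobile web edit"), c("visualeditor"))
) %>%
  mutate(is_mobile_web_edit = sapply(revision_tags, function(tags) "mobile web edit" %in% tags))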

The data sources are:

  • event.mediawiki_product_metrics_growth_product_interaction: Get user treatment/control assignments
  • event.homepagemodule: Find out which users visited the home page module; useful for data exploration
  • user: Grab the user IDs of known test accounts so they can be added to the exclusion list
  • user, user_properties, user_groups: Use these three tables to get the user registrations for users registered between the deployment to a given group and the end of data gathering, separately for each group of wikis, while excluding known bots and excluding those on the exclusion list.
  • event_sanitized.serversideaccountcreation: Identify all self-created, non-app created, non-bot registrations using ServerSideAccountCreation.
  • event_sanitized.mediawiki_revision_create, event_sanitized.mediawiki_revision_tags_change: Gather data to answer questions about our high level metrics: activation, retention, productivity, and revert proportions for those user ids in the experiment.

We’re looking to analyze the Add a Link experiment, and therefore start by grabbing a canonical dataset of users who were part of the experiment.
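A sketch of that first step, assuming a hypothetical helper query_hive() that runs HiveQL against the data lake and returns a data frame; the selected columns are illustrative rather than the stream's actual schema:

# query_hive() is a hypothetical helper; column names in the SELECT are illustrative.
assignments_sql <- "
  SELECT DISTINCT
    wiki_db,
    user_id,
    experiment_group          -- treatment / control assignment
  FROM event.mediawiki_product_metrics_growth_product_interaction
  WHERE dt >= '2025-03-04' AND dt < '2025-05-28'
"

experiment_users <- query_hive(assignments_sql)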

  • The EPIC phab task for this work is T362584
  • Deployment task information is in ticket T385343.
  • The analysis is in ticket T377098.

Instrumentation tickets:

  • T377097 - Decision to use link_suggestion_interaction to track alpha test info where active_interface IN ('readmode_suggestion_dialog', 'readmode_page', 'readmode_article_page')
  • T387286 - Decision to use "action_source": "BeforePageDisplayHook" from the link_suggestion_interaction stream, for the variant assignment at the time of page impression of a page with recommendations
  • T388622 - Decision to increase experiment targeting and add anyone that lands on a page with <100 edits to the experiment

Experiment Dates and Group Counts

Show the code
cat("Experiment enrollment began on:",format(exp_start_ts, "%Y-%m-%d %H:%M:%S"), "\n") 
cat("Experiment enrollment ended on:",format(exp_end_ts, "%Y-%m-%d %H:%M:%S"), "\n")
cat("We collected edit data from experiment start up to:", format(end_date_plus15, "%Y-%m-%d %H:%M:%S"), "\n")
cat("We collected revert data up to:", format(edits_plus_two, "%Y-%m-%d %H:%M:%S"), "\n")
Experiment enrollment began on: 2025-03-04 21:53:00 
Experiment enrollment ended on: 2025-05-28 00:00:00 
We collected edit data from experiment start up to: 2025-06-12 00:00:00 
We collected revert data up to: 2025-06-14 00:00:00 
Show the code
#Convert to a character string containing the HTML code & display
display_html(as.character(tbl_html))
Summary: Group Sizes
platform Group Group_count Experiment_total_count Percent
desktop Control 4510 9158 49.2
desktop treatment 4648 9158 50.8
mobile Control 3983 7935 50.2
mobile treatment 3952 7935 49.8

Appendix

Constructive Activation Overall (Control vs Treatment)

Show the code
display_html(as.character(aggr_tbl_const_activation_article_render))
display_html(as.character(aggr_tbl_const_activation_article_comp_render))
constructive_activation_article_namespace
Summary - Constructive Activation (Article) Aggregations
Group Count Percent
Control 3184 37.5
treatment 3123 36.3
Summary Comparison: Constructive Activation (Article) Control<->Treatment % changes
Metric Value
Control Percent 37.5
Treatment Percent 36.3
AbsoluteChange -1.2
PercentagePointChange -1.2
PercentChange -3.1

Show the code
display_html(as.character(f))
Summary - Mobile Constructive Activation (Article) by Wiki
Group wiki_db Percent
Control eswiki 28.8
treatment eswiki 27.2
Control fawiki 49.8
treatment fawiki 49.2
Control frwiki 31.1
treatment frwiki 31.8
Control idwiki 31.7
treatment idwiki 26.4
Control ptwiki 31.9
treatment ptwiki 31.6
Control arzwiki 60.0
treatment arzwiki 60.0

Constructive Retention Overall (Control vs Treatment)

Show the code
display_html(as.character(aggr_tbl_const_retained_article_render))
display_html(as.character(aggr_tbl_const_retained_article_comp_render))
constructive_retention_article_namespace_overall
Summary - Constructive Retention (Article)
Group Count Percent
Control 483 5.7
treatment 477 5.6
Summary Comparison: Constructive Retention (Article) - Control<->Treatment %Changes
Metric Value
Control Percent 5.7
Treatment Percent 5.6
AbsoluteChange -0.1
PercentagePointChange -0.1
PercentChange -2.5

Revert Rate Overall (Control vs Treatment)

Show the code
display_html(as.character(avg_rr_render))
display_html(as.character(avg_rr_lift_render))
rr_overall
Revert Rate per Group - Mean
Group n Prop_Rev_Article_Edits Percent
Control 4139 0.21 20.8
treatment 4086 0.21 21.1
Summary Comparison: Revert Rate Control<->Treatment %Changes
Metric Value
Control Percent 20.8
Treatment Percent 21.1
AbsoluteChange 0.2
PercentagePointChange 0.2
PercentChange 1.1