Newcomer tasks at Wikipedia

This page describes the Growth team's work on the "newcomer tasks" project, which is a specific project under the larger "Personalized first day" initiative. This page contains major assets, designs, and decisions. Most incremental updates on progress will be posted on the general Growth team updates page, with some large or detailed updates posted here.

You can quickly see what the team built by looking through these mockups (use arrow keys to navigate):

Design and planning for this project began on 2025-08-06. The first version was released on four wikis on 2025-08-06.

In December 2020, we published results showing the positive impact that newcomer tasks have on engagement. See this page for the details.


Current status

  • 2025-08-06: first team meeting to discuss newcomer tasks
  • 2025-08-06: team meeting to discuss design concepts
  • 2025-08-06: Phabricator tasks created for engineering work
  • 2025-08-06: desktop user tests complete
  • 2025-08-06: mobile user tests complete
  • 2025-08-06: V1.0 deployed to Czech, Korean, Arabic, and Vietnamese Wikipedias
  • 2025-08-06: first variant test ("initiation") deployed to Czech, Korean, Arabic, and Vietnamese Wikipedias
  • 2025-08-06: testing the addition of topic matching, to be deployed the week of 2025-08-06.
  • 2025-08-06: the option to select topics of interest was added to the suggested edits module
  • 2025-08-06: topic matching upgraded to use ORES models
  • 2025-08-06: results from first variant test
  • 2025-08-06: switched all newcomers to Variant A
  • 2025-08-06: deployed guidance
  • 2025-08-06: deployed Variants C and D
  • 2025-08-06: Published Newcomer tasks experiment analysis
  • Next: Continue improvements to Newcomer tasks and possibly develop new Structured tasks

Summary

Difficulty filters for newcomer tasks.

We think that newcomers should have every opportunity to succeed when they first arrive at the wiki. But frequently, newcomers attempt a task that is too difficult for them, can't find a task they want to do, or can't find ideas for how to remain involved after their first edit. This leads to many of them leaving and not coming back. There have been successful attempts in the past at recommending tasks to editors, and so we believe that the newcomer homepage is a potential place to recommend relevant tasks for newcomers.

We'll need to keep in mind a few things:

  • Many newcomers arrive with something specific in mind they're trying to accomplish, like adding a specific photo to a certain article. We don't want to get in the way of them accomplishing their goal.
  • Newcomers build up their skills over time by progressing from easier edits to hard ones.
  • When newcomers are successful early on, they are more motivated to continue editing.

Taking those things into account, we want to recommend tasks to newcomers that arrive at the right place and time for them, teach them skills they need to be successful, and relate to their interests.

A valuable tool we have for helping tasks be relevant to newcomers is the welcome survey, which was originally built specifically for this purpose: personalizing the newcomer's experience. We'll plan to use the optional information newcomers give about their goals and interests to recommend appropriate tasks for them.

One of the largest challenges is going to be figuring out how to gather tasks that are appropriate for newcomers to do. There are many existing sources, such as templates that call for work on articles, recommendations in the Content Translation tool, or suggestions from tools like Citation Hunt. The question will be which of those options help newcomers accomplish their goals.

At first, we'll focus on using the newcomer homepage as the place to recommend tasks, but in the longer term, we can imagine building features that extend into the editing experience to recommend and help newcomers accomplish recommended tasks.

Also in the longer term, we'll be thinking about ways to tie task recommendations into other parts of the newcomer experience, such as the impact module on the homepage, or into the help panel.

Why this idea is prioritized

We know from research and experience that many newcomers fail early in their editing journey for one of these reasons:

  • They arrive with a very challenging edit in mind, such as writing a new article or adding an image.

Those tasks are difficult enough that they likely fail and don't return.

  • They arrive without knowing what to edit, and can't find any edits to make.

We also know that on the newcomer homepage, the most frequently clicked-on module is the "user page" module -- the only thing on the page that encourages users to start editing. This makes us think that many users are looking for a clear way to get started with editing.

And from past Wikimedia endeavors, we've seen that task recommendations can be valuable. SuggestBot is a project that sends personalized recommendations to experienced users, and is a well-received service. The Content Translation tool also serves personalized recommendations based on past translations, and has been shown to increase the volume of editing.

For all these reasons, we think that recommending specific editing tasks for newcomers will give them a clear way to get started. For those newcomers who have an edit in mind that they want to do, we'll encourage them to try some easy edits first to build up their skills. Those newcomers who do not have a specific preference for what to edit will hopefully find some good edits through this feature.

Glossary

There are many terms that sound similar and can be confusing. This section defines each of them.

"Newcomer tasks"
The entire workflow that recommends edits for newcomers and guides them through the edits.
"Suggested edits"
The name of the specific module that the newcomer tasks workflow adds to the newcomer homepage.
"Task recommendations" or "Task suggestions"
Lists of articles that need editing work, suggested automatically to users.
"Personalized"
Software that adapts automatically to each user to fit their needs.
"Customized"
Software that the user adapts to fit their needs.
"Topic"
A content subject, such as "Art", "Music", or "Economics".
"Topic matching"
The ability to find tasks for newcomers that match their topics of interest.
"Guidance"
Features that help the newcomer complete the suggested task while they are working on it.
"Maintenance template"
Templates that are put on articles indicating that work needs to be done on them.

Recommending tasks

The core challenge to this project is: Where will the tasks come from and how will we give the right ones to the right newcomers?

The graphic below shows our priorities when recommending tasks to newcomers.

As shown in the graphic above, we would give newcomers tasks that...

  • ...arrive at the right time and place for a newcomer's journey.
  • ...teach relevant conceptual and technical skills.
  • ...gradually guide users to build up their editing abilities.
  • ...are personalized to their interests.
  • ...show them the value and impact of editing.
  • ...motivate them to participate continually.

For instance, we do not want to give newcomers tasks that are irrelevant to what they hope to accomplish. If a newcomer wants to write a new article, then asking them to add a title description will not teach them skills they need to be successful.

We're splitting this challenge into two parts: sourcing the tasks and topic matching.

Sourcing the tasks

There are many different places we could find tasks for newcomers to do. Our team listed as many as we could think of and evaluated them for whether they seem to be achievable for the first version of the feature. Below is a table showing the many sources of tasks that we evaluated in coming to the decision to start by using maintenance templates.

Source of task | Explanation | Evaluation
Maintenance templates | Most wikis use templates or categories to indicate articles that need copyediting, references, or other modifications. These are placed manually by experienced users. | Easily accessible. Already used in SuggestBot and GettingStarted.
Work on newest articles | New articles may be good candidates for work because they likely could be improved or expanded. They are also more likely to be about current topics. | Easily accessible, but most new articles are created by experienced users, and may not need help from newcomers.
Add images from Commons | There are articles that have images in some language Wikipedias but not in others. This could be a good task for a newcomer who created their account in order to add an image of their own. | An idea with high potential, but it would require a lot of work to build interfaces. There are also questions about how to identify whether an article needs an image, and which one to recommend.
Expand short articles | Many articles are stubs that could be expanded. | This task is probably too open-ended and difficult for a newcomer.
Link to orphan articles | Many articles have no incoming links from any other articles. Users could find articles to link to the orphan articles. | Easy to identify orphans, but it may be confusing for a newcomer to have to go find other articles in order to do the task.
Add references | Many articles are in need of additional references or citations. | Probably a challenging task for a newcomer. Frequently covered by maintenance templates.
Add categories | Categories are used for many purposes on the wikis, and adding them to articles that don't have them could be a low-pressure way to contribute. | Newcomers may not have good judgment when it comes to adding categories. This also does not teach editing skills that they need for other tasks.
Content translation | The Content Translation tool could be a good way to structure the editing experience and help newcomers write new articles without having to generate all the content on their own. | An integration here could be great -- we may want to use the welcome survey to distinguish which newcomers are multilingual.
Add sections | There are algorithms in development that can recommend additional section headers based on similar articles. | Writing a new section from scratch may be too challenging a task for a newcomer.
Specific link recommendation | Adding wikilinks is one of the best tasks for newcomers. It would be powerful if we could not only tell a newcomer that an article needs more links, but also indicate which specific words or phrases should become a link (internal and/or external, depending on local policies). | Some research has been done on this idea that the team will be looking into, as this could be a perfect first edit for a newcomer.
Copy editing | Many articles need copyediting, but it would be a better experience for newcomers if we could suggest specific changes to make in an article, such as words that are likely misspelled or sentences that likely need to be rephrased. | While this would be an excellent experience for the newcomer, we don't have a way to approach this. Perhaps experienced editors could flag specific copyedit changes instead of fixing them.
External link cleanup | Help ensure articles follow external link policies. | Could be populated by the external links cleanup maintenance category.
Neutral point of view | Offer people suggestions for how they can "neutralize" subjective text (T376213). | Previous research indicates that algorithms could be used to recommend edits that enhance the neutrality of articles.

Version 1.0: basic workflow

In version 1.0, we will deploy the basic parts of the newcomer tasks workflow. It will recommend articles to newcomers that require different types of edits, but it will not match the articles to the newcomers' topics of interest (version 1.1), and it will also not guide the newcomers in completing the task (version 1.2).

Maintenance templates

We're going to be starting by using maintenance templates and categories to identify articles that need work. All of our target wikis use some set of maintenance templates or categories on thousands of articles, tagging them as needing copyediting, references, images, links, or expanded sections. And previous task recommendation software, such as SuggestBot, has used them successfully. These are some examples of maintenance categories:

Example of maintenance template on English Wikipedia

In this Phabricator task, we investigated exactly which templates are present and in what quantities, to get a sense of whether there will be enough tasks for newcomers. There seem to be sufficient numbers for the initial version of this project. We are likely to incorporate other task sources from the table above in future versions.

It's also worth noting that it could be possible to supplement many of these maintenance templates with automation. For instance, it is possible to automatically identify articles that have no internal links, or articles that have no references. This is an area for future exploration.
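To make the task sourcing concrete, here is a minimal sketch (not the Growth team's implementation) of how candidate articles could be pulled from a maintenance category and checked for missing internal links using the public MediaWiki Action API. The category name is an English Wikipedia example; other wikis use their own maintenance categories.

```python
# Sketch only: source candidate articles from a maintenance category and
# check whether an article has any internal wikilinks, via the Action API.
import requests

API = "https://en.wikipedia.org/w/api.php"

def articles_in_maintenance_category(category, limit=50):
    """Return article titles tagged with a given maintenance category."""
    params = {
        "action": "query",
        "format": "json",
        "list": "categorymembers",
        "cmtitle": category,      # example category; varies per wiki
        "cmnamespace": 0,         # main/article namespace only
        "cmlimit": limit,
    }
    data = requests.get(API, params=params).json()
    return [m["title"] for m in data["query"]["categorymembers"]]

def has_internal_links(title):
    """Check whether an article contains at least one internal wikilink."""
    params = {
        "action": "query",
        "format": "json",
        "titles": title,
        "prop": "links",
        "plnamespace": 0,
        "pllimit": 1,             # one link is enough to answer the question
    }
    pages = requests.get(API, params=params).json()["query"]["pages"]
    page = next(iter(pages.values()))
    return "links" in page        # the key is absent when no links exist

if __name__ == "__main__":
    candidates = articles_in_maintenance_category("Category:All articles needing copy edit")
    for title in candidates[:5]:
        print(title, "| has internal links:", has_internal_links(title))
```

The same pattern could be repeated per task type, with each maintenance category mapped to a difficulty level.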

During the week of October 21, 2019, the members of the Growth team did a hands-on exercise in which we attempted to edit articles with maintenance templates. This helped us understand what challenges we can expect newcomers to face, and gave us ideas for addressing them. Our notes and ideas are published here.

Design

Comparative review

Our team's designer reviewed the way that other platforms (e.g. TripAdvisor, Foursquare, Amazon Mechanical Turk, Google Crowdsource, Reddit) offer task recommendations to newcomers. We also reviewed Wikimedia projects that incorporate task recommendations, such as the Wikipedia Android app and SuggestBot. We think there are best practices we can learn from other software, especially when we see the same patterns across many different types of software. Even as we incorporate ideas from other software, we will still make sure to preserve Wikipedia's unique values of openness, clarity, and transparency. The main takeaways are below, and the full set of takeaways is on this page:

  • Task types – bucket into 4 types: Rating content, Creating content, Moderating/Verifying content, Translating content
  • Incentives – Most products offered intangible incentives mainly bucketed into the form of: Awards and ranking (badges), Personal pride and gratification (stats), or Unlocking features (access rights)
  • Reward incentives – promote badges or attainments of specific milestones (e.g., a badge for adding 50 citations)
  • Personalization/Customization – Most have at least one facet of personalization/customization. The most common customization is user input from surveys upon account creation or before a task; the most common system-based personalization is geolocation.
  • Visual design & layout – Incentivizing features (stats, leaderboards, etc.) and onboarding are visually rich, compared with pared-back, simple forms for completing short edits.
  • Guidance – Almost all products reviewed had at least basic guidance prior to task completion, most commonly introductory ‘tours’. In-context help was also provided in the form of instructional copy, tooltips, and step-by-step flows, as well as feedback mechanisms (asking questions, submitting feedback).

Mockups

Our evolving designs can always be found in two sets of interactive mockups (use arrow keys to navigate):

Those mockups contain explorations of all the different parts of the user journey, which we have broken down into several parts:

  1. Gathering information from the newcomer: learning what we need in order to recommend relevant tasks.
  2. Feature discovery: the way the newcomer first encounters task recommendations.
  3. Task recommendations: the interface for filtering and choosing tasks.
  4. Guidance during editing: once the newcomer is doing a task, the guidance that helps them understand what to do.
  5. User feedback: ways in which the newcomer can indicate that they are not satisfied with the recommended task.
  6. Next edit: how we continue the user's momentum after they save an edit.

Below are some of the original draft design concepts as the team continues to refine our approach.

User testing

Desktop

During the week of September 16, 2019, we used usertesting.com to conduct six tests of the desktop newcomer tasks prototype with internet users unaffiliated with the Wikimedia movement. In these tests, respondents are compensated for trying out the mockups, speaking aloud on what they observe, and answering questions about the experience. The full results can be found in this Phabricator task. The goals of this testing were:

  1. Gauge the discoverability of the newcomer tasks module
  2. Identify improvements to the usability of the tasks module:
    1. Do users understand how to select and review article suggestions?
    2. Do users understand how to filter by interest and task difficulty?
    3. Do they know how to start editing a suggested article?
  3. Gauge user reactions to the suggestions and expectations about guidance through the task.
Summary of findings
  • All users thought it made sense and was intuitive to get suggestions based on their topics of interest.
  • Similarly, the division into different task difficulties was positively received by all participants.
  • Overall usability of the suggested edits module was extremely high. People knew how to click to view more articles, use the filter to change topics and task levels, and knew to click on the card to open a suggestion for editing.
  • 4 of 6 participants did not initially realize that they should click on “See suggested edits” as a way to help them achieve their goal of writing a new article. This seemed to reflect a common mental model in which users separated "Editing" from "Creating a new page".
  • The start module was clearly the starting point for all participants. Moreover, many were drawn to the “See suggested edits” button as a way to follow the progression of activities in the start module.
  • Users had a clear understanding and expectation they would be shown suggested articles for editing based on the intro dialogs to add topics and introducing task levels.
  • Everyone was able to select the popular topics and add their own topic easily.
  • Everyone understood the purpose of the suggested edits module.
  • Two people were confused/assumed that they could not create a new article until completing easy and medium tasks.
  • 5 of 6 participants knew to click on the help panel button for guidance once they entered the editor mode.
  • Four people expected to be able to contact their mentor in the help panel.
  • Task tips lacked a sufficient level of guidance for a couple of participants.
Recommendations
  • Improve copy and user education to convey that creating new content is a form of editing.
  • Make updates to the Impact module as tested here to aid user understanding of suggested edits.
  • Provide good in-edit context help. It’s very important for users trying an edit.
    • Include a “checklist” for users to revise in the help panel’s task tips.
    • Provide short examples of what to do.
    • Indicate to users they do not have to copy edit for an entire article.
  • Include real-time filtering results to help users connect suggestions with article edits and to encourage use of the filters to find matching articles.

Mobile

During the week of September 30, 2019, we used usertesting.com to conduct six tests of the mobile newcomer tasks prototype. The full results can be found in this Phabricator task. The goals of this testing were the same as with desktop, but with the added goal of understanding how the mobile experience should differ from the desktop experience. Mobile user testers were prompted with the scenario of intending to add an image to Wikipedia (whereas desktop respondents were prompted with the scenario of intending to create a new article).

Summary of findings

  • Overall, users found that the redesigned start module clearly laid out the guided steps to begin.
  • The extra “Suggested edits” module below, while not especially confusing, was still not where users expected to go to help them with their task to add an image.
  • Suggested edits was quite intuitive to use, with participants understanding how its different elements (filtering, seeing more articles, etc.) worked. However, users did not see value in doing suggested edits beyond learning or relieving boredom.
  • Several people wanted more granular topics to be available than the broad topics listed.
  • Having the detailed difficulty info was educational, but potentially discouraging. All were surprised “Adding images” was classed as hard, with varying degrees of frustration about this fact.
  • Filtering by interest was the biggest selling point.
  • Three people, towards the end of the test, assumed there was some “verification” or requirement to do some easy tasks before medium or hard tasks could be attempted.
  • Everyone understood the purpose of suggested edits as offering edits that would help users learn to edit, and also noted that it showed them that some edits were harder to do.
  • All users struggled to use the guidance we offered through the help panel while they were editing. This is a major area we need to think hard about designing before we begin to build it.

Recommendations

  • Put the suggested edits call to action inside the start module, rather than in its own card.
  • Improve copy and user education imagery to better convey that there is real-world value in trying suggested edits beyond learning, that task difficulty is a guide only, and that tasks can be tried out of order.
  • Add an overlay specifically to give a personalized introduction to suggested edits.
  • Include real-time counts of filtered results on both task and topic filters.
  • Let users search by more granular interest topics.
  • Reiterate when a user opens a suggestion that it is a real, impactful edit.
  • Update design of the in-task help panel so that all available help content is clearly accessible.

Version 1.1: topic matching

Past research and development shows that users are more likely to do recommended tasks if the tasks match their topical interests. SuggestBot uses an editor's past editing history to find similar articles, and those intelligent results are shown in this paper to be acted on more often than random results. The Content Translation tool also recommends articles based on a user's previous translation history, and those recommendations have increased the translation volume.

In looking at the usage of V1.0 of newcomer tasks, which does not contain topic matching, we see that there are users who navigate through many suggested articles, and end up clicking on none. There are also users who navigate through many, and end up editing only the ones they happen to find that belong to a certain topic, such as medicine. These are also good indicators that topics can be valuable to help newcomers find articles they want to edit.

Our challenge with newcomers is a "cold start problem": newcomers do not have any edit history to use when trying to find relevant articles for them to edit. We want an algorithm that identifies the topic of each article, and to use that to filter the articles that have maintenance templates.

Algorithm

Screenshot of ORES topic selection filter on desktop

There are multiple approaches with which we might find articles that match a user's stated topic of interest. While our team identified many, we built prototypes for three methods and tested them:

  • morelike: assign a seed list of articles that represent each topic area (e.g. "Art" might be represented by the articles for "Painting", "Sculpture", "Dance", and "Weaving".) Use that seed list to find other articles that are similar to those in the seed list by using a similarity algorithm called "morelike".
  • free text: instead of choosing from a set list of topics, allow newcomers to type in any phrase they want to indicate a topic. Use regular Wikipedia search to surface articles relevant to that phrase.
  • ORES: ORES is a machine learning service that – among other things – can return a predicted topic for any article. Though this prediction service only works in English Wikipedia, there are ways to translate predictions from English to other wikis.
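To make the first two approaches tangible, here is a rough sketch of how they could be prototyped against the public search API (CirrusSearch). The seed articles and the free-text phrase below are illustrative, not the team's actual configuration.

```python
# Sketch only: topic-matched article discovery via CirrusSearch keywords.
import requests

API = "https://en.wikipedia.org/w/api.php"

def search(query, limit=10):
    """Run a full-text search query and return matching article titles."""
    params = {
        "action": "query",
        "format": "json",
        "list": "search",
        "srsearch": query,
        "srnamespace": 0,
        "srlimit": limit,
    }
    data = requests.get(API, params=params).json()
    return [hit["title"] for hit in data["query"]["search"]]

# "morelike": articles similar to a seed list that represents the "Art" topic.
art_like = search("morelike:Painting|Sculpture|Dance|Weaving")

# free text: whatever phrase the newcomer typed to describe their interest.
typed = search("baroque architecture")

print(art_like)
print(typed)
```

In a real prototype, either result list would then be intersected with the articles carrying maintenance templates.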

In this Phabricator task, we evaluated the three methods, and decided to proceed with the ORES model. The Growth team worked with the Scoring team to strengthen the model, and with the Search team to make the model predictions available to the newcomer tasks workflow. During the time that this work was happening, we deployed the somewhat worse-performing morelike algorithm, and switched to the ORES model about a month later.

The ORES model we use now offers 64 topics, and we chose to expose 39 of them to newcomers. The evaluation in four different languages showed that on average, 8.5 out of 10 suggestions for a given topic seem like good matches for that topic.
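For illustration, this is roughly how a single topic prediction could be fetched from the ORES "articletopic" model over its public v3 API; the revision ID is hypothetical and the field access follows the ORES response format as we understand it, not the production pipeline.

```python
# Sketch only: fetch the predicted topics for one revision from ORES.
import requests

ORES = "https://ores.wikimedia.org/v3/scores/enwiki/"

def predicted_topics(rev_id):
    """Return the list of topics ORES predicts for a given revision."""
    params = {"models": "articletopic", "revids": rev_id}
    data = requests.get(ORES, params=params).json()
    score = data["enwiki"]["scores"][str(rev_id)]["articletopic"]["score"]
    return score["prediction"]          # e.g. ["Culture.Visual arts.Visual arts*"]

print(predicted_topics(123456789))      # hypothetical revision ID
```

In practice, scoring every candidate article per request would be slow, which is presumably part of why the predictions were made available through the search infrastructure instead, as described above.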

Design

In designing interfaces that allow newcomers to choose topics of interest, these are some of the considerations:

  • How to make a long list of about 30 topics not overwhelming to the user?
  • How to handle multiple layers of topics (e.g. if "Science" has sub-topics of "Biology", "Chemistry", etc.)
  • Whether users can give feedback when a topic does not match what they selected?

These mockups contain our current designs for this interface. You can navigate with your keyboard's arrow keys. Below are some images of the mockups:

Version 1.2: guidance

Guidance was deployed on 2025-08-06. For a guide to translating the messages in this feature, see this page.

After newcomers have selected an article from the suggested edits module, they should receive guidance about how to click edit and complete the edit successfully. While it is exciting that some portion of newcomers are completing suggested edits without guidance, we're confident that by adding guidance, we will substantially increase how many newcomers edit.

We decided to repurpose the help panel as the place to deliver this guidance. Reusing the help panel will allow us to build quickly. The guidance contains three phases:

  1. When the user has arrived on the article and before they click edit.
  2. After clicking edit and before saving an edit.
  3. After saving an edit.

Some of the ideas we considered implementing included:

  • Guidance tailored to each type of edit, varying depending on whether the suggested edit is a copyedit, adding links, adding references, etc.
  • Reminder that an edit can be small, and that the user does not have to edit the whole article.
  • Step-by-step walkthrough that is like a checklist for completing the edit.
  • Highlighting the maintenance templates in the article so that the user can see why the article was suggested.
  • An indicator that encourages the user to click the edit button.
  • A place to put videos that demonstrate how to complete the edit.
  • Suggestions for additional edits after saving the initial edit.
  • Ability for the user to notify their mentor that they have done an edit, so the mentor can check their work and thank them.

During the last week of December 2019, we user tested desktop and mobile prototypes, which can be found below. We will post the user test results after assembling them.

Below are some images of the prototype:

Variant testing

After deploying the first version of newcomer tasks, we want to start testing different variants of the feature, so that we can improve it iteratively. Rather than just having one design of newcomer tasks, and seeing if newcomers are more productive with it than without it, we plan to test more than one variant of newcomer tasks at a time, and compare them. We have compiled an exhaustive list of all the ideas of variants to test -- but we will only end up testing perhaps 10 per year, because of the effort and time it takes to build, test, and analyze.
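As a minimal sketch of what comparing variants involves, newcomers could be split deterministically into buckets so that each account always sees the same variant. This is illustrative only; it is not how the Growth team's experiment assignment is actually implemented.

```python
# Sketch only: stable variant bucketing keyed on the user ID.
import hashlib

VARIANTS = ["A", "B"]   # e.g. two designs of the newcomer tasks flow

def assign_variant(user_id: int) -> str:
    """Map a user ID to a stable variant bucket."""
    digest = hashlib.sha256(str(user_id).encode()).hexdigest()
    return VARIANTS[int(digest, 16) % len(VARIANTS)]

print(assign_variant(42))   # the same user always gets the same variant
```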

In March, April, and May 2020, we'll be testing variants that aim to get more users into the newcomer tasks flow.

See this page for the list of variant tests and their results.

Measurement and results

Controlled experiment

In December 2020, we published the results of a controlled experiment showing that newcomer tasks have a positive impact on engagement. These are our most important results, and give us confidence that these features should expand to more wikis. See this page for more details.

Usage

Starting in December 2019, we have been tracking several key metrics from newcomer tasks. The graphs shown in this section are our main charts of those metrics as of 2025-08-06.

Summary

Since deploying newcomer tasks in November 2019, we have seen steady increases in both the number of edits from the feature and the number of editors using the feature. These increases are due to two elements: (a) improvements to the feature, and (b) expanding the feature to more wikis.

Specific charts

Conversion funnel for newcomer tasks as of 2025-08-06

Conversion funnel: the first graph is the most important to our team. Each line shows how many newcomers arrive at each stage of our "conversion funnel", meaning how far they progress into the newcomer tasks workflow, as a percentage of newcomers who visit their homepage. We want the users to move through the stages of:

  1.   interacting with the module,
  2.   selecting an article,
  3.   clicking edit on the article,
  4.   saving an edit.

In general, we want to see all the lines go up.
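As a back-of-the-envelope illustration of how the funnel is read, each stage count is expressed as a share of newcomers who visited their homepage; the numbers below are made up for illustration only.

```python
# Sketch only: express each funnel stage as a share of homepage visitors.
funnel = {
    "visited homepage": 1000,
    "interacted with module": 400,
    "selected an article": 150,
    "clicked edit": 80,
    "saved an edit": 53,
}

base = funnel["visited homepage"]
for stage, count in funnel.items():
    print(f"{stage}: {count / base:.1%}")
# With these example numbers, "saved an edit" is 5.3% of homepage visitors,
# the same scale as the August 2020 figure quoted below.
```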

  • Since the early days of the feature, the percentages of users who click edit and who save edits have steadily gone up. In January 2020, roughly 2% of newcomers who visited their homepage saved a suggested edit. By August 2020, that had grown to 5.3%, which is more than double.
  • In August, almost all users who selected a task clicked edit, which can be seen by the closeness of the red and orange lines.
  • We think that these improvements are due to the two major features we deployed between January and August: topic matching (which allows newcomers to find more interesting articles) and guidance (which encourages them to click edit and explains how to complete the edit).

Edits: the second graph shows the number of newcomer task edits completed each week, with a separate line for each wiki and a "total" line in black. From December to August 17, there have been 15,126 edits completed through newcomer tasks. It is clear that this has grown over time, which is certainly to be expected because we have gone from 4 wikis to 12 between January and August.

But looking at the individual wikis' lines, it is possible to see growth over time.

Number of edits from newcomer tasks as of 2025-08-06
  • It is common for the number of suggested edits completed each week on a wiki to vary a lot. One of the reasons is that a small number of enthusiastic newcomers can create dozens or hundreds of edits in a short time, but then may not be on wiki on other weeks.
  • Arabic Wikipedia, being one of the largest wikis that has the feature, consistently creates the most edits.

Editors: in addition to tracking the number of edits, we also want to make sure that increasing numbers of newcomers are participating. The third graph shows the number of users completing newcomer tasks each week, broken out by wiki.

Number of editors using newcomer tasks as of 2025-08-06
  • Similarly to the graph of edits, this number has also increased steadily, and the addition of new wikis (such as French Wikipedia on week 21 and Persian Wikipedia on week 32) is clearly visible.
  • We believe that the effect of "guidance" is visible. This was released before week 25. There have been over 100 users of newcomer tasks every week since its release, whereas only three weeks had previously reached that level.


Edit quality

The Growth team's ambassadors have gone through over 300 edits saved by newcomers and marked whether or not each edit was productive (meaning that it improved the article). We are happy to see that about 75% of the edits are productive. This is similar to the baseline rate for newcomer edits, and we're glad that this feature has not encouraged vandalism. Most of the edits are copyedits, with many also adding links, and some even adding content and references. About a third of users who make one suggested edit go on to make additional suggested edits. Many also go on to make edits that are not suggested by the feature, which is behavior we are happy to see.

The edits we have seen are of high quality, which encourages us to improve the feature so that more newcomers start and complete the workflow.
