Expert review
Revision as of 09:41, 11 January 2011
Purpose of this page: elaborate on a lightweight model that will allow us to receive and store reviews of specific Wikipedia[1] article revisions by credentialed experts, and to surface such reviews through the UI and through the API.
== Rationale ==
Assertions by credentialed experts that an article does or does not represent the best available scholarship on a topic can help readers (by making it easier to find high-quality content), editors (by incentivizing high-quality work, surfacing problems, etc.), and individuals or groups seeking to use subsets of Wikipedia for educational purposes (by allowing extraction and use of reviewed subsets of Wikipedia content). This requires no changes to Wikipedia's editorial process; indeed, capturing, storing, and making such reviews available does not imply any additional valuation of the credentialed expertise, or even of its relevance to the article at hand.
== Precedents ==
Actual practice:
* Experiment with Encyclopedia of Life, Rfam and Pfam
* Magnus Manske's user script that surfaces available expert reviews on relevant articles
* Magnus Manske's proof-of-concept tool to create metadata for printed books, PDF or ODT exports
* Example API query returning expert review metadata for the "Gulf Snapping Turtle" article in JSON format, based on Magnus Manske's toolserver database, which stores expert reviews for multiple partners
* The article feedback pilot, a MediaWiki extension currently deployed to a small subset of English Wikipedia articles, allows readers to rate articles (no comments or extended reviews yet). Reviews are associated with specific revisions, but visualization is currently aggregate only. A future version of the widget will ask readers to self-assess their level of expertise. The codebase could likely be extended to support credentials metadata and other product requirements associated with expert reviews.
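To make the precedent concrete, here is a minimal sketch of how a client might consume a review-metadata API response of the kind described above. The JSON shape and field names are illustrative assumptions, not the actual schema of Magnus Manske's toolserver database.

```python
import json

# Hypothetical payload shaped like a toolserver review-metadata response;
# field names here are illustrative assumptions, not the real schema.
sample_response = json.dumps({
    "article": "Gulf Snapping Turtle",
    "reviews": [
        {
            "revision_id": 401234567,
            "reviewer": "Jane Doe",
            "organization": "Encyclopedia of Life",
            "verdict": "approved",
        }
    ],
})

def latest_reviewed_revision(payload: str):
    """Return the highest reviewed revision ID, or None if no reviews."""
    data = json.loads(payload)
    revisions = [r["revision_id"] for r in data.get("reviews", [])]
    return max(revisions) if revisions else None

print(latest_reviewed_revision(sample_response))  # 401234567
```

A gadget or offline-selection tool could use such a query to decide which stored revision of an article to display or export.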
Proposals:
* strategy:Proposal:Expert review - a more comprehensive proposal encompassing additional complex and controversial ideas, such as giving experts more influence in the editorial process
* Referees - a comprehensive proposal for multiple levels of quality assessment
* Sifter project - a historical proposal from the Nupedia days
== What you can do right now ==
If you're a hacker:
* Improve Magnus Manske's user script and help develop it to gadget quality
* Create other proof-of-concept applications that tie into the existing toolserver database/API developed by Magnus
* Help extend the ArticleFeedback MediaWiki extension to support the product requirements articulated below
If you're good at analyzing and documenting:
* Help us develop draft product specifications for how expert reviews should be supported by MediaWiki (UI and workflow mock-ups very much welcome)
If you're great at outreach into academia:
* Help identify opportunities for mobilizing large numbers of experts to join the review process
== Product requirements ==
Very basic notes.
* Should surface expert review information in an unobtrusive manner, ideally integrated with other review information such as reader feedback, FlaggedRevs, etc.
* Recognizing reviewer organizations with logos and links would be a nice incentive for organizations to join the program
* Should allow for both short and extended comments
* Should enable a simple yes/no decision on whether an article is suitable for, e.g., inclusion in an offline distribution
* Should recognize that multiple fields of expertise may apply to any given article
* Should support e-mail as a delivery mechanism for new articles requiring review (to mirror the traditional peer review process)
* Should capture reviewers' conflicts of interest
* Should initially require only minimal effort for credentials validation, likely by offloading this requirement to other organizations
** E.g., in the current Encyclopedia of Life prototype, EOL applies its own criteria for curator expertise
* Should have one or more authentication mechanisms:
** based on rights assignment to individual users, possibly granted at sign-up through a separately authenticated sign-up form?
** based on URL keys that can be used in e-mails etc.?
** based on open authentication/authorization standards such as Shibboleth?
* May need to interface with WikiProjects for purposes such as selecting articles requiring review, outreach, etc.
* Should not cannibalize editing - reviews should be coupled with an invitation to improve the article directly
** But: reviewers must be independent and not substantially involved in the article's development. Reviewing a revision you haven't touched and then improving it going forward would be less problematic than reviewing a revision you've substantially contributed to.
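Several of the requirements above (per-revision reviews, short and extended comments, conflict-of-interest capture, a yes/no inclusion decision, and URL keys usable in review-invitation e-mails) can be sketched as a minimal data model. Everything here — the field names, the token scheme, the secret — is an illustrative assumption, not a specification.

```python
import hmac
import hashlib
from dataclasses import dataclass, field

@dataclass
class ExpertReview:
    revision_id: int                  # the exact revision reviewed
    reviewer: str
    credentialing_org: str            # e.g. the partner that vetted the reviewer
    fields_of_expertise: list = field(default_factory=list)
    short_comment: str = ""
    extended_comment: str = ""
    conflicts_of_interest: str = ""   # free-text COI disclosure
    include_offline: bool = False     # simple yes/no distribution decision

# Hypothetical URL-key scheme for e-mail delivery: an HMAC over the
# reviewer and revision, so the link both identifies and authenticates
# the invited reviewer without requiring a wiki account.
SECRET = b"replace-with-server-side-secret"

def review_url_key(reviewer: str, revision_id: int) -> str:
    msg = f"{reviewer}:{revision_id}".encode()
    return hmac.new(SECRET, msg, hashlib.sha256).hexdigest()[:16]

def verify_url_key(reviewer: str, revision_id: int, key: str) -> bool:
    return hmac.compare_digest(review_url_key(reviewer, revision_id), key)

key = review_url_key("jdoe", 401234567)
assert verify_url_key("jdoe", 401234567, key)
assert not verify_url_key("jdoe", 401234568, key)
```

Truncating the HMAC keeps the e-mailed URL short; a production scheme would also want expiry and server-side revocation, which are out of scope for these notes.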
== Notes ==
* ↑ In future, we may extend the model further to cover other projects, especially Wikibooks and Wikiversity.