Wikipedia:Bots/Requests for approval
All editors are encouraged to participate in the requests below – your comments are appreciated more than you may think!
New to bots on Wikipedia? Read these primers!
- Approval process – How these discussions work
- Overview/Policy – What bots are/What they can (or can't) do
- Dictionary – Explains bot-related jargon
To run a bot on the English Wikipedia, you must first get it approved. Follow the instructions below to add a request. If you are not familiar with programming, consider asking someone else to run a bot for you.
Instructions for bot operators
Bot-related archives: 1–86
Bot Name | Status | Created | Last editor | Last edited | Last BAG editor | Last BAG edit
---|---|---|---|---|---|---
BaranBOT 2 (T|C|B|F) | Open | 2024-05-27, 14:01:46 | DreamRimmer | 2024-06-07, 14:01:25 | Primefac | 2024-06-06, 15:00:25 |
Dušan Kreheľ (bot) VIII (T|C|B|F) | Open | 2024-04-07, 05:50:30 | Dušan Kreheľ | 2024-04-07, 05:56:34 | Never edited by BAG | n/a |
Dušan Kreheľ (bot) VII (T|C|B|F) | Open | 2024-02-16, 09:24:40 | EggRoll97 | 2024-06-02, 04:21:37 | ProcrastinatingReader | 2024-02-20, 13:13:05 |
BattyBot 81 (T|C|B|F) | On hold | 2024-02-07, 14:12:49 | ProcrastinatingReader | 2024-02-15, 12:09:35 | ProcrastinatingReader | 2024-02-15, 12:09:35 |
DannyS712 bot III 74 (T|C|B|F) | In trial | 2024-05-09, 00:02:12 | DannyS712 | 2024-05-09, 16:13:34 | ProcrastinatingReader | 2024-05-09, 10:58:36 |
StradBot 2 (T|C|B|F) | In trial | 2024-02-17, 03:20:39 | SD0001 | 2024-02-17, 05:58:51 | SD0001 | 2024-02-17, 05:58:51 |
PearBOT 14 (T|C|B|F) | In trial: User response needed! | 2023-12-28, 20:06:39 | Primefac | 2024-02-18, 20:11:21 | Primefac | 2024-02-18, 20:11:21 |
CapsuleBot 2 (T|C|B|F) | Extended trial | 2023-06-14, 00:14:29 | Capsulecap | 2024-01-20, 02:36:30 | Primefac | 2024-01-15, 07:40:39 |
AussieBot 1 (T|C|B|F) | Extended trial: User response needed! | 2023-03-22, 01:57:36 | Hawkeye7 | 2024-02-18, 23:33:13 | Primefac | 2024-02-18, 20:10:45 |
DoggoBot 10 (T|C|B|F) | In trial | 2023-03-02, 02:55:00 | Frostly | 2024-02-21, 22:41:18 | Primefac | 2024-01-15, 07:40:49 |
Qwerfjkl (bot) 30 (T|C|B|F) | Trial complete | 2024-06-05, 20:51:40 | Qwerfjkl | 2024-06-07, 07:11:03 | SD0001 | 2024-06-06, 04:07:27 |
PrimeBOT 39 (T|C|B|F) | On hold | 2023-05-11, 12:48:50 | Primefac | 2023-09-22, 10:51:59 | Headbomb | 2023-07-02, 17:38:58 |
Current requests for approval
- The following discussion is an archived debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA. The result of the discussion was Approved.
Operator: EdoDodo (talk · contribs)
Automatic or Manually assisted: Automated, actual runs are not supervised but messages it delivers are (see below for details).
Programming language(s): PHP (using Peachy)
Source code available: Will be (partially) available later on. Available on request.
Function overview: The bot will deliver messages based on requests from a web interface. These requests will be checked manually before running. Trusted users will be marked as 'confirmed' on the website, and will be able to review other people's requests, as well as their own.
Links to relevant discussions (where appropriate): Task is very similar to that of other message/newsletter delivery bots, so no discussion is necessary (uncontroversial).
Edit period(s): Hourly, or on demand (if there are checked requests waiting to be delivered). Earlier versions of this request proposed once a day at non-peak times, then four times daily.
Estimated number of pages affected: Depends on how many requests are made, currently I really have no idea so cannot make an estimate.
Exclusion compliant (Y/N): Yes
Already has a bot flag (Y/N): No
Function details: Anybody will be able to request a delivery here. The requests will be checked before the bot runs them. Regular users can register on the website, and once verified by me or another administrator will be able to help check requests, as well as checking their own. The bot will normally run once a day at a non-peak time to deliver all checked messages. If a message is marked as 'urgent' then the bot may be manually started to run out-of-schedule to deliver that message. Feel free to look around the interface to see how it works; most of the functioning should be fairly self-explanatory. Feel free to submit test requests; I won't check any of them, so none of them will run. If you would like confirmed user status to look at the restricted parts of the interface, drop me a message on my talk page or talk to me on IRC. If I do give you checked user status, please note that the bot IS completely functional: if you verify a request it WILL run (at midnight UTC), so make sure you unverify it, since the bot is not approved yet.
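The bot itself is written in PHP with Peachy, so the following is only a hedged illustration, not the operator's code: a minimal Python sketch of the per-page {{bots}}/{{nobots}} exclusion check a delivery run like the one described above would apply before posting, with fetch/append left as hypothetical I/O helpers.

```python
import re

def is_excluded(wikitext, botname):
    """Return True if the talk page opts out of edits by `botname` via {{bots}}/{{nobots}}."""
    if re.search(r"\{\{\s*nobots\s*\}\}", wikitext, re.IGNORECASE):
        return True
    deny = re.search(r"\{\{\s*bots\s*\|[^}]*deny\s*=\s*([^}|]+)", wikitext, re.IGNORECASE)
    if deny:
        denied = [n.strip().lower() for n in deny.group(1).split(",")]
        return "all" in denied or botname.lower() in denied
    allow = re.search(r"\{\{\s*bots\s*\|[^}]*allow\s*=\s*([^}|]+)", wikitext, re.IGNORECASE)
    if allow:
        allowed = [n.strip().lower() for n in allow.group(1).split(",")]
        return not ("all" in allowed or botname.lower() in allowed)
    return False

def deliver(targets, message, botname, fetch, append):
    """fetch(title) -> wikitext and append(title, text) are hypothetical wrappers around the API."""
    skipped = []
    for title in targets:
        if is_excluded(fetch(title), botname):
            skipped.append(title)  # honour the opt-out; report it in the delivery log
            continue
        append(title, message)
    return skipped
```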
Discussion
In case any of you are wondering why I didn't request this as a new task for DodoBot, it's because this bot will, in a sense, be operated by multiple people, since any trusted user could verify a request and the bot would run it, even without me seeing what goes on. Although I realize that I am still responsible for the bot's actions I didn't want it to have my name in its username, since that would imply that I am the bot's only operator. - EdoDodo talk 09:38, 7 July 2010 (UTC)[reply]
- EdwardsBot does something that's roughly the same, so the idea in principle is fine. I'm curious why the source code would only be partially released. The login / register form should make it explicit that Wikimedia login credentials should not be used. How will you be verifying that the user on the Toolserver tool is the same user on the Wikimedia site? Is there going to be some sort of sanity check to ensure that some users don't abuse the bot? "Abuse" is a strong word, but sometimes people go around needlessly trying to find posts to spam, mostly because they're bored and it's summertime. --MZMcBride (talk) 15:56, 7 July 2010 (UTC)[reply]
- Well, I'd be happy to publish the complete source code... but I'm a bit worried about the security of what I've developed. I'm sure the code isn't entirely bullet-proof, and there are security issues in it somewhere, and publishing it would only make it easier for malicious users to find these. I put the "partially" in brackets because once I've properly checked the code and I'm confident it's secure I will then make it completely available (with the exception of the files including the database connection and bot login information, of course). I will add a couple of lines to the login form to make sure users don't use their Wikimedia credentials later today. As for verifying that the user is the same, I'm not sure how I would do this - or even if it is necessary. If the message delivery is uncontroversial then there is no reason why it shouldn't be run - remember that all runs are manually checked. If it is controversial, and could be seen as spam, then there is a field to include a link to a relevant discussion that shows consensus for it. With that said, if there is consensus that it is necessary to have users confirm their identities, I can add a field to have the user provide a diff of an edit on Wikipedia confirming that he is the same person. - EdoDodo talk 16:56, 7 July 2010 (UTC)[reply]
- Or perhaps the system could be changed so that only the approved users can request new tasks. This would go a bit against the original idea I had of a more open message delivery system, but then again it would probably always be the same few people involved with newsletters requesting deliveries, so there wouldn't be too much harm in requiring registration and approval. Opinions? - EdoDodo talk 10:38, 8 July 2010 (UTC)[reply]
- Regarding your concerns over posting the source code, read security through obscurity.
- I think verifying that the username entered actually belongs to the requesting user is important; as you say, it will probably be the same few people requesting deliveries, and you wouldn't want some troublemaker to try to capitalize on the trust you will probably develop in those few by pretending to be them. This doesn't have to be anything terribly complicated; one possible method is to require each prospective submission be confirmed by having the requesting account edit a specific page in the bot's userspace to contain a non-predictable randomly-generated string specific to that submission. The submission form could then easily enough output a form with all hidden fields and a "submit" button that targets the edit form for that page with the necessary data pre-filled, and even submit that form with Javascript; see the "Do stuff for me" button on tools:~dispenser/view/Checklinks for an example of this technique.
- It also needs to be possible to determine who approved the message for delivery, per WP:BOTPOL#Bots operated by multiple users; small print or even a <!-- comment --> at the end of the delivered message should suffice, IMO. I would also recommend a fully-protected page on-wiki and linked from the bot's userpage listing all the trusted users, such that an admin can remove someone from this list to prevent any more messages approved by this user from being sent (otherwise, if someone approves bad messages then the bot will have to be blocked until you manually remove that person's access). You should also check if the requester/approver is currently blocked, and similarly refuse to post any messages requested/approved by that person. Both of these, of course, should periodically be rechecked during long delivery runs (you could probably check these in the same API request you use to check bot exclusion).
- You may also want to include in your submission form the ability to specify a value for the optout parameter to {{bots}}, although at this time none of the predefined values are likely to apply. Anomie⚔ 21:22, 9 July 2010 (UTC)[reply]
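A minimal sketch, assuming nothing about the actual web interface, of the confirmation step suggested above: the tool generates a non-predictable token per submission, the requesting account saves it to a page in the bot's userspace, and the tool checks it back. The page title and helper names are illustrative only.

```python
import secrets

def new_confirmation(submission_id):
    """Return (page_title, token); the requester must save the token to that page."""
    token = secrets.token_hex(16)  # non-predictable, specific to this submission
    page = "User:MessageDeliveryBot/confirm/%d" % submission_id
    return page, token

def is_confirmed(page_text, last_editor, expected_requester, token):
    """Accept only if the exact token was saved and the last edit came from the requester.
    page_text and last_editor would come from an API query (prop=revisions)."""
    return token in page_text and last_editor == expected_requester
```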
- Thank you for all these excellent suggestions. I will work on implementing them all in the next few days. - EdoDodo talk 06:26, 10 July 2010 (UTC)[reply]
- I have a small question... would a kind of blacklist (of users whose approvals should be ignored) on Wikipedia instead of a whitelist (of approved users) also be acceptable? As the system is currently laid out this would be significantly easier to implement, and also I wouldn't have to go looking for an admin every time I want to approve a new user. If it would be desirable to show a list of all approved users then I can do this on the website instead. - EdoDodo talk 07:20, 10 July 2010 (UTC)[reply]
Done I've implemented all of the changes discussed above, with the exception of the ability to set the optout parameter because, as Anomie said, none of the predefined values really apply anyway. As well as this, the bot now follows redirects to other "User talk:" pages, leaving a note saying where it was redirected from. The bot also posts a log to the requester's talk page, either saying that everything went fine or stating which users the bot encountered errors on. The message about a successful delivery is only sent if the user chose to have it sent when he was making the request. The notification about errors is always sent (unless the requester has {{nobots}} on his page, or has it protected, in which case the error log is sent to me instead, so I can take a look at it). - EdoDodo talk 21:33, 10 July 2010 (UTC)[reply]
I have done some test edits in my namespace and the bot seems to function as expected. Examples:
- Normal message
- Redirected
- Successful delivery (only if user has opted in when making request)
Errors delivering (page had {{nobots}}) - accidentally had one newline too many in there; it has been removed now. Was a malfunction, so much for functioning as expected; here's a good edit: [1]
All edits went as expected; however, more suggestions on how this could be improved are always welcome. - EdoDodo talk 11:26, 11 July 2010 (UTC)[reply]
- I'm not sure about the error reporting. I see you have a reason at the end of the message, but the reason given is rather vague and if there is more than one failed user that wouldn't make a whole lot of sense anyway. I'd rather see each list entry as something like "* Example - bot excluded" or "* Example - page protected". Also, does it need some sort of reference to the message that was being delivered? I'm not sure. Anomie⚔ 15:04, 11 July 2010 (UTC)[reply]
- Whoops, the edit I linked was actually a malfunction (the message at the end wasn't supposed to be there). I have added a note stating the reason why the message delivery failed, as well as a reference to the title, so that if the user has placed more than one request he will know which one I am talking about. For an example of how this looks see this edit - EdoDodo talk 15:53, 11 July 2010 (UTC)[reply]
It'd be nice if it used a standard edit summary so that you get the section anchor links (from /* foo */) in things like watchlists. --MZMcBride (talk) 19:41, 11 July 2010 (UTC)[reply]
- I'm sorry, I don't think I understand what you mean. You mean have the edit summary be something like:
- /* $messageTitle */ Delivering message from $requester
- Because having the /* */ around the username of the requester wouldn't really help much since it would just result in a broken section link. - EdoDodo talk 08:44, 12 July 2010 (UTC)[reply]
I have changed the edit summaries as discussed above. As well as this, I have changed the login system to use TUSC (the user has to have a valid TUSC account that is approved in order to log in). This ensures that all usernames are linked to a Wikipedia account of the same name, and allows better identification of approved users. I have also added a page that will list all approved users here so that if somebody needs to contact a user about their request, or an admin needs to see who can operate the bot, they will know. - EdoDodo talk 08:26, 18 July 2010 (UTC)[reply]
- Cool. This is atypical of a bot task, so take as much time as you want for your trial, maybe a week or so of full functionality. If you need a bit more, just leave a note here explaining why. Good luck. Tim1357 talk 23:15, 20 July 2010 (UTC)[reply]
- Approved for trial. Please provide a link to the relevant contributions and/or diffs when the trial is complete. Tim1357 talk 23:15, 20 July 2010 (UTC)[reply]
I might be sending the announcement for the end of the GOCE backlog elimination drive month (ongoing discussion here). Otherwise Mono and Xeno both expressed interest in having message(s) delivered. So, should be getting some test edits pretty soon (around end of month). - EdoDodo talk 20:35, 22 July 2010 (UTC)[reply]
Okay, I've done a trial run delivering WikiProject NASCAR's first newsletter ( {{Wikipedia:WikiProject NASCAR/Newsletter/201007}} ). The delivery went mostly smoothly, although there was an error with an improperly handled exception that stopped the bot halfway through (see the break between leaving Recury the message and leaving it to Royalbroil). This error has been fixed and should not occur again. As mentioned above, there is the possibility of more messages being delivered soon, so if you would like to keep the bot in trial to make sure that the error doesn't occur again in future deliveries, I have no objections. - EdoDodo talk 21:08, 25 July 2010 (UTC)[reply]
Just ran another request, for Xeno, delivering WikiProject Malaysia's newsletter. This time there were no issues (ran the bot out of schedule so I could keep an eye on it while it ran). Xeno also suggested that I run the bot more frequently than once a day, perhaps four times a day. Would anyone have an issue with this? It really shouldn't change much for Wikipedia's servers; the bot won't even log in if there aren't requests waiting. - EdoDodo talk 16:55, 29 July 2010 (UTC)[reply]
- I don't see any issue with it attempting to run four times a day (or even more often than that). It will only run if there is something to do, so... –xenotalk 17:23, 29 July 2010 (UTC)[reply]
Trial complete. Have done a couple of delivery requests, and made about 50 edits. Basically ran the bot at full functionality for a week, as requested, so I would say that the trial is complete. See above for how the trial went. - EdoDodo talk 17:03, 29 July 2010 (UTC)[reply]
Have also delivered the final newsletter for the GOCE backlog elimination drive. That went very well, although it did deliver the message to a user that didn't exist ([2]). The bot has now been changed so that if a user doesn't exist, or is blocked, it will report this to the requester instead of just delivering the message anyway. - EdoDodo talk 18:36, 1 August 2010 (UTC)[reply]
It is now possible for users to run the bot manually, and see the output from it on a web page as it runs. Approved users can find the link under 'Run bot' at the bottom of the list of links, once they are logged in. The bot will not run if it is already running (on somebody else's web browser, or as a scheduled task), or if it severely malfunctioned (unhandled exceptions, etc.) at the last run. - EdoDodo talk 12:02, 2 August 2010 (UTC)[reply]
- Oh, I forgot to mention, even when running "in somebody's web browser" (it's not really, they're just being sent the output real-time), complete logs of the run are kept on the Toolserver for debugging and stuff. - EdoDodo talk 12:07, 2 August 2010 (UTC)[reply]
- I've just tested this new feature and I gotta say, it's quite slick. EdoDodo is a responsible operator and this bot seems to be chugging along quite nicely. I see no barriers to approval. –xenotalk 19:51, 2 August 2010 (UTC)[reply]
Approved. MBisanz talk 15:14, 3 August 2010 (UTC)[reply]
- The above discussion is preserved as an archive of the debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA.
- The following discussion is an archived debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA. The result of the discussion was Approved.
Operator: Traveler100 (talk · contribs)
Automatic or Manually assisted: Automatic
Programming language(s): AutoWikiBrowser
Source code available: user:People-photo-bot/source-v1
Function overview: Move article talk pages currently in category Wikipedia requested photographs of people into sub-categories so that each category contains a manageable number of articles.
Links to relevant discussions (where appropriate): Notification at:
- Wikipedia talk:WikiProject Biography#photo requests
- Wikipedia talk:WikiProject Images and Media#photo requests
- Template talk:WPBiography#more categories for biographies
- Category talk:Wikipedia requested photographs of people
- Category talk:Wikipedia requested photographs of sportspeople
Edit period(s): Periodically
Estimated number of pages affected: In the end, over 20,000 talk pages, but over a long period; estimate about 100 a week.
Exclusion compliant (Y/N):
Already has a bot flag (Y/N):
Function details:
- Go through article talk pages in Category:Wikipedia requested photographs of people
- If the article talk page contains a known WikiProject
- remove yes from the need-photo parameter in the WPBiography template
- If a reqphoto template exists on the talk page
- add a category to the reqphoto template based on the WikiProject found
- if reqphoto contains the parameter people or sportspeople
- remove the parameter
- else
- add a reqphoto template with the category shown in the list for the WikiProject found (see the sketch after this list)
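The task is run with AutoWikiBrowser, so its real form is AWB find-and-replace rules; purely as an illustration of the per-page logic listed above, here is a hedged Python/regex sketch. The WikiProject-to-category mapping shown is an assumption, not the operator's actual list.

```python
import re

# Assumed example mapping from WikiProject banner name to reqphoto sub-category.
PROJECT_TO_CATEGORY = {
    "WikiProject Cricket": "sportspeople",
    "WikiProject Politics": "politicians and government-people",
}

def retarget_photo_request(talk_wikitext):
    project = next((p for p in PROJECT_TO_CATEGORY
                    if re.search(r"\{\{\s*" + re.escape(p), talk_wikitext, re.IGNORECASE)),
                   None)
    if project is None:
        return talk_wikitext  # no known WikiProject banner: skip the page
    subcat = PROJECT_TO_CATEGORY[project]
    # blank the need-photo=yes flag in {{WPBiography}}
    text = re.sub(r"(\|\s*need-photo\s*=\s*)yes", r"\1", talk_wikitext, flags=re.IGNORECASE)
    if re.search(r"\{\{\s*reqphoto", text, re.IGNORECASE):
        # drop a bare people/sportspeople parameter, then add the project-specific one
        text = re.sub(r"(\{\{\s*reqphoto[^}]*?)\|\s*(?:people|sportspeople)\s*(?=[|}])",
                      r"\1", text, flags=re.IGNORECASE)
        text = re.sub(r"(\{\{\s*reqphoto)", r"\1|" + subcat, text, count=1, flags=re.IGNORECASE)
    else:
        text = "{{reqphoto|" + subcat + "}}\n" + text
    return text
```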
Discussion
Can we assume, given the totally underwhelming response on this subject where it has been notified, and the fact that no one has tackled the requested photographs of people category for years, that it would be fine to go along with this proposal? --Traveler100 (talk) 09:51, 11 July 2010 (UTC)[reply]
Approved for trial (50 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. Sorry for the wait. — The Earwig (talk) 16:13, 21 July 2010 (UTC)[reply]
Did a few extra so that all templates can be tested. Minor change to AWB code. --Traveler100 (talk) 15:03, 27 July 2010 (UTC)[reply]
Trial complete.
- The following discussion is an archived debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA. The result of the discussion was Withdrawn by operator.
Operator: Droll (talk · contribs)
Automatic or Manually assisted: Manually Assisted. I will only be using AWB to start.
Programming language(s): AutoWikiBrowser for now.
Source code available: I don't think any is required to use AutoWikiBrowser in normal mode. I'm fairly proficient in regular expressions.
Function overview: Mostly I've been migrating and cleaning up templates. Fairly recently I edited all articles which transclude {{Infobox mountain}}. Most of the parameter names were in the process of being updated. There are over 9,000 articles that transclude the template. I hope to continue to work on templates related to geography. I'm also interested in zoology, especially ornithology.
Links to relevant discussions (where appropriate): Currently none.
Edit period(s): Periodic and one time.
Estimated number of pages affected: My regular user account Droll has about 58,000 total edits mostly using AWB. So I expect I might make at least 10,000 edits a year with this account.
Exclusion compliant (Y/N): Yes. AWB is exclusion compliant by default.
Already has a bot flag (Y/N):
Function details: None currently.
Discussion
Other users who watch pages that I work on would benefit if this account had a bot flag. They could hide the pages that I edit and thus be better able to monitor edits that are more likely to need revision. I have a very high speed internet connection and even though I try not to exceed the expected number of edits per unit of time, I was advised once to slow down. As I understand it, bot accounts are allowed to edit a little faster. Periodically I come across tasks that are useful but that are totally mindless in execution. I'd love to be able to do this kind of work robotically. I've learned the hard way that it is important to be meticulous. Early on I made a mistake and had to revert a large number of articles. I don't want to repeat that experience. I'm a retired programmer. I got a degree in CS back in 1985. I hope that in the future I can make a more substantial contribution to Wikipedia. Not necessarily in volume but in quality. I hope this includes the information you need. Thanks for your consideration and patience. I can be too wordy at times, I guess. DrollBot (talk) 09:09, 6 July 2010 (UTC)[reply]
- Hey Droll. I'm glad you've taken an interest in bots! It seems you have enough experience around here to be responsible enough with your bot. However, before we can approve any bot task, we require that the bot operator know the Bot policy. The second thing you should do is try to refine what it is you want to do with this task. "Cleaning up templates" is a bit vague for a bot task (note you may always open more BRFAs for the same bot, or bundle multiple similar tasks into one BRFA). The third, and most important, thing you need to do is to prove that there is community consensus for the task to be performed by a bot. This may come in the form of a Village pump discussion, policy page, wikiproject talk page, etc. I look forward to working with you. Tim1357 talk 23:22, 16 July 2010 (UTC)[reply]
- A user has requested the attention of the operator. Once the operator has seen this message and replied, please deactivate this tag. (user notified) Do you wish to continue with this request? MBisanz talk 04:55, 10 August 2010 (UTC)[reply]
- Withdrawn. MBisanz talk 03:53, 12 August 2010 (UTC)[reply]
- The above discussion is preserved as an archive of the debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA.
- The following discussion is an archived debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA. The result of the discussion was Denied.
Operator: Sahimac (talk · contribs)
Automatic or Manually assisted: Semi Automatic
Programming language(s): Standard pywikipedia AND Full python language.
Source code available: Create links interwiki - Insert welcome in user talk for new users!
Function overview: Create links interwiki - Insert welcome in user talk for new users!
Links to relevant discussions (where appropriate):
Edit period(s): irregularly
Estimated number of pages affected: there are currently 46
Exclusion compliant (Y/N): Yes
Already has a bot flag (Y/N): No
Function details: Create links interwiki - Insert welcome in user talk for new users! and Performed by pywikipedia.
Discussion
Hi there, Sahimac. As you know, the bot was blocked because you were running it without approval. No big deal; it's both excusable and understandable considering you come from another wiki that has different bot rules. Let's talk about your tasks. The welcome task can't be done, unfortunately – see this helpful page for information why, but interwiki links are definitely possible. Have you familiarized yourself with the local bot policy, especially the section on interwiki links? What wikis will you be linking from, and what namespaces will you be editing? I look forward to hearing from you. — The Earwig (talk) 04:08, 4 July 2010 (UTC)[reply]
- Hi, Yes, I'm familiar with robot policies, and I read more in that area which you said. I hope to have the necessary ability in managing the bot.
- I'm going to edit in main, category, and template namespaces, and my activity is almost in all wikis, but mostly from Persian wiki to English and vice versa.sahim (talk) 09:39, 4 July 2010 (UTC)[reply]
- You do not seem to have read and understood the section on interwiki links, or you would not be intending to run in the template namespace with interwiki.py. This does not give me confidence that you will be a conscientious bot operator and will be able to appropriately respond to any complaints. Anomie⚔ 16:53, 5 July 2010 (UTC)[reply]
- but I was told that: (Interwiki bots should not run unsupervised in Template namespace unless specifically designed to run on templates.) Had read. However, I am waiting.sahim (talk) 18:24, 5 July 2010 (UTC)[reply]
- Exactly. And it also says "As of May 2009, the standard interwiki module in pywikipedia does not meet these requirements". Which means you may not run interwiki.py in the Template namespace, as interwiki.py is not specifically designed to run on templates. Anomie⚔ 20:10, 5 July 2010 (UTC)[reply]
well,Can be said:{I'm going to edit in main, category, [and template( And ,So we can say that: Using Python language or I will not use patterns from space or I delete this section)] namespaces, and my activity is almost in all wikis, but mostly from Persian wiki to English and vice versa.}== Now will result without Python language(No standard pywikipedia): (I'm going to edit in main, category namespaces, and my activity is almost in all wikis, but mostly from Persian wiki to English and vice versa.).Finally ,What is your opinion now? this good or bad?--sahim (talk) 00:08, 6 July 2010 (UTC)[reply]
- Changed to Standard pywikipedia AND Full python language For Removing ambiguity!sahim (talk) 07:28, 6 July 2010 (UTC)[reply]
- According to the above, you expect to edit 46 pages. Why is a bot needed? Johnuniq (talk) 10:21, 6 July 2010 (UTC)[reply]
- I see that you now intend to write your own interwiki code that will correctly handle templates. Please post this code so it may be reviewed. Anomie⚔ 17:13, 6 July 2010 (UTC)[reply]
- please look see!--sahim (talk) 07:21, 7 July 2010 (UTC)[reply]
- There is some serious confusion here because the page that you link to contains the standard (not modified) interwiki.py from Pywikipediabot. Johnuniq (talk) 10:53, 7 July 2010 (UTC)[reply]
- Yes, this is interwiki.py for interwiki in all parts of the wiki.Be sure, the other does not problem in creat interwiki exist. it is "nightly wikipedia" in metawiki And the Python language.sahim (talk) 20:20, 7 July 2010 (UTC)[reply]
- I see no reason to prolong this. The proposed bot seems unsuitable per Anomie's comments above regarding usage for templates. When asked for the source for "your own interwiki code that will correctly handle templates", the operator posted the unchanged interwiki.py with no indication that it is an unpatched duplicate, nor any understanding of how confusing it would be for people reviewing this, not to mention pointless. Whatever the reason, it is clear that this bot is not sufficiently justified and the operator is not able to suitably communicate on English Wikipedia. Johnuniq (talk) 00:55, 10 July 2010 (UTC)[reply]
- Why say, I do not have the ability to run the robot? The first time I told Pywikipedia and Python languages. What is the other problem? I Python language - unless otherwise no part of the Python language? Otherwise use the problem for python language? If not, then what other problem? . I'd do all the things that I create interwiki, not something else.Please Take a look here to see the contributions to the robot what was in the Persian Wikipedia (create more interwiki). I guarantee that I will not do wrong - otherwise the robot close forever If the wrong! . However, final decisions with your managers in wiki English ,anyway thank you will answer that.sahim (talk) 03:50, 10 July 2010 (UTC)[reply]
Denied. Johnuniq's analysis above saves me having to write the same thing. Anomie⚔ 12:04, 14 July 2010 (UTC)[reply]
- The above discussion is preserved as an archive of the debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA.
- The following discussion is an archived debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA. The result of the discussion was Approved.
Operator: Arielbackenroth (talk · contribs)
Automatic or Manually assisted: Automatic and supervised
Programming language(s): Python
Source code available: available here
Function overview: Will add external links from authors and their works to the Open Library.
Links to relevant discussions (where appropriate):
Edit period(s): Not sure yet; likely an initial run to correlate Wikipedia articles to their corresponding Open Library pages, and then occasional runs to update new articles.
Estimated number of pages affected: In the first run, probably on the order of tens of thousands of articles for authors whose works are available via the Open Library. Expanding the scope of the program to link between works on Wikipedia and OL will be done later.
Exclusion compliant (Y/N): Y
Already has a bot flag (Y/N): N
Function details: This bot will insert interwiki links from wikipedia to Open Library using the templates Template:OL_author and Template:OL_work. The links between wikipedia articles to Open Library articles will be restricted to:
- Works available to read online at Open Library, such as Frankenstein.
- Works available in the print-disabled collection at Open Library, such as Harry Potter and the Philosopher's Stone.
- Authors whose works are available at Open Library per the conditions above such as Allan Cunningham, Oscar Wilde, and J. K. Rowling.
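The operator's actual Python code is linked above; purely as a rough, hedged sketch of the core step described here (add an {{OL author}} link only when the page has an External links section and is not already linked), something like the following:

```python
import re

def add_ol_author(article_wikitext, ol_author_id):
    """Return updated wikitext, or None when no edit should be made."""
    # already linked, with either spelling of the template name?
    if re.search(r"\{\{\s*OL[ _]author\b", article_wikitext, re.IGNORECASE):
        return None
    # only edit when an "External links" section already exists
    heading = re.search(r"^==\s*External links\s*==", article_wikitext,
                        re.IGNORECASE | re.MULTILINE)
    if heading is None:
        return None
    # ol_author_id is the Open Library author identifier string
    line = "\n* {{OL author|%s}}" % ol_author_id
    return article_wikitext[:heading.end()] + line + article_wikitext[heading.end():]
```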
Discussion
- Personally, I don't see the good of this task. When a link added by this user to Divine Comedy showed up on my watchlist and I went to the Open Library page to check it out, I failed to see that the link added any value to the article. Some of the OL content may be useful in certain cases, I guess, if they have digital copies of books that aren't available elsewhere online; but that wasn't so in this case, where links to booksellers seemed the main feature. My reading of WP:ELMAYBE and WP:ELNO suggests that botspamming these links indiscriminately across Wikipedia is almost certainly a bad idea. Deor (talk) 02:38, 1 July 2010 (UTC)[reply]
- Thanks for the feedback. Sorry for the link on the Divine Comedy - I added a few links to gauge the utility of this proposal and while the link from Divine Comedy to Open Library may not be immediately useful, others are. Please have a look at the links added to the articles mentioned in the proposal.
- Based on your feedback I've revised the proposal to be more restrictive and consistent with other interwiki links such as IMSLP. Links will only be inserted per the revised proposal, where there is a book to read at Open Library. Arielbackenroth (talk) 18:43, 1 July 2010 (UTC)[reply]
Needs wider discussion. Bot-adding of external links is often quite controversial, as some editors see it as spam. I notice that Open Library is affiliated with the Internet Archive, so you probably have a chance. Take the discussion to WP:Village pump (proposals), and post invitations to the discussion at WP:External links/Noticeboard, WT:WikiProject Books, WT:WikiProject Spam, maybe WT:External links, and anyplace else that seems appropriate. See WP:Publicising discussions for more info. Unless that discussion shows a strong community support for these links, this bot cannot be approved. I am also a bit concerned that you have just started editing Wednesday, and all your edits have been focused on adding these links to Wikipedia; prospective bot operators normally have much more experience with Wikipedia and its policies and guidelines as normal editors. Anomie⚔ 04:41, 3 July 2010 (UTC)[reply]
- I totally agree with Deor and Anomie. A quick look makes me think that the cause is noble, but a very wide discussion and solid consensus would be required before a bot starts adding what many will regard as promotional links. Johnuniq (talk) 10:29, 6 July 2010 (UTC)[reply]
- Discussion moved to WP:Village pump (proposals). Edward (talk) 22:45, 6 July 2010 (UTC)[reply]
- I see the discussion there fizzled, but of the two commenters one didn't seem to notice that links will only be added when Open Library has full text available and the other changed to support after that was pointed out. I'd like to review the code (I'd hate for there to be objections due to a code error), then I'll approve a trial to try to flush out more comments. For the purposes of the trial, I will be asking you to include a clear link to this page in the edit summaries (e.g. "([[Wikipedia:Bots/Requests for approval/OpenlibraryBot|discuss this trial]])") and, if at all possible, to choose the most popular books/authors and a variety of different genres for the trial edits. Anomie⚔ 17:05, 15 July 2010 (UTC)[reply]
- Can you do what Anomie is asking? MBisanz talk 04:54, 10 August 2010 (UTC)[reply]
- Sorry for the delayed response, I've been on vacation. This sounds good. I'll start writing up the code for your review and once that's done I'll run it as suggested. Arielbackenroth (talk) 21:04, 18 August 2010 (UTC)[reply]
- Excellent, thanks for the response. MBisanz talk 01:54, 19 August 2010 (UTC)[reply]
- I've updated the link above, to the VPR discussion.
- I'm here to comment, because I saw the bot addition to Kevin Kelly (editor) on my watchlist. I endorsed the original idea at the VPR thread, and based on this test run, I reiterate my support. HTH. -- Quiddity (talk) 01:40, 20 August 2010 (UTC)[reply]
- I've done a test run of the bot on 15 or so wikipedia pages using the edit summary suggested. The pages I selected are mostly well known authors who have works readable on Open Library. I also chose a couple of books. The code is available for your review on github. Any feedback is appreciated. I'm gonna go through and add a README and some comments now. I tried as best as possible to preserve whitespace and only insert into obvious looking lists of external links but there are likely more tweaks needed. Arielbackenroth (talk) 23:57, 20 August 2010 (UTC)[reply]
- Sorry it took so long for me to look at the code. I see your check for whether the link template is already in the page is a bit broken; for example, it will miss {{OL author|...}} (i.e. leaving out the underscore). It might be best to just use the API's prop=extlinks to check for the link directly (of course, that depends on there not being a prohibitive number of equivalent URLs to check for). I do like how it ensures that an "External links" section exists before trying to do anything.
- You really shouldn't have made any trial edits without approval here, which will be given using the {{BotTrial}} template. I don't see any problems other than that in the edits, though. Anomie⚔ 01:08, 29 August 2010 (UTC)[reply]
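For reference, a small sketch of the prop=extlinks check suggested above, using the standard MediaWiki API; the client-side filtering on openlibrary.org is an assumed approach, not the bot's shipped code.

```python
import requests

API = "https://en.wikipedia.org/w/api.php"

def has_openlibrary_link(title):
    """True if the article already carries any external link to openlibrary.org."""
    params = {
        "action": "query",
        "prop": "extlinks",
        "titles": title,
        "ellimit": "max",
        "format": "json",
    }
    data = requests.get(API, params=params, timeout=30).json()
    # Note: a very heavily linked page may need 'continue' handling, omitted here.
    for page in data.get("query", {}).get("pages", {}).values():
        for link in page.get("extlinks", []):
            if "openlibrary.org" in link.get("*", ""):
                return True
    return False
```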
- Thanks for the feedback - I've updated the code to both check extlinks and templates for any existing links. Let me know what the next steps are. Arielbackenroth (talk) 21:00, 30 August 2010 (UTC)[reply]
- Looks good. The next step is to give it a trial: Approved for trial (50 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. Let's see if we can flush out some comments. Anomie⚔ 00:46, 31 August 2010 (UTC)[reply]
- Ok - so the 50 trial edits are done (I accidentally did 2 more and forgot to count them while testing, so I actually did 52 - sorry about that). I've also revised the script (source code available at github) to be completely supervised to ensure that the bot is running within the guidelines specified here (it does). Also - as I mentioned to Sadads, we will be reciprocating all links from Open Library to Wikipedia - this is a high priority project that I'm actively working on. Let me know what the next steps are. Thanks. Arielbackenroth (talk) 19:02, 22 September 2010 (UTC)[reply]
This looks like a great idea; I was actually about to suggest something at one of the Village pumps, but you all beat me to it. The examples I looked at looked pretty good. Sadads (talk) 17:35, 22 September 2010 (UTC)[reply]
- The bot should only add links to individuals (like Jack London = Jack London, 1876-1916), not to names (like "Franklin W. Dixon", no date of birth [3]). We could start with the OL authors who have a link "Wikipedia". --Kolja21 (talk) 00:23, 23 September 2010 (UTC)[reply]
- Why shouldn't it add a link on Franklin W. Dixon? Anomie⚔ 20:37, 29 September 2010 (UTC)[reply]
- Because it's not a link to a PERSON (or in this special case: "the pen name used by a variety of different authors"), it's a link to a NAME with no further information. Wikipedia writes about a "Franklin W. Dixon", who wrote The Hardy Boys novels. This "person" (in this special case a group) links to the name:
- OL's "Franklin W. Dixon", that has got a link to the Wikipedia-Hardy-Boys-Dixon, is a different one:
- "Franklin W. Dixon" has also written books like Experiments Intro Physics I and Nonlinearoptimization. Are these mystery series for teens? I dought it. Conclusion: First you have to identify the OL record (adding basic infos to the name), then - in a second step - you can link the OL record with a Wikipedia article. --Kolja21 (talk) 01:31, 9 October 2010 (UTC)[reply]
- I see your point - this is a shared pen name for multiple authors. Also - the open library page for the author/pen name is not particularly high quality - it looks like multiple authors got merged into one and a different Franklin W Dixon got merged in and it needs to be cleaned up. Arielbackenroth (talk) 23:34, 12 October 2010 (UTC)[reply]
- What is the current status of this request? ΔT The only constant 01:01, 11 November 2010 (UTC)[reply]
- I'm not sure what the current status is - I've completed the 50 trial edits, and with the exception of Franklin W Dixon haven't had any negative feedback and all edits made by the bot are still there. I'm looking for guidance from the bot approval group as to what the next steps are. Arielbackenroth (talk) 21:16, 17 November 2010 (UTC)[reply]
Approved. I was waiting to see if anything more happened with the Franklin W Dixon question, but it seems that is resolved. I suggest performing the edits (at least to start) at a relatively low rate of speed, and of course if anyone raises major concerns please stop the bot until the concerns are resolved. Anomie⚔ 23:19, 17 November 2010 (UTC)[reply]
- The above discussion is preserved as an archive of the debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA.
- The following discussion is an archived debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA. The result of the discussion was Withdrawn by operator.
Operator: Chris G (talk · contribs)
Automatic or Manually assisted: Auto
Programming language(s): PHP, my classes
Source code available: soon
Function overview: Deleting "Images available as identical copies on the Wikimedia Commons," per Wikipedia:CSD#F8
Links to relevant discussions (where appropriate): Wikipedia:Botreq#Adminbot_to_deal_with_the_CAT:NCT_Backlog - will also spam the notice boards once I have posted this
Edit period(s): daily
Estimated number of pages affected: 50-100 a day? (unsure; once the major backlog is gone, the amount deleted daily will be reduced significantly - I think I will introduce a hard limit of max 200 deletions per day, just to spread the load a bit)
Exclusion compliant (Y/N): Yes, obeys templates such as {{do not move to Commons}} and {{hangon}} as well as nobots
Already has a bot flag (Y/N): N
Function details:
- Gets images from Category:Wikipedia files with the same name on Wikimedia Commons
- If it complies with Wikipedia:CSD#F8 it deletes the image, otherwise the bot ignores it
Discussion
The only major problem I see would be determining whether "The image's license and source status is beyond reasonable doubt, and the license is undoubtedly accepted at Commons.", but I feel this could be done correctly by ensuring that the image is tagged with a license accepted at Commons and checking that the image does not contain any non-free image tags (e.g. sometimes Linux screenshots are tagged with both GPL and non-free tags; in such cases the bot would leave it for a human), or any deletion/license dispute tags. There will obviously be the risk of false positives, but I feel the need for such a bot to clear the backlog outweighs the small number of images that may be deleted on Commons and have to be undeleted here. --Chris 10:08, 8 June 2010 (UTC)[reply]
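Illustrative only (the operator's bot is written in PHP): a minimal Python sketch of the tag screening described above, run against the local file description page's wikitext before a deletion is even considered. The template name lists are assumptions and would need to be far more complete in practice.

```python
import re

NON_FREE_TAGS = ("Non-free", "Fairuse")                        # leave these for a human
SKIP_TAGS = ("Do not move to Commons", "Hangon", "Nobots",     # opt-outs the request mentions
             "Ffd", "Puf", "Di-")                              # deletion / licence-dispute markers
FREE_LICENSE_TAGS = ("Cc-by-sa", "Cc-by", "PD-self", "GFDL")   # examples of licences Commons accepts

def eligible_for_f8(file_page_wikitext):
    """Screen a file description page: only clearly free, uncontested files pass."""
    text = file_page_wikitext
    def has_any(prefixes):
        return any(re.search(r"\{\{\s*" + re.escape(p), text, re.IGNORECASE)
                   for p in prefixes)
    if has_any(NON_FREE_TAGS) or has_any(SKIP_TAGS):
        return False                     # ambiguous or contested: skip, don't delete
    return has_any(FREE_LICENSE_TAGS)    # require an explicitly Commons-acceptable licence
```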
- As an admin who occasionally helps out in CSD#F8, I think this is long overdue. OhanaUnitedTalk page 11:44, 8 June 2010 (UTC)[reply]
- I don't like this and am very much opposed to it. A gracious plenty of images that get moved to Commons are obvious copyvios where someone just slapped a tag on them and a well-meaning, but ignorant of image policies, user moved the image over to Commons. Others are copied there without OTRS data or with incorrect attribution information that makes it sound like the Commons user transferring the image is the uploader. While I agree that the backlog is long and needs to be dealt with, blindly processing it is not the answer. Perhaps if it were only used to delete images that were moved by some limited list of trusted users or some such thing that would be ok, but there are far too many of them that are just plain incorrect. I randomly clicked on a few images just to prove my point. The first one I clicked on was File:The Firm 2009 film.jpg, a fair use image which some time ago was deleted at Commons. The second one was File:DSCN5920-w-d-e-close 300x400.JPG, which is a user's personal photo. Even though yes, it's a free license and we can move it, etc, we have always respected that a user may not want their personal photos moved to Commons and respected their wishes in that matter. A bot would blindly delete it, but an admin would ask the user if he really wanted it to be moved. The next one was File:Wakefields knuckleball.jpg, which I see a lot of these so I'm sure it's a bot that does it, but the Commons information page gives as the date, the date that the image was transferred to Commons. A human deleting the Wikipedia image would look at this and correct the date to be the one in the EXIF data or the date it was uploaded to Wikipedia or something more useful. File:WaterlooLogo.jpg is a flickr image and the bot uploading it credits the en uploader rather than the flickr author. Sorry, but every single one of these I click on has something that needs to be fixed. A bot isn't going to do that. --B (talk) 13:21, 8 June 2010 (UTC)[reply]
- Ok
- File:The Firm 2009 film.jpg - the bot would have ignored this image as it has nonfree copyright tags on it
- File:DSCN5920-w-d-e-close 300x400.JPG - the bot could leave a notice on the uploader's talk page warning of the impending deletion - if the user hasn't removed the deletion tag in seven days, the bot deletes the image?
- File:Wakefields knuckleball.jpg - the commons date clearly states the date is the date when the image was uploaded so I don't think there would be any confusion there, however it would be possible for the bot to determine when there is an inconsistency in the date shown on the description page and the date in the image Metadata
- File:WaterlooLogo.jpg - actually it preserves the source information in the description field, so it is easy to tell that the image is from flickr
Oh, and another point about the first one: the bot would ignore any images that no longer exist on Commons --Chris 10:01, 9 June 2010 (UTC)[reply]
- Support in principle: Is this something a bot should do? Yes. Is this something a bot can do? Maybe. B makes some good points about why bots working in the area should be careful. But as a bot operator myself, I can see that there are possibilities here. I think that really, asking admins to check everything that B suggests is going to lead to no-one bothering to clear the already large backlog. Why don't we trust the people that moved the files? Wakefields never had the true date of its capture in its description. Why put the onus on the admin who has to delete the duplicate image once it's already been transferred to fix that? (Why don't we/Commons instead, say, have a bot that runs on exif data adding it to FileDescs?) Would a deleting admin really bother to add it anyway? WaterlooLogo is and remains attributed to its Flickr author, just not in the author field. In fact, it's attributed better on Commons because Commons is more used to handling Flickr files. So why shouldn't we delete the dupe here to reveal that. And so on. Let's just make it a bloody good bot. - Jarry1250 [Humorous? Discuss.] 13:44, 8 June 2010 (UTC)[reply]
- Compare the upside with the downside. The upside is we have a backlog cleared. While clearing a backlog is an achievement, I suppose, there's really no downside to this particular backlog - these images are all free (supposedly) so there's no copyright issue ... nothing is really deleted anyway so we're not saving disk space ... so the upside is we get the "backlog cleared" feather in the cap. The downside is that we send junk to Commons and potentially hide the correct information such that instead of any person on the planet being able to fix it, and instead of a process that guarantees at least one human in whom the community has expressed confidence will see it, the image with potentially incorrect data is on Commons where it may never be noticed and only en admins can fix it. It's better to correctly edit the description when moving them to Commons to begin with, but unfortunately, we can't control that process, so the next best thing is for a human admin to review and fix it. I don't think it has to be an admin - I think that we could have something analogous to the Commons "trusted user" idea where non-admins who understand what they are looking for and aren't just blindly copying things over can be depended upon to copy the correct data ... but I don't think the bot should blindly be led by the blind. --B (talk) 14:26, 8 June 2010 (UTC)[reply]
- How about allowing any user to review the transfer, fixing things as they see fit, and then tagging the image in some way, giving an adminbot the heads up to delete? Something like {{nowcommonsthis|bot=yes}}? The problem is that only admins can deal with this problem at present, which makes the backlog pile up. We already have Orphaned image deletion bot deleting user-tagged unused media, so why not this? NauticaShades 14:39, 8 June 2010 (UTC)[reply]
- Deleting fair use orphans isn't really a problem because we're not dumping them off on someone else. I think allowing the bot to delete images that have been flagged by a trusted user would be fine. Another possibility would be for the bot to fix problems (missing OTRS, incorrect attribution, obviously incorrect dates, etc) then flag the image. Something else that might take out a lot of the copyright violations is to not delete images uploaded by people who have had over some threshold of their images deleted as copyvios. The uploads of such users obviously need increased scrutiny. --B (talk) 14:46, 8 June 2010 (UTC)[reply]
- The bot should yield to the {{c-uploaded}} tag. These images shouldn't be in the speedy deletion category to begin with, but just in case. Shubinator (talk) 15:15, 8 June 2010 (UTC)[reply]
- I'm opposed to this idea in general. Deletions are something that, in my view, should be manually checked by the administrator deleting them to guarantee that there are no false positives (I know they can be undeleted, but that shouldn't be necessary). I also wish to draw the bot proposer's attention back to the policy in regards to the name that they have chosen; it should be named after the operator or the function that it runs where possible. Peachey88 (Talk Page · Contribs) 09:08, 9 June 2010 (UTC)[reply]
- In an ideal world we would have the numbers to have admins do all deletions manually and assess the situation properly; however, that is not the case. At the moment I think we have around six adminbots approved to do deletions, and I think this is an area that is bot-able and could use the help of an adminbot to clear the backlog. That specific section of policy is poorly worded and intends to dictate common practice. The intention of it is to ensure that a bot account is easily identifiable as a bot account and that it is also easy to identify the specific bot's task. This bot fits the first requirement both in that it has 'robot' in its username, and (as with all my bots) will have the text (adminbot) in its deletion summaries. As for fitting the second requirement, once the bot is approved and we have worked out the specific details of what images the bot will delete and what images it will leave for humans etc., I will update the userpage with these details as well as a link to this BRFA, so it is easy for a passing user to determine the bot's task and whether it is acting within its approved scope. This, I feel, is more informative than a username such as 'CommonsDeletionBot', which would fit the policy's requirements but fail to inform the user of the extent of the bot's scope and function. --Chris 09:48, 9 June 2010 (UTC)[reply]
- I am sufficiently nervous of this proposal that I am presently opposed. Checking for non-free licences and delete tags is nowhere near sufficient for dealing with "license and source status is beyond reasonable doubt". Would the bot check to see if the criteria "to avoid deletion on Commons" are met? The image description on Commons would have to be tested by parsing potentially free text. How can the bot reliably determine "Country where the artistic work represented by the image is situated, or where it was first published" and be sure any countries mentioned are intended to signify the location of the artistic work and not, say, the nationality of the artist? In free text, how does the bot distinguish dates of death, creation, publication etc.? Or, pardon my ignorance, is all this information required to be specified in templates these days? Is it suitable to delete files whose Commons descriptions are not machine-parseable? Will the bot honour the stipulations in Category:Wikipedia_files_with_the_same_name_on_Wikimedia_Commons#Files_not_to_delete, particularly the first stipulation? Warning the uploader is not satisfactory since important images may have been uploaded by users long departed from WP: images are unlike articles. There is at least a possibility the bot will delete all sorts of images inappropriately. Is there a general requirement on bots that a reversion process is quickly and easily available, or does that have to be stipulated for each bot (again, showing my ignorance)? Thincat (talk) 11:24, 9 June 2010 (UTC)[reply]
- This could lead to inappropriate deletions, but human interaction doesn't fully guarantee otherwise either. For such a huge backlog, though, the unambiguous images (I'm not sure how widespread {{information}}'s use currently is, but images using it would be a good start) will probably still be numerous enough for this bot to provide noticeable relief. Admins could then focus on the cases really requiring human intervention. --Waldir talk 11:59, 9 June 2010 (UTC)[reply]
- Perhaps we could try to better restrain the Commons uploading process itself and come up with a way to have one {{ncd}} tag for trusted users who understand image policies and another one for people who have not yet earned that level of trust. A trusted user would need to check over the latter, but the bot could just delete the former. If we could get a handle on people blindly uploading garbage to Commons without copying all of the relevant data or exercising a modicum of discretion over whether it really belongs there, then having a bot delete it would be less likely to be a problem. Until then, having a bot delete the image takes out our only opportunity for a (theoretically) trusted user to look at the image before deleting it. --B (talk) 13:08, 9 June 2010 (UTC)[reply]
- Checking these images is a task which should be done by humans. This bot can't replace humans. multichill (talk) 18:37, 9 June 2010 (UTC)[reply]
Strongly oppose. Images should NOT be deleted until it has been checked that all info has been transferred correctly to Commons. Once the image is deleted, only enwiki admins can check the info. As long as the original is still visible on enwiki, everyone can check the info. I have found many examples where users "copy" the image to Commons and change a PD-self to GFDL or something like that. Therefore admins should NOT delete the image until they have checked that everything is ok. When they have done that they can also click the "Check now!" link on the image on Commons. Right now thousands of images need a check in Commons:Category:Files moved from en.wikipedia to Commons requiring review, but because enwiki admins have deleted the originals we can not check the images. That means that enwiki admins should ALSO check the images in this category.
- I suggest:
- The transfer process is improved. Either a better transfer bot or a function just like "move", so it is possible to move images to Commons without loss of information.
- Admins do NOT delete images before they are checked. That way more users can help check the images. That way info can be corrected on Commons and then it is much easier for the deleting admin to make sure it is ok.
- If enwiki admins cannot manage the job alone then all Commons admins also get admin rights on enwiki so they can help check the images. If you do not like the idea that Commons admins get full access (ability to block users etc.) then invent some "file (un)deletion admin" account. --MGA73 (talk) 18:49, 9 June 2010 (UTC)[reply]
- There is a user access level called researcher that allows users to review deleted file histories. Granting this permission to Commons admins might be a very good idea. Regarding the rest of it, if admins are blindly deleting images without removing them from the various bot-generated categories, that's a problem and should be pointed out to them, but it would obviously only be worse with bots deleting the images. I think some number of admins (maybe a SWAG of 75%) do the appropriate due diligence before deleting an image, but if a bot starts doing it, that drops to zero. --B (talk) 22:02, 9 June 2010 (UTC)[reply]
- This is a very interesting proposition. Besides being able to view deleted revisions of images, it would also be good that commons admins could actually delete images here. Probably a new user group would have to be created for that, but I think it's worth it, as that would help a great deal in clearing the backlog. What do you guys think? --Waldir talk 16:33, 14 June 2010 (UTC)[reply]
- Actually it would be quite easy for the bot to work around this problem. First of all, the bot can ensure that the license tag(s) on enwiki match the tag(s) on the Commons image; if they do not match, the bot ignores the image (or it could list it on a page for human review, or whatever). Secondly, before the bot deletes the image on enwiki, it could copy the text of the enwiki description page over to Commons (placing it in a new section on the image's talk page or something), thus allowing Commons admins to verify the transfer (this would help solve the problem of enwiki admins deleting the images before Commons admins can verify them, although it would also mean the bot would have to be approved to run on Commons - but since it won't need any admin rights there, that shouldn't be too hard). --Chris 10:07, 10 June 2010 (UTC)[reply]
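For illustration, the license-matching and description-archiving check described here could be sketched roughly as follows. This is only a sketch against the modern pywikibot library, not the bot's actual code; the helper names, the decision to compare all transcluded templates rather than license tags specifically, and the talk-page section heading are assumptions.
<syntaxhighlight lang="python">
# Sketch only: compare the templates on the local and Commons description
# pages and, if they match, archive the local description on the Commons
# talk page before any local deletion is considered.
import pywikibot

enwiki = pywikibot.Site('en', 'wikipedia')
commons = pywikibot.Site('commons', 'commons')

def template_names(page):
    """All templates transcluded on a description page; a real bot would
    filter this down to license templates only."""
    return {tpl.title(with_ns=False) for tpl, _ in page.templatesWithParams()}

def ready_for_deletion(title):
    """Return True only if the tags match and the local text has been archived."""
    local = pywikibot.FilePage(enwiki, title)
    remote = pywikibot.FilePage(commons, title)
    if template_names(local) != template_names(remote):
        return False  # mismatch: leave the image for human review
    talk = remote.toggleTalkPage()
    talk.text += '\n\n== Description copied from en.wikipedia ==\n' + local.text
    talk.save(summary='Archiving en.wikipedia description before local deletion')
    return True
</syntaxhighlight>
Called as ready_for_deletion('File:Example.jpg'), such a check would skip any mismatching file and archive the local description otherwise.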
- Would it be workable for the bot to simply move the entire image description page over to Commons, just to ensure information does not get lost? (Perhaps in a separate section.) Ucucha 16:46, 11 June 2010 (UTC)[reply]
- How about this? The bot checks the Commons image history for {{BotMoveToCommons}}, and checks for a diff of it being removed by a Commons user. This ensures that the information transfer has been verified. NauticaShades 16:56, 13 June 2010 (UTC)[reply]
- If the bot checks that {{BotMoveToCommons}} has been removed by a Commons user, and perhaps also that the license is the same, then the risk of errors should be reduced to an acceptable level. Perhaps we should make a test to see which images a bot like that would delete? Then we have something to check for any errors.
- Sadly that only works with the "new" deletions. We still have thousands of "old" deletions, so if it is possible to give Commons admins access to deleted versions it would be nice. --MGA73 (talk) 18:24, 13 June 2010 (UTC)[reply]
- I commented on this idea above, where you first proposed it. --Waldir talk 16:33, 14 June 2010 (UTC)[reply]
- I'm not sure I understand you, MGA. Would you mind explaining? NauticaShades 18:44, 15 June 2010 (UTC)[reply]
- I guess MGA73 is talking about deleted image review. multichill (talk) 20:05, 15 June 2010 (UTC)[reply]
- Doesn't X! run the adminbot EyeEightDestroyerBot to do exactly the same thing? (apologies if I'm not allowed to edit here) Set Sail For The Seven Seas 291° 10' 30" NET 19:24, 23 June 2010 (UTC)[reply]
- No, that's only for files tagged with {{PBB Image citation}}, but yeah, probably the code could be reused, or the bot be approved for a second, broader, task. But I still think giving commons admins file deletion permissions on enwiki would greatly help on the cases where a bot couldn't unambiguously decide. --Waldir talk 06:39, 26 June 2010 (UTC)[reply]
- Strong oppose. Images need to be checked by a human for conformity with CSD#F8. Very often images are tagged with free licenses that are actually copyrighted (e.g. nearly every modern American sculpture photograph on Wikipedia). These images need to have their tags changed to fair use tags instead. If this bot is run, the images will simply be moved to Commons where they will be summarily deleted. Kaldari (talk) 01:17, 20 July 2010 (UTC)[reply]
- What do you think of #Another approach (no deletions, just tagging)? multichill (talk) 04:33, 20 July 2010 (UTC)[reply]
Test
- {{BAG assistance needed}}Question? Would it be possible to get a test? I think we should give it a try. --MGA73 (talk) 19:11, 27 June 2010 (UTC)[reply]
/Log gives a basic idea of what would be deleted and what would be left (note: the bot also does other checks, such as making sure both the Commons and enwiki images have the same hash, i.e. are the same image; they aren't shown in the log because they were never triggered) --Chris 09:34, 11 July 2010 (UTC)[reply]
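As a side note, the "same hash" check mentioned in the parenthesis can be expressed very compactly. A minimal sketch with pywikibot (not the bot's own code):
<syntaxhighlight lang="python">
# Minimal sketch: verify that the local file and the Commons file are
# byte-identical by comparing the SHA-1 reported by the API.
import pywikibot

def same_file(title):
    local = pywikibot.FilePage(pywikibot.Site('en', 'wikipedia'), title)
    remote = pywikibot.FilePage(pywikibot.Site('commons', 'commons'), title)
    return local.latest_file_info.sha1 == remote.latest_file_info.sha1
</syntaxhighlight>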
- I checked the first images and so far it looked good. Of course some images still look a little bit messy on Commons and could need a cleanup. I would prefer that the bot skipped images if the version on Commons still has a "bot check needed" tag like [4]. This image [5] also had a {{Cc-by-sa-3.0-migrated}} and therefore had both GFDL with and without disclaimers. Such images should also be skipped to start with. We can always see how many are left after the bot has deleted all the others (if it gets approved). However, it is too late now so I have not checked all images - I would like to check some more before I say "Let's roll". --MGA73 (talk) 22:34, 11 July 2010 (UTC)[reply]
I checked some more images and have these comments:
- I do not think we should pass files with “PD-self” on Commons unless the user is the same.
- Files should not be passed if they are nominated for deletion on Commons (example Commons: File:Tyumen Wooden Buildings 01.JPG)
- Commons: File:Trimerliamide.jpg is cc-by-sa-3.0 but enwiki is both cc-by-sa-3.0 and GFDL. File:Ranchi.jpg is PD on enwiki but PD+GFDL+CC-BY on Commons. Files should only be passed if the licenses are the same.
- File:Rephotography-crocker.jpg is from Flickr. We should be careful with these (it is PD on enwiki but All rights reserved on Flickr). Perhaps it is best to skip files from Flickr.
- File:Pocketp.gif is from the web – not sure we can check that by bot except if we skip all images containing www or http.
I do like the idea and would hate to see this project dropped because it is too hard to make a bot that can check all possible issues. But I would also not like a bot to delete too many images that should not be deleted. Perhaps we could start this project in a few steps:
We create a template for the bot to add on passed files. The template should say something like this:
- Attention! This file has been reviewed by “Hamlet Prince of Robots” and the bot found this image to be transferred correctly to Commons. This image can therefore probably be deleted per F8. However, this bot is still in test so admins should NOT delete this image without a review [link to relevant place] and any problems should be reported to [link to special page or this bot request].
Template should put images in Category:Wikipedia files reviewed on Wikimedia Commons by Hamlet Prince of Robots and the category should have a warning telling admins to check before they delete (something like the template).
Then we could make the bot do a run on 100 files, and if that goes as planned we could run it on 500 files, and if that also works well then we could either run on the rest this way or allow the bot to delete if it works with almost no errors.
Even if the bot does not delete images I think this is a good help. It makes it easier to spot files that are expected to be easy to check. After that we could consider letting the bot review files with one of the "errors" on the list, for example if the file is from Flickr. That way we get a lot of files with a similar problem to check. --MGA73 (talk) 12:09, 12 July 2010 (UTC)[reply]
Those comments are valid and I am working on incorporating extra checks into the bot to deal with them. I would rather not have to run the bot through another BRFA sometime down the track to get it approved to delete, so I would like to suggest that we run an extended trial (somewhere between a few weeks and a month or two), during which the bot tags images with a template like you suggested. Afterwards I think we will be able to better determine the bot's effectiveness and decide whether it should be approved only for tagging, or if it is safe enough to delete images. --Chris 11:52, 16 July 2010 (UTC)[reply]
- If a new test works just fine I would be happy to let the bot delete the images but I think that a bot need more than just my vote ;-) --MGA73 (talk) 17:07, 16 July 2010 (UTC)[reply]
Another approach
Loop over all images marked with {{NowCommons}}
- If it's reviewed, skip this image
- Look if the file is available at Commons
- If not, remove {{NowCommons}}
- If it is, check if BotMoveToCommons is still on it
- If it is, skip this image
- If it's not, search the history to find out which user checked the image (=removed the template)
- If you can't find it, skip the image
- If you can find it, check if the removing user has a SUL account with an active account on the English Wikipedia
- If that's not the case, skip the image
- If that's the case, mark the image as reviewed by that user. You probably want to set the bot field too
This will dig up a lot of images. It will also make it easier for users: if I review a file at Commons, I know the bot will catch up and mark the image on enwiki as reviewed. It also makes it easier to hunt down sloppy reviewers. This should be pretty straightforward to implement (at least in pywikipedia it is). multichill (talk) 11:23, 17 July 2010 (UTC)[reply]
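For readers following along, the loop above translates fairly directly into code. The sketch below uses the modern pywikibot API rather than the pywikipedia framework mentioned here; the category and template names are taken from this discussion, while the "reviewer=" marker and everything else are illustrative assumptions, not an actual implementation.
<syntaxhighlight lang="python">
# Sketch of the tagging loop described above (no deletions, just marking
# local file pages as reviewed once a Commons user has checked the transfer).
import pywikibot

enwiki = pywikibot.Site('en', 'wikipedia')
commons = pywikibot.Site('commons', 'commons')

def remover_of_move_template(commons_file):
    """Return the user who removed {{BotMoveToCommons}}, or None if it is
    still present or was never there."""
    previously_present = False
    for rev in commons_file.revisions(reverse=True, content=True):
        present = 'BotMoveToCommons' in (rev['text'] or '')
        if previously_present and not present:
            return rev['user']
        previously_present = present
    return None

nowcommons = pywikibot.Category(
    enwiki, 'Category:Wikipedia files with the same name on Wikimedia Commons')
for local in nowcommons.articles(namespaces=6):
    if 'reviewer=' in local.text:          # assumption: how "reviewed" is recorded
        continue
    remote = pywikibot.FilePage(commons, local.title())
    if not remote.exists():
        continue                           # not on Commons: {{NowCommons}} is wrong
    if 'BotMoveToCommons' in remote.text:
        continue                           # transfer not yet checked on Commons
    reviewer = remover_of_move_template(remote)
    if reviewer is None:
        continue
    # A real bot would now confirm the reviewer has a unified (SUL) account
    # active on the English Wikipedia, then mark the local page as reviewed
    # by that user (setting the bot field as suggested above).
    print(local.title(), 'reviewed on Commons by', reviewer)
</syntaxhighlight>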
- That should also work if deleting admins on en-wiki remember to check images before they delete, to see if "the reviewer" knows how to do it right. Not all reviewers have the same quality of work, so batch deleting is a bad idea unless we are pretty sure that the reviewer is doing good work.
- We also have to be sure only to delete images with the same name, or unused images. If the image has a different name and is in use, then the usage should be replaced before deletion. --MGA73 (talk) 19:01, 17 July 2010 (UTC)[reply]
Withdrawn by operator. Sorry, I've been neglecting this way too much. I thought I would have time to work on this, but I don't. --Chris 04:46, 7 August 2010 (UTC)[reply]
- The above discussion is preserved as an archive of the debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA.
Requests to add a task to an already-approved bot
- The following discussion is an archived debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA. The result of the discussion was Approved.
Operator: Magioladitis
Automatic or Manually assisted: Automatic, supervised
Programming language(s): AWB
Source code available: AWB is open source.
Function overview: WP:CHECKWIKI error fixing
Links to relevant discussions (where appropriate): WP:CHECKWIKI
Edit period(s): Weekly
Estimated number of pages affected: Some hundreds per week
Exclusion compliant (Y/N): Y
Already has a bot flag (Y/N): Y
Function details: Yobot will run weekly on articles generated by the CHECKWIKI toolserver (http://toolserver.org/~sk/cw/enwiki/index.htm) to do AWB general fixes. This will enable auto-fixing of at least 29 WP:CHECKWIKI errors. More details on which errors can be fixed automatically can be found in User:Magioladitis/AWB_and_CHECKWIKI. Autotagging will be done at the same time.
Recall that User:FrescoBot fixes 1 or 2 CheckWiki errors too and that Yobot has been approved to do general fixes + autotagging on other occasions (BRFA 11, BRFA 13, etc.)
AWB will run with "Skip if no changes", "Skip if only whitespace", "Skip if only casing is changed", etc. activated. -- Magioladitis (talk) 13:42, 6 July 2010 (UTC)[reply]
Discussion
{{BAGAssistanceNeeded}} Magioladitis (talk) 15:07, 13 July 2010 (UTC)[reply]
Which Checkwiki functions/fixes will be it processing? Peachey88 (Talk Page · Contribs) 09:53, 14 July 2010 (UTC)[reply]
- Check list in User:Magioladitis/AWB_and_CHECKWIKI which I mentioned in the description. All the ones with green and yellow. Errors: 2, 6, 9, 17, 18, 20, 22, 25, 26, 37-39, 44-46, 48, 51-55, 57, 61, 63-66 (64 is done by FrescoBot too), 76, 77, 86, 88-91. -- Magioladitis (talk) 14:47, 14 July 2010 (UTC)[reply]
- How it will work: load all lists of pages that are reported as having the specific issues and run genfixes+autotagging with the appropriate skip options. -- Magioladitis (talk) 15:57, 19 July 2010 (UTC)[reply]
- Won't this mean that, at times, you will be making edits contrary to WP:AWB#Rules of use (i.e. insignificant changes)? –xenotalk 14:42, 20 July 2010 (UTC)[reply]
- "Skip if only minor genfixes" and "skip if only whitespace changed" will be activated. I don't think there will be any insignificant changes. We can make some test edits to see and we discuss it again. -- Magioladitis (talk) 14:47, 20 July 2010 (UTC)[reply]
- Approved for trial (50 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. –xenotalk 14:54, 20 July 2010 (UTC)[reply]
Trial complete. I loaded pages with error 53 (Interwiki before last category). I don't have time to give diffs right now; I'll do it later. Please note: moving interwikis is a minor genfix for AWB, so I had to deactivate the "Skip if only minor fixes" option for this one. -- Magioladitis (talk) 15:07, 20 July 2010 (UTC)[reply]
Diffs of the first 12 edits: [6], [7], [8], [9], [10], [11], [12], [13], [14], [15], [16], [17]. For more diffs check edits of User:Yobot from 14:57, 20 July 2010 until 15:03, 20 July 2010. All seem fine. -- Magioladitis (talk) 18:14, 20 July 2010 (UTC)[reply]
- Excepting #11, they appear to be fairly insignificant changes that made no change to the rendered text... ? Here is a list of the 50 contribs built using offset [18]. –xenotalk 12:43, 21 July 2010 (UTC)[reply]
- Since they are in the CHECKWIKI list someone will/has to fix them. Some people do this manually and this is slow. Right now it's a big backlog to this particular error. -- Magioladitis (talk) 12:50, 21 July 2010 (UTC)[reply]
- Will defer to my learned colleagues on this, but personally I think that edits like this that just move around the bottom business should really be done only in conjunction with more necessary edits (which is why it's included in AWB as a general fix). –xenotalk 13:10, 21 July 2010 (UTC)[reply]
- We could limit ourselves to certain errors then? -- Magioladitis (talk) 13:13, 21 July 2010 (UTC)[reply]
- Yes, my thoughts are that only the critical and honest-to-goodness problems should be botted. Of course, if the other minor fixes can be done while those "more important fixes" are done, that's fine - but just moving around undisplayed elements doesn't strike me as a good use of resources. –xenotalk 19:16, 21 July 2010 (UTC)[reply]
- For example, I think we agree that all DEFAULTSORT errors (errors 6 and 88-91) should be fixed. I am waiting for suggestions for more. I used to spent my Saturdays fixing CHECKWIKI errors semi-manually so I would prefer if a bot was doing it for me. -- Magioladitis (talk) 21:08, 21 July 2010 (UTC)[reply]
- I think it's important to reduce the number of articles with all these problems so new users don't inherit bad habits. -- Magioladitis (talk) 12:31, 22 July 2010 (UTC)[reply]
- I have some sympathy with that view, however the reality is that for a number of reasons (some more valid, some less so) changes below a certain threshold are not acceptable en masse. What would be useful, however, would be to find a way to monitor a given error; presumably once it is in GFs or being fixed relatively widely it will show exponential decay plus a smattering of new cases, and then at some point, maybe 20%, maybe 5%, the remaining cases could be swept up. In many instances this fix could then be removed (when we are supporting a deprecated way of doing things, for example). Rich Farmbrough, 12:46, 22 July 2010 (UTC).[reply]
- They are monitored a bit. All the errors above are fixed by GFs. Checking http://toolserver.org/~sk/cw/index.htm you can see if they go up or down. Especially the minor errors go up because they remain unmaintained by bots. At least we could do the following:
- Regular runs for all defaultsort related errors (errors 6 and 88-91) and some more (for example error 17 -cat dupe has 17,026 pages backlog!)
- One one-off run for the rest to reduce the backlog
- Two more errors are maintained already by FrescoBot (wikilink equal to text and spacing)
- -- Magioladitis (talk) 12:57, 22 July 2010 (UTC)[reply]
{{BAGAssistanceNeeded}} Magioladitis (talk) 15:44, 29 July 2010 (UTC)[reply]
I don't understand why this can't be done during normal editing with AWB. Making separate changes for pure wikitext fixes that change nothing about the rendering is unnecessary. --MZMcBride (talk) 04:12, 2 August 2010 (UTC)[reply]
- ListasBot for example fixes |listas= in WPBiography. We could run Yobot to fix errors in DEFAULTSORT like wrong capitalisation, special characters, extra blanks, etc. in order to have pages properly ordered inside categories. Moreover, we have a backlog. That's the main problem. For example, wrong placement of punctuation in relation to references has a backlog of some thousands of pages, and this changes the rendering. -- Magioladitis (talk) 07:24, 2 August 2010 (UTC)[reply]
Here's a table of the possible fixes and how they affect rendering. Feel free to update/correct.
Error | Description | AWB | Changes Rendering |
2 | Article with <br\> or <\br> or <br.> | Yes | Yes |
3 | Article with <ref> and no <references /> | Partial | Yes |
6 | DEFAULTSORT with special letters | Yes | Yes |
7 | Headlines start with three "=" | Partial | Yes |
9 | Categories more at one line | Yes | No |
10 | Square brackets not correct end | Partial | Yes |
17 | Category double | Yes | No |
18 | Category first letter small | Yes | No |
20 | The article had a &dagger; and not † | Yes | No
22 | Category with space | Yes | No |
25 | Headline hierarchy | Partial | Yes |
26 | HTML text style element <b> | Yes | No |
37 | Title with special letters and no DEFAULTSORT | Partial | Hm... |
38 | HTML text style element <i> | Yes | No |
39 | HTML text style element <p> | Partial | No |
44 | Headlines with bold | Partial | No |
45 | Interwiki double | Yes | No |
46 | Square brackets not correct begin | Partial | Yes |
48 | Title in text | Yes | Yes |
51 | Interwiki before last headline | Yes | No |
52 | Category before last headline | Yes | No |
53 | Interwiki before last category | Yes | No |
54 | Break in list | Yes | Yes |
55 | HTML text style element <small> double | Yes | Hm... |
57 | Headlines end with colon | Yes | Yes |
61 | Reference with punctuation | Yes | Yes |
63 | HTML text style element <small> in ref, sub or sup | Yes | Yes |
64 | Link equal to linktext | Yes | No |
65 | Image description with break | Yes | No |
66 | Image description with full <small> | Yes | Yes |
76 | Link with no space | Yes | Yes |
77 | Image description with partial <small> | Yes | Yes |
78 | Reference double | Partial | Yes |
80 | External link with line break | Partial | Yes |
81 | Reference tag in article double | Partial | Yes |
86 | Link with two brackets to external source | Yes | Yes |
88 | DEFAULTSORT with blank at first position | Yes | No |
90 | DEFAULTSORT with lowercase letters | Yes | No |
Yellow marks those that are partially done by AWB, i.e. in almost all cases; green marks those done by AWB in all cases. By "Hm..." I mean that it doesn't change the appearance of the page, but it does change the page's placement in categories.
-- Magioladitis (talk) 08:58, 2 August 2010 (UTC)[reply]
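To make one of the "changes rendering" rows concrete, error 2 (malformed <br> variants) amounts to a substitution along the following lines. This is only an illustrative regex, not AWB's actual general-fix code, and it also normalises well-formed <br> tags, which is harmless.
<syntaxhighlight lang="python">
import re

# Matches malformed variants such as <br\>, <\br> and <br.> (and plain <br>).
BAD_BR = re.compile(r'<\s*[\\/.]?\s*br\s*[\\/.]?\s*>', re.IGNORECASE)

def fix_error_2(wikitext):
    return BAD_BR.sub('<br />', wikitext)

print(fix_error_2(r'one<br\>two<\br>three<br.>four'))
# one<br />two<br />three<br />four
</syntaxhighlight>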
38 errors in total. I could get approval for those which change rendering, for example. -- Magioladitis (talk) 11:45, 5 August 2010 (UTC)[reply]
{{BAGAssistanceNeeded}} Magioladitis (talk) 12:44, 6 August 2010 (UTC)[reply]
- I'm leaning towards approving this in the next day or two. MBisanz talk 04:51, 10 August 2010 (UTC)[reply]
- Approved. MBisanz talk 04:27, 13 August 2010 (UTC)[reply]
- The above discussion is preserved as an archive of the debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA.
- The following discussion is an archived debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA. The result of the discussion was Approved.
Operator: Basilicofresco (talk · contribs)
Automatic or Manually assisted: auto (after a period of testing)
Programming language(s): python
Source code available: not yet (pywikipedia + custom script)
Function overview: analyzes selected articles, checks for a matching target on Commons and then adds {{commons}} or {{commons cat}}.
Links to relevant discussions (where appropriate):
Edit period(s): few times per year or less
Estimated number of pages affected: few thousands?
Exclusion compliant (Y/N): Y
Already has a bot flag (Y/N): Y
Function details: this task aims to add to the article a link to Commons when unambiguous related media content is found (a sketch of the matching logic appears after the list below).
- It will start from an offline-generated list of selected articles with these characteristics:
- round brackets in the article name, eg. Alcobaça (Portugal);
- "External links" section (I plan to improve in the future the ability of the script to place the template in the right place even without "External links" section);
- article without any Commons template (exclusion regex: ([Cc]ommons|[Pp]ic|[Cc]ommonspar|[Cc]ommonspiped|[Cc]ommonsme|[Ss]isterlinkswp|[Ww]ikicommons|[Cc]ommonstiny|[Cc]ommons-gallery|[Gg]allery-link|[Cc]ommonsimages|[Ww]ikimedia[ _]Commons|[Cc]ommons-inline|[Ww]ikicommons-inline|[Cc]ommons[ _]category|[Cc]ommons[ _]cat|[Cc]ommonscat-inline|[Cc]ommons[ _]cat[ _]left|[Cc]ommons2|[Cc]ommonsCat|[Cc]ommoncat|[Cc]ms-catlist-up|[Cc]atlst[ _]commons|[Cc]ommonscategory|[Cc]ommonscat|[Cc]ommonsimages[ _]cat|[Cc]ommons[ _]cat4|[Cc]ommonscat[ _]left|[Cc]ommons[ _]and[ _]category|[Cc]ommons[ _]and[ _]cat)).
- checks Commons for a matching gallery or category with:
- same name (eg. "Alcobaça (Portugal)" --> does Commons:Alcobaça (Portugal) exist?)
- same name adding "category" (eg. Alcobaça (Portugal) --> does Commons:Category:Alcobaça (Portugal) exist?)
- same name after removing brackets (eg. Lynx (web browser) --> does Commons:Lynx web browser exist?)
- same name after removing brackets and adding category (eg. Lynx (web browser) --> does Commons:Category:Lynx web browser exist?)
- same name after replacing brackets with a comma (eg. Haren (Groningen) --> does Commons:Haren, Groningen exist?)
- same name after replacing brackets with a comma and adding category (eg. Haren (Groningen) --> does Commons:Category:Haren, Groningen exist?)
- if a redirect is found on Commons, then it takes the redirect destination
- adds the right template in the right place (eg. {{commons|Alcobaça (Portugal)}} or {{commons cat|Alcobaça (Portugal)}} at the top of the External links section)
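A sketch of the matching logic above, trying the candidates in the order listed. This is written with pywikibot purely for illustration (the task itself uses pywikipedia plus a custom script), and the helper names are assumptions.
<syntaxhighlight lang="python">
import re
import pywikibot

commons = pywikibot.Site('commons', 'commons')

def candidate_targets(title):
    """Yield candidate Commons titles in the order described above."""
    no_brackets = title.replace('(', '').replace(')', '')     # Lynx web browser
    comma = re.sub(r'\s*\((.*?)\)', r', \1', title)            # Haren, Groningen
    for name in (title, no_brackets, comma):
        yield name                    # gallery page
        yield 'Category:' + name      # category

def best_commons_match(title):
    for name in candidate_targets(title):
        page = pywikibot.Page(commons, name)
        if page.exists():
            if page.isRedirectPage():
                page = page.getRedirectTarget()   # follow Commons redirects
            return page
    return None
</syntaxhighlight>
The real task would then insert {{commons|...}} or {{commons cat|...}} at the top of the "External links" section, depending on whether the final target is a gallery or a category.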
Discussion
{{BAGAssistanceNeeded}} In the last 10 days I have not seen any questions. Can I start a test run? -- Basilicofresco (msg) 13:20, 15 July 2010 (UTC)[reply]
- Seems straightforward. It might be more straightforward to check for the presence of commons templates using the API's prop=templates than a regex, as then you don't have to worry about capitalization, space versus underscore, new redirects, and the like. Anomie⚔ 16:15, 15 July 2010 (UTC)[reply]
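The suggestion here — asking the API which templates a page transcludes instead of regex-matching the wikitext — would look roughly like the following with pywikibot. The short template list is only an illustrative subset of the regex given in the function details, not an exhaustive one.
<syntaxhighlight lang="python">
import pywikibot

site = pywikibot.Site('en', 'wikipedia')
COMMONS_TEMPLATES = {'Commons', 'Commons category', 'Commons cat',
                     'Sister project links'}   # illustrative subset only

def has_commons_template(title):
    page = pywikibot.Page(site, title)
    transcluded = {tpl.title(with_ns=False) for tpl in page.templates()}
    return bool(transcluded & COMMONS_TEMPLATES)
</syntaxhighlight>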
- Seems like most of this (except for the external links section bit) can be done with a toolserver database query. I'm not sure if you have a toolserver account, but you can always ask at WP:DBR for some help. DB queries are much faster and, in my opinion, easier than using the MediaWiki API. Tim1357 talk 23:17, 15 July 2010 (UTC)[reply]
Thank you for your suggestions. Well, the query would create the list much faster... but I'm (still) not used to SQL and, in order to avoid mistakes, I would prefer to keep strict control over every step of the task. I'm going to start from a dump-generated list of pre-selected articles (step 1) and this will greatly speed up the whole process. -- Basilicofresco (msg) 07:54, 18 July 2010 (UTC)[reply]
- I'm leaving tomorrow for a trip, so I will not able to run any script until second half of August. See you! -- Basilicofresco (msg) 07:58, 21 July 2010 (UTC)[reply]
{{BAGAssistanceNeeded}} I'm back. I will run the script on my home computer so the efficiency of the list-creator script is not critical and most of all does not affect Wikimedia servers. -- Basilicofresco (msg) 14:30, 18 August 2010 (UTC)[reply]
- It all looks good, Basilicofresco, but I'd like to see some community discussion about a bot adding these templates. Spam a few talk pages explaining what you hope to do. Tim1357 talk 00:49, 19 August 2010 (UTC)[reply]
- Ok, done! -- Basilicofresco (msg) 11:11, 20 August 2010 (UTC)[reply]
- Could you link the discussions? –xenotalk 14:51, 3 September 2010 (UTC)[reply]
- Sure: Wikipedia talk:WikiProject Images and Media/Commons#FrescoBot 6, Template talk:Commons#FrescoBot 6, Template talk:Commons category#FrescoBot 6. No replies. If you feel I missed the appropriate talk page, feel free to start there the discussion. -- Basilicofresco (msg) 10:11, 6 September 2010 (UTC)[reply]
- Note posted @ VPR (Wikipedia:Village pump (proposals)#Proposed bot to add "Commons" and "Commons category" templates to articles). –xenotalk 13:25, 20 September 2010 (UTC)[reply]
- It seems no one cares ... Approved for trial (25 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. Mr.Z-man 19:00, 2 October 2010 (UTC)[reply]
Trial complete. This morning I wrote and tested the script. I fixed the 1st and 2nd edits, which were off due to a stupid typo. No problems on subsequent edits. As you can see, if a redirect is found on Commons, the bot follows it and then analyzes the target. -- Basilicofresco (msg) 10:42, 10 October 2010 (UTC)[reply]
I have no hurry, however after 4 months... ;) {{BAGAssistanceNeeded}} -- Basilicofresco (msg) 21:10, 19 October 2010 (UTC)[reply]
November has arrived and I have not had a single complaint about this task. If you are still doubtful, the best thing to do is to approve a 500-edit trial and wait for any reaction. -- Basilicofresco (msg) 23:11, 1 November 2010 (UTC)[reply]
- For the record, the edits are here. I noticed 14 cases where your bot linked to a category when a page or redirect to a page exists on Commons, for example this edit linked to Commons:Category:Asparagus rather than Commons:Asparagus (from Commons:Asparagus (genus)). In fact, in that particular example how did it find Commons:Category:Asparagus at all?
- I also see the edit to Georgia (U.S. state) was removed without explanation, although probably because the article had {{Sister project links}}. It may be worth checking for that template too. Anomie⚔ 23:18, 10 November 2010 (UTC)[reply]
First of all, thank you for your attention.
- Categories vs. galleries: well, IMHO the link to the category is almost always a better choice than the gallery page. Gallery pages are usually poorly maintained, there are often just a few images, and the gallery itself rarely adds any real value. Categories are easier to maintain and to scale up (adding sub-categories). Moreover, well-written and well-maintained gallery pages are usually already linked from en.wiki... so I suggest preferring categories over galleries (if both are available).
- Commons vs. Sister project links: you are right, probably Tpbradbury removed the link to Commons due to the {{Sister project links}}. However, it should be noted that {{Sister project links}} simply "provides links to the 'Search' page on the various Wikimedia sister projects". That means that it does not guarantee that any related content actually exists; it is just a (blind) guess. {{Commons}} and {{Commons cat}} instead state that Wikimedia Commons actually has media related to the subject and provide a link to it. This is precious information.
Basilicofresco (msg) 20:23, 11 November 2010 (UTC)[reply]
- I only asked about the gallery versus category because your function details list checking for galleries first. As for the other, that sounds like a discussion that should be started somewhere else. Anomie⚔ 03:14, 12 November 2010 (UTC)[reply]
- I started Template talk:Commons category#Commons / Commons cat vs. Sister project links. -- Basilicofresco (msg) 00:15, 16 November 2010 (UTC)[reply]
Refined proposal
The "Categories vs. galleries" issue can be resolved using {{Commons and category}} (I almost forgot about it). So, here is the proposal:
- If a related category or page can be found on Commons (see Function details above), the bot adds the right template at the top of the External links section.
- If both a category and a page (gallery) exist on Commons, then {{Commons and category}} should always be preferred over {{Commons}}, because gallery pages are usually poorly maintained, there are often just a few images, and the gallery itself rarely adds any real value. Categories are easier to maintain and to scale up (adding sub-categories). Moreover, well-written and well-maintained gallery pages are usually already linked from en.wiki.
- The presence of {{Sister project links}} should not affect the insertion of {{Commons cat}} or {{Commons}}, because {{Sister project links}} simply "provides links to the 'Search' page on the various Wikimedia sister projects". That means that it does not guarantee that any related content actually exists; it is just a (blind) guess. {{Commons}} and {{Commons cat}} instead state that Wikimedia Commons actually has media related to the subject and provide a link to it. This is precious information. It is the difference between the search function and a link.
If this proposal sounds reasonable, please write below: "uhm... sounds reasonable" and sign. ;) Thanks. -- Basilicofresco (msg) 08:45, 21 November 2010 (UTC)[reply]
- Works for me. BTW, you may want to drop a note on Template talk:Sister project links since your post at Template talk:Commons category doesn't seem to be drawing any response. Anomie⚔ 15:08, 21 November 2010 (UTC)[reply]
Approved. WP:SILENCE seems to apply to the discussions regarding {{Sister project links}} vs {{Commons cat}}. Anomie⚔ 02:38, 1 December 2010 (UTC)[reply]
- The above discussion is preserved as an archive of the debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA.
- The following discussion is an archived debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA. The result of the discussion was Approved.
EarwigBot 18
Operator: The Earwig (talk · contribs)
Automatic or Manually assisted: Automatic, unsupervised
Programming language(s): Python
Source code available: from tools:~earwig/earwigbot: feeddailypagebot_run.py
Function overview: The bot creates daily pages for Requests for feedback.
Links to relevant discussions (where appropriate): Wikipedia talk:Requests for feedback#revamp - things needed
Edit period(s): Daily at 0:00 UTC, via cron
Estimated number of pages affected: One per day
Exclusion compliant (Y/N): Y
Already has a bot flag (Y/N): Y
Function details: As stated in the function overview; creates daily pages for Requests for feedback, like Wikipedia:Requests for feedback/2010 July 2. It will run daily at midnight UTC, and will create pages for today (at that time, today has just started), tomorrow, two days hence, and three days hence. Unless it is stopped, it should only be creating the last one of those pages, as the others will have, by that time, been created during previous days. It creates the pages with the code <noinclude>{{Wikipedia:Requests for feedback/navday|prev=%s|next=%s}}</noinclude>, where the first %s is the previous day's date, and the second is the next's. — The Earwig (talk) 17:47, 1 July 2010 (UTC)[reply]
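For readers unfamiliar with such cron tasks, the job boils down to something like the following. This is a sketch with pywikibot, not the bot's actual source (which is linked above); the date handling simply mirrors the page-name format Wikipedia:Requests for feedback/2010 July 2.
<syntaxhighlight lang="python">
import datetime
import pywikibot

site = pywikibot.Site('en', 'wikipedia')
PREFIX = 'Wikipedia:Requests for feedback/'

def day_name(date):
    return '%d %s %d' % (date.year, date.strftime('%B'), date.day)  # "2010 July 2"

today = datetime.datetime.utcnow().date()        # the real bot runs at 00:00 UTC
for offset in range(4):                          # today, +1, +2 and +3 days
    date = today + datetime.timedelta(days=offset)
    page = pywikibot.Page(site, PREFIX + day_name(date))
    if page.exists():
        continue                                 # normally only the last page is new
    page.text = ('<noinclude>{{Wikipedia:Requests for feedback/navday'
                 '|prev=%s|next=%s}}</noinclude>'
                 % (day_name(date - datetime.timedelta(days=1)),
                    day_name(date + datetime.timedelta(days=1))))
    page.save(summary='Creating daily page for Requests for feedback')
</syntaxhighlight>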
Discussion
- Sounds great.
- I, personally, made pages for June when we implemented the new transcluded page-per-day system, following the discussions on the WP:FEED talk page. A few days ago, I created pages for all of July and August, until we could get a bot to do it for us.
- So - yes please, many thanks. Chzz ► 10:25, 4 July 2010 (UTC)[reply]
Code looks good, although I personally would pass just the one timestamp to process() and let that function do the time.gmtime() calls itself. And I would replace date() with just time.strftime("%Y %B %d", t).
This is so straightforward that I can't see how it could malfunction, but for good measure Approved for trial (2 days). Please provide a link to the relevant contributions and/or diffs when the trial is complete. Since Chzz has already created the real pages through the end of August, please adjust your prefix for the trial to point somewhere in your or your bot's userspace, so we can see the actual page creations. Anomie⚔ 20:14, 9 July 2010 (UTC)[reply]
- Thanks, I'll get started on this soon, maybe tomorrow morning as it's kinda late right now. — The Earwig (talk) 03:29, 10 July 2010 (UTC)[reply]
- Sorry for the delay. I've set the prefix to User:EarwigBot/FEEDTrial; cron configured and it'll start tonight at midnight UTC. — The Earwig (talk) 22:42, 11 July 2010 (UTC)[reply]
- Trial complete. See User:EarwigBot/FEEDTrial for the pages it created. — The Earwig (talk) 14:58, 13 July 2010 (UTC)[reply]
- The above discussion is preserved as an archive of the debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA.
- The following discussion is an archived debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA. The result of the discussion was Approved.
Operator: Svick (talk · contribs)
Automatic or Manually assisted: Automatic
Programming language(s): C#
Source code available: [19]
Function overview: Update Wikipedia:Dusty articles
Links to relevant discussions (where appropriate):
Edit period(s): Probably once a week.
Estimated number of pages affected: 2: Wikipedia:Dusty articles/List and Wikipedia:Dusty articles/Updated
Exclusion compliant (Y/N): No
Already has a bot flag (Y/N): No
Function details:
The list at Wikipedia:Dusty articles was updated by User:DustyBot until February, when it stopped working. This bot will perform the same function – it creates a list of the articles that have gone the longest without being edited (excluding disambiguation pages), based on a database dump and current data from the API, and publishes 100 of them. The original bot was updating the page every day. I think that's unnecessary and once a week is enough.
I have already tested the bot: [20]. Svick (talk) 15:06, 27 June 2010 (UTC)[reply]
Discussion
Bots may not be tested outside of the bot's or operator's own userspace until after a trial has been approved. Anomie⚔ 15:26, 27 June 2010 (UTC)[reply]
- I thought it was okay because it was only those two edits, on pages that are in the Wikipedia namespace and where they can't cause any harm. It won't happen again. Svick (talk) 18:55, 27 June 2010 (UTC)[reply]
- Ok. I would like to review the code, and why did you use dashes to separate the time components? Anomie⚔ 19:08, 27 June 2010 (UTC)[reply]
- I made the output to be exactly the same as DustyBot's. The hyphens between time components are a (now fixed) bug. The code is available at [21]. It uses my library WpApiLib (source included in the archive), but most of it is not used in this app. Svick (talk) 21:18, 27 June 2010 (UTC)[reply]
- Here are the comments I have on your code:
- One thing I find quite useful for testing is the ability to set a flag that makes my library's edit function just write the proposed edit to a file. Or you could have it show you a diff of each proposed edit, the point being to be able to see what the bot would do without risking disrupting Wikipedia. Doing this is entirely optional, of course, but makes testing very easy.
- Something like this seems like a very good idea when editing articles, but I think it's not very useful for this task: the bot regenerates the page every time from scratch, using straightforward code and edits only pages where disruption would be minimal.
- The editing function in your library should specify the starttimestamp option (which is copied from the return property of the same name in your prop=info&intoken=edit call). Otherwise you may run into T17647, although that's not likely to happen in this task. Once you add that in, basetimestamp should be defaulted to that rather than left out if the latest revision timestamp is unavailable.
- Done.
- I see you use assert=user. Good!
- updateNeeded() should check the time since the bot last edited, rather than the time since anyone last edited. As written, someone could prevent the bot from ever editing by making sure to edit the page themself every 6 days.
- Yeah, fixed.
- It looks like the bot doesn't actually check if the database load fails. I guess that wouldn't matter except on the initial run, as it seems it keeps the database between runs.
- I'm not sure what you mean by this. If you mean the variable loadFailed in Update(), then that's meant only for the case when the bot reaches the newest edit from the dump, to keep it from entering an infinite loop.
- It also looks like loadFromDump() will never indicate success, as you set Settings.Default.StartDate to max.LastEdit and then return success only if max.LastEdit > Settings.Default.StartDate.
- Fixed.
- Will the dump loader correctly handle the situation when you feed it a new file, where some of the articles in the database from the old dump have newer revision dates and some of the articles have been deleted? Either case is somewhat likely, as the point of this task is to give editors a list of articles that haven't been touched in a long time. Or do you intend to empty the database before loading a new dump?
- After the bot reads the dump for the first time, it will read it again only when the SQL database table is almost empty. If it tries to add an article that's already there, it will keep the old version. I also don't do anything about deleted files when reading dump. So, in both cases, if new info is available from the dump, the bot ignores it, but querying the API will solve both issues, so the result should be correct.
- Consider reading the list of namespace prefixes from the <namespaces> node in the dump, instead of hard-coding them into the bot.
- Good idea, done.
- For each page that ends up in the list and each non-dab page you reject, you make 2 API queries. For each dab page you reject, you make one API query. And even if the first 101 pages give you your 100, you still do this for the next 99. You could cut that down tremendously by making a query for prop=categories|revisions&rvprop=timestamp&clcategories=Category:All+disambiguation+pages&cllimit=max and packing up to 500 pages into the titles parameter (and then it wouldn't matter much if you query a few hundred extra pages).
- I haven't thought about that, done. Also, the limit for SvickBOT is 50 pages. I suppose that's because it doesn't have the bot flag.
- HTH Anomie⚔ 02:09, 28 June 2010 (UTC)[reply]
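For reference, the batched query suggested above maps onto a single API request. A sketch using the requests library (the parameter values are copied from the suggestion; everything else is illustrative):
<syntaxhighlight lang="python">
import requests

API = 'https://en.wikipedia.org/w/api.php'

def batch_info(titles):
    """For up to 500 titles (with a bot flag), yield (title, last-edit
    timestamp, is-disambiguation) from one request."""
    params = {
        'action': 'query',
        'prop': 'categories|revisions',
        'rvprop': 'timestamp',
        'clcategories': 'Category:All disambiguation pages',
        'cllimit': 'max',
        'titles': '|'.join(titles),
        'format': 'json',
    }
    data = requests.get(API, params=params).json()
    for page in data['query']['pages'].values():
        is_dab = bool(page.get('categories'))          # only the dab category was asked for
        timestamp = page['revisions'][0]['timestamp'] if 'revisions' in page else None
        yield page.get('title'), timestamp, is_dab
</syntaxhighlight>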
- Thanks for the review. Updated code is at the same address. Svick (talk) 02:36, 29 June 2010 (UTC)[reply]
- Sorry, I must have seen this on my watchlist when I didn't have time to reply and then forgot to come back to it. In the future, feel free to post {{BAGAssistanceNeeded}} after about a week of no replies. Changes look good, and I think I understand just what the bot is doing with the database now. Let's give it a try. Approved for trial (510 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. Anomie⚔ 16:45, 15 July 2010 (UTC)[reply]
Approved. No issues with the trial. Anomie⚔ 19:55, 19 August 2010 (UTC)[reply]
- The above discussion is preserved as an archive of the debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA.
Bots in a trial period
- The following discussion is an archived debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA. The result of the discussion was Withdrawn by operator.
Operator: SunCreator (talk · contribs)
Automatic or Manually assisted: Manually-assisted
Programming language(s): None, AWB
Source code available: If required, I can make the source code available in the form of AWB settings files.
Function overview: I am using AWB to inspect a lot of articles as part of WP:URBLP-related checking, hunting for articles that have missing/incorrect categories or WPBiography tags.
Links to relevant discussions (where appropriate): There is a bot User:LivingBot that does similar tasks, but I've heard it's not working lately. Here I'm trying to find things that have previously been missed and thus the appropriate Category:Living people or similar is missing.
Edit period(s): Daily for a while, until the checking for BLPs is exhausted.
Estimated number of pages affected: In some cases the bot may add/remove Category:Living people, date of birth and age, birth/death year/unknown/missing categories, or living=yes/no. If that does happen, it will be manually assisted.
Exclusion compliant (Y/N): Y
Already has a bot flag (Y/N):
Function details: I am using AWB to build a list of articles; it returns a partial list of a maximum of 25000 articles. The categories I wish to check are found in Category:Biography_articles_by_quality and some of those surpass the 25000 limit. To go beyond 25000 one must use a bot account, hence I find myself filling in this request!
Discussion
- Hey SunCreator! Im glad you've taken an interest in operating bots! While I do not know the full details of what you want to do, this kind of bot task seems to lend itself to SQL. If that is true, it will save you loads of time to use a Database query to build your list rather then AWBs functions. What, exactly, will you be parsing for? I may be able to help. Tim1357 talk 03:20, 1 May 2010 (UTC)[reply]
- Hi Tim, yes, much of this could be done with a Database query, I am familiar with SQL. Here would be some obvious starters of conflicting data between something indicating a person is alive and something indicating they are dead.
- Moved database report criteria to => Wikipedia:Bot_requests#Living_people_possibly_deceased_or_deceased_people_possibly_living
- Regards, SunCreator (talk) 04:41, 1 May 2010 (UTC)[reply]
- Hi Tim, yes, much of this could be done with a Database query, I am familiar with SQL. Here would be some obvious starters of conflicting data between something indicating a person is alive and something indicating they are dead.
LivingBot never really did it very efficiently anyway, best of luck to you. - Jarry1250 [Humorous? Discuss.] 19:38, 2 May 2010 (UTC)[reply]
- Thank you! I think it's useful to clarify that I'm not replacing LivingBot or any other automated bot (I learnt of User:Yobot recently). I'd like access to more than 25000 articles in a category so that I can correct errors like a 226-year-old (infobox age), the now-deleted Duck marked as a BLP, and this 'living' ceramic duck File:Roberto the duck.jpg - the actual picture that was in the infobox and was aged 1 year old there. There are numerous other BLPs that for reasons unknown aren't marked with Category:Living people even if they are on the talk page. Regards, SunCreator (talk) 00:31, 5 May 2010 (UTC)[reply]
- You should grab a database dump. Last one is 12th march, but they should be running again soon. Rich Farmbrough, 15:19, 28 May 2010 (UTC).[reply]
- I like the idea as using the database scanner is fast. Unfortunately my hard disk has only 30Gb free so I can't download the zip file let alone uncompress it. Regards, SunCreator (talk) 21:48, 28 May 2010 (UTC)[reply]
- You don't need 30 GB to download the dump; it is about 4-5 GB compressed, but about 27 GB uncompressed. You can scan the compressed dump, but not with AWB. Rich Farmbrough, 19:34, 20 June 2010 (UTC).[reply]
- You should grab a database dump. Last one is 12th march, but they should be running again soon. Rich Farmbrough, 15:19, 28 May 2010 (UTC).[reply]
What's the status of this? Josh Parris 02:29, 31 May 2010 (UTC)[reply]
- Really, what is the status of this? Nothing has been happening; I'm inclined to expire this request just because there's no activity at all. — The Earwig (talk) 19:39, 20 June 2010 (UTC)[reply]
- What? Am I supposed to use the bot without getting approval? Regards, SunCreator (talk) 08:47, 30 June 2010 (UTC)[reply]
- Approved for trial (25 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. Let's see how it goes! –xenotalk 16:59, 7 July 2010 (UTC)[reply]
- News? MBisanz talk 03:35, 9 August 2010 (UTC)[reply]
- It's okay, got the info thank you. On a bit of a hiatus at the moment but will get back to this later. Regards, SunCreator (talk) 00:28, 10 August 2010
- Status? Mr.Z-man 04:01, 26 September 2010 (UTC)[reply]
- on wikibreak. Regards, SunCreator (talk) 09:09, 26 September 2010 (UTC)[reply]
You do not seem to be on wikibreak anymore, judging by Special:Contributions/SunCreator. Do you intend to resume this request? Anomie⚔ 00:03, 11 November 2010 (UTC)[reply]
- I no longer have a sense that this checking, which effectively results in more uBLPs being tagged, would be a welcome activity. See the discussion at WP:Administrators'_noticeboard/Unsourced_biographies_of_living_persons#Huge_backlog_of_tagged_unsourced_biographies_of_living_persons, where the sense is that the number is too 'high/huge' and adding to it is not welcomed from either side. In addition, over at WP:RSN I'm being falsely accused of drive-by tagging, and identifying more Category:Living people entries and tagging uBLPs based on horizontal checking through articles seems an unwelcome cause. At this point I'd likely resume when the number of uBLPs (currently 935) is low, down to a few months' backlog or less. Regards, SunCreator (talk) 00:41, 11 November 2010 (UTC)[reply]
- The above discussion is preserved as an archive of the debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA.
- The following discussion is an archived debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA. The result of the discussion was Request Expired.
Operator: Robert Skyhawk (talk · contribs)
Automatic or Manually assisted: Automatic
Programming language(s): AWB
Source code available: Run using AWB, whose parameters I can make available on request.
Function overview: Updates Good Articles in respect to {{Good article}} template.
Links to relevant discussions (where appropriate): Bot request, Discussion regarding symbol usage
Edit period(s): Weekly, unless a large number of pages need tagging at a given time.
Estimated number of pages affected: All of the Good Articles; there are currently 39,747
Exclusion compliant (Y/N): Yes
Already has a bot flag (Y/N): No
Function details: The decision was recently made to add {{Good article}}, which displays at the top of a page, to all of the good articles. While users have already added the template to most of the GAs with AWB in semi-automated mode, there is a desire in the GA Project for a bot that would periodically check the Good articles; adding the template to those that don't have it, and removing the template from pages that are no longer Good articles. The bot would use Category:Wikipedia good articles as a list of actual GAs, since the category is populated by {{GA}}, which is put on talk pages of GAs and determines whether or not the article is actually a Good article. To check articles that use the template, Category:Good articles is used, since it is populated by {{Good article}}.
Robert Skyhawk (T C B) 03:37, 18 June 2010 (UTC)[reply]
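The check described above is essentially a comparison of two category memberships. A sketch of that comparison follows, using pywikibot purely for illustration (the bot itself is run with AWB); only the category and template names come from the request.
<syntaxhighlight lang="python">
# Sketch: articles whose talk page is in Category:Wikipedia good articles
# should carry {{Good article}} (i.e. be in Category:Good articles), and
# vice versa.
import pywikibot

site = pywikibot.Site('en', 'wikipedia')

def members(cat_name, from_talk_pages=False):
    """Titles of the pages in a category; optionally map talk pages back
    to their subject articles."""
    titles = set()
    for page in pywikibot.Category(site, cat_name).articles():
        if from_talk_pages:
            page = page.toggleTalkPage()
        titles.add(page.title())
    return titles

actual_gas = members('Category:Wikipedia good articles', from_talk_pages=True)
tagged = members('Category:Good articles')

needs_tag = actual_gas - tagged      # GA per talk page, but missing {{Good article}}
needs_removal = tagged - actual_gas  # tagged, but no longer listed as a GA
</syntaxhighlight>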
Discussion
- Thanks for volunteering to do this! I think you want Category:Wikipedia good articles, not Category:Wikipedia Good Articles. Would it be possible to also check the list at WP:Good articles? I am not sure which should be considered more authoritative in determining whether an article is a GA, but I suppose the bot could report any discrepancies for a sentient editor to sort out. Ucucha 05:14, 18 June 2010 (UTC)[reply]
- Thanks for correcting me on the category. As for the list at WP:GA, it would be harder to use because there are inevitably going to be some links on that page that aren't part of the actual list, and AWB would still treat them as GAs. Category:Wikipedia good articles should be a reliable list, because every article that passes GA is supposed to transclude {{GA}} on its talk page. It's then very simple for the bot to convert those to mainspace pages, and scan for the template. Robert Skyhawk (T C B) 18:05, 18 June 2010 (UTC)[reply]
- Well, they can also have {{ArticleHistory}} (indeed, that is preferable), but that does not matter much because it still generates the same category. All links on WP:GA that are not to GAs ought to be in transcluded templates; can't AWB ignore such links? Ucucha 18:10, 18 June 2010 (UTC)[reply]
- This would be fairly accurate, and worth doing; it shows right now that
- ....
- Black-shouldered Kite
- Brander–Spencer model
- Ferris Bueller's Day Off
- Gregory R. Ball
- Igor Panarin
- Kia Stevens
- Mary McLeod Bethune
- The Juice Is Loose
- Tiber Oil Field
- ...
- are not consistently in all three groups.
- Rich Farmbrough, 18:51, 19 June 2010 (UTC). (Note the first group were fixed while I was writing this, leading to some head-scratching about which group was which.)[reply]
- Between the two of us, we fixed all those. The first group had the GA template on the talk page and the symbol on the article page, but was not listed on WP:GA; the second group was listed on WP:GA and had a GA template on the talk page, but did not have the GA symbol. Ucucha 19:20, 19 June 2010 (UTC)[reply]
(outdent) Right now, getting all of the links on WP:GA, then removing those that are not mainspace, produces a list of 9277 pages, which is the exact number of Talk namespace pages in Category:Wikipedia good articles. This means that both methods are identically accurate. Robert Skyhawk (T C B) 19:49, 19 June 2010 (UTC)[reply]
- Yes, because Rich and I just fixed all discrepancies. But new problems may arise in the future. Ucucha 19:52, 19 June 2010 (UTC)[reply]
- Oh, I see. So I suppose this comes down to what actually makes an article a GA: transcluding {{GA}}, or being listed on WP:GA. Which would be the best source? Robert Skyhawk (T C B) 20:14, 19 June 2010 (UTC)[reply]
- The best way to do it is to look at what the reviewer wanted. For example, I found an article which had failed, but where the reviewer had put up the {{GA}} template instead of the GA-fail template. However, the bot of course can't do that. Perhaps the best solution would be to have the bot add {{Good article}} if the article both is on WP:GA and has {{GA}}, and have it post to WT:GA if it detects discrepancies (articles which are on WP:GA but don't have {{GA}} and vice versa). Ucucha 20:19, 19 June 2010 (UTC)[reply]
- This could certainly be done; I would first get a list of pages on WP:GA, filter non-GAs, and convert them to their talk pages. Then I would get a dump of these pages and scan them for {{GA}}. Finally, I would convert those that are "{{GA}} positive" back to mainspace and run a check for {{Good article}}. This would lengthen the process considerably, but I'm fine doing it if that's what is desired. I'm not an expert with AWB, so if anyone knows a more efficient way of doing what I've described, please let me know. As for anomalies, I can't make AWB post a report to a talk page, but I'm pretty sure that I'll be able to tell which pages the aforementioned system refutes, and go from there. Robert Skyhawk (T C B) 23:13, 19 June 2010 (UTC)[reply]
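For illustration only, a hedged sketch of the cross-check being discussed, again assuming pywikibot rather than AWB and ignoring the transcluded-link caveat raised earlier; pages found in both sources are safe to tag, while pages found in only one are reported for a human to sort out.

import pywikibot

site = pywikibot.Site('en', 'wikipedia')

def listed_on_wp_ga():
    """Mainspace links on WP:Good articles (simplified; transcluded links are not filtered)."""
    page = pywikibot.Page(site, 'Wikipedia:Good articles')
    return {p.title() for p in page.linkedPages() if p.namespace() == 0}

def templated_gas():
    """Articles whose talk pages sit in Category:Wikipedia good articles."""
    cat = pywikibot.Category(site, 'Category:Wikipedia good articles')
    return {p.title()[len('Talk:'):] for p in cat.members() if p.title().startswith('Talk:')}

listed, templated = listed_on_wp_ga(), templated_gas()
safe_to_tag = listed & templated    # both sources agree: add {{Good article}}
discrepancies = listed ^ templated  # on one list only: report instead of tagging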
- I have it easy and can say whatever I like, and you'll actually have to code the bot, so please do say it when you think what I want is impossible or inconvenient. Ucucha 06:38, 20 June 2010 (UTC)[reply]
- I appreciate that you say that, but I would be fine going through such a process; now that I think about it, a similarly complex process would have to be used to search for non-GAs that transclude the template. What it really comes down to is whether we think such accuracy is necessary with a little symbol. Robert Skyhawk (T C B) 19:41, 20 June 2010 (UTC)[reply]
- I think these lists are going to be short, so a little cut-and-pasting is going to do what is needed. However if they were long, a little plugin would be what is needed - alternatively you could create a hidden category (or categories) called "Anomalous good article status" and add it to the anomalous pages. A template on WP:GA or WP talk: GA could display the current state much as the progress boxes do with the clean-up categories. Rich Farmbrough, 19:51, 20 June 2010 (UTC).[reply]
{{BAGAssistanceNeeded}}
Am I ready for an approval? Since there are only about 11 pages that need modification (according to my calculations), I would request either a small trial (~5-10 articles) or direct approval without a trial. Thanks, Robert Skyhawk (T C B) 04:06, 3 July 2010 (UTC)[reply]
- Approved for trial. Please provide a link to the relevant contributions and/or diffs when the trial is complete. do what you think is needed. MBisanz talk 03:11, 5 July 2010 (UTC)[reply]
- A user has requested the attention of the operator. Once the operator has seen this message and replied, please deactivate this tag. (user notified) Any news? MBisanz talk 03:32, 24 August 2010 (UTC)[reply]
- I'm going to expire this in a few days if I don't hear back. MBisanz talk 19:10, 6 September 2010 (UTC)[reply]
Request Expired. Marking this as expired, I see one trial edit, but no comments from the operator, who should have got back from vacation >2 weeks ago. This request may be reopened at any time. For when/if it is, I should like to point out that AWB's list building feature would allow you to easily make a list of pages both in the category Wikipedia good articles, and linked to from WP:GA, without having to scan each page manually. - Kingpin13 (talk) 10:01, 18 September 2010 (UTC)[reply]
- The above discussion is preserved as an archive of the debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA.
- The following discussion is an archived debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA. The result of the discussion was Approved.
Operator: Tim1357 (talk · contribs)
Automatic or Manually assisted: Automatic
Programming language(s): Python
Source code available: ...
Function overview: Reverting blatant vandalism.
Links to relevant discussions (where appropriate): N/A (ClueBot already does this)
Edit period(s): Continuous
Estimated number of pages affected: Dependent on vandalism.
Exclusion compliant (Y/N): Y
Already has a bot flag (Y/N): Y (But it wont use it, all of its functions should be in watch-lists)
Function details:
- Main bot function
- The bot downloads User:DASHBot/Vandalism and parses it for regexes.
- It compiles those regexes, and keeps them associated with their score.
- It then connects to the recent-changes IRC feed (irc.wikimedia.org), to channel #en.wikipedia
- When a new edit is made, it continues only if the user is not in Huggle's whitelist
- Then, it checks the edit against DASHBot's User:DASHBot/Ignore list.
- If none of those steps excuse the edit, the bot downloads the diff and parses out all new text (red text, and new green sections that do not appear in the old side)
- It checks each regex, and adds its score to the edit's score. Scores are cumulative, so if the edit says "fuk" 5 times, the score will change by -3*5
- If the edit is smaller than 400 new characters, the score to revert is -3.
- If it is less than 150 characters, it is -2.
- If it is less than 15 characters, it is -1 (a sketch of this scoring follows below).
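A minimal Python sketch of the scoring and size thresholds described above; the single regex is a made-up stand-in for the User:DASHBot/Vandalism list, and the handling of edits with 400 or more new characters is not specified in the request, so the sketch simply reuses the -3 cutoff.

import re

# Hypothetical example entry; the live list is parsed from User:DASHBot/Vandalism.
SCORED_REGEXES = [
    (re.compile(r'\bfuk\b', re.IGNORECASE), -3),
]

def revert_threshold(added_chars):
    # Smaller additions need less negative evidence before a revert.
    if added_chars < 15:
        return -1
    if added_chars < 150:
        return -2
    return -3  # "smaller than 400 new characters"; larger edits are not specified above

def score_edit(added_text):
    score = 0
    for regex, value in SCORED_REGEXES:
        score += value * len(regex.findall(added_text))  # cumulative per match
    return score

def should_revert(added_text):
    return score_edit(added_text) <= revert_threshold(len(added_text))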
- Logging
Logging for false positives is incredibly important in this case. For that reason, I made a simple error-report generator that can be accessed online. Try it out with this edit.
- Extra Notes
- All regex pages are reloaded the instant the bot understands they have been edited.
- I have been running a dry run in the bot's userspace. See the results of it at User:DASHBot/Dryrun. (Note the items at the bottom will be a more accurate representation of the bot's ability).
Discussion
Please run this on a different account, it's bad enough to have 10 tasks running under one account, but also having an anti-vandalism bot that runs constantly will cause problems down the road.
- "If the edit is smaller than 400 new charachters, the score to revert is -3."
- I think you mean to go with more or equal to?
- "Source code available: ..."
- ?
- "Already has a bot flag (Y/N): Y (But it wont use it, all of its functions should be in watch-lists)"
- I hope this is some horrible typo, watch-lists have nothing to do with bot flags...
Also, per the Dryrun, this and this would have been reverted. I know it's great to detect a large amount of vandalism, but being too sensitive and having false positives is not an acceptable side effect. This is present in ClueBot and other antivandalism bots. FinalRapture - † ☪ 21:18, 8 June 2010 (UTC)[reply]
- Both those false positives have been fixed, even before I read your comments. Finding errors like these was the whole purpose of the dry run.
- The source is not yet available because right now it looks like a 5 year old wrote it; I'm cleaning it up now, and will publish it after the full trial.
- It won't edit with a bot flag because its edits should not appear in the recent changes feed. Sorry, it was early when I filled this all out. Tim1357 talk 23:47, 8 June 2010 (UTC)[reply]
- What are the problems you have with running this under the same bot account? It's easier for me to do, and I don't think it makes it any harder for others to use, given the proper documentation and logging. Tim1357 talk 23:47, 8 June 2010 (UTC)[reply]
- "It won't edit with a bot flag because its edits should not appear in the recent changes feed.". What the hell? FinalRapture - † ☪ 02:07, 9 June 2010 (UTC)[reply]
- "Edits by such accounts are hidden by default within recent changes." Sorry if I was unclear. Tim1357 talk 01:10, 10 June 2010 (UTC)[reply]
- The comment about the watchlist isn't as ridiculous as FinalRapture makes it, since there is a user preference that enables you to hide bot edits from your watchlist. Then again, you can't run one task with a bot flag and another without one on the same account. Ucucha 16:54, 11 June 2010 (UTC)[reply]
- Hate to be contrary, but you can. Quoted from the API documentation: "bot: If set, mark the edit as bot, even if you are using a bot account the edits will not be marked unless you set this flag." --Tim1357 talk 00:50, 12 June 2010 (UTC)[reply]
- Compliance with 1RR
- I've put some thought into how to make this bot not revert the same edit over and over again. The process I've come up with is this:
- When the bot reverts an edit, it makes a hash (to save memory) of the username, the page, and the rendered diff. This hash is then stored, along with a time stamp.
- If within 24 hours another edit has an identical hash (meaning it is the same user, making the same edit on the same page), the bot will lower the revert_score threshold: instead of needing a score of -4 to be reverted, it would need a score of -10 (or something like that). This means extremely blatant vandalism will be reverted again and again. However, this feature can be turned off (see the sketch below).
Tim1357 talk 17:19, 12 June 2010 (UTC)[reply]
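A minimal sketch of this 1RR mechanism; the SHA-1 hash, the in-memory dictionary and the exact -10 repeat threshold are illustrative assumptions, not details taken from the bot's code.

import hashlib
import time

SEEN = {}                 # hash of (user, page, diff) -> timestamp of the bot's last revert
NORMAL_THRESHOLD = -4
REPEAT_THRESHOLD = -10    # example value from the discussion above
DAY = 24 * 60 * 60

def edit_key(username, page_title, rendered_diff):
    raw = '{}|{}|{}'.format(username, page_title, rendered_diff).encode('utf-8')
    return hashlib.sha1(raw).hexdigest()  # compact stand-in for storing the full diff

def threshold_for(username, page_title, rendered_diff, now=None):
    # Require a much worse score before re-reverting the same edit within 24 hours.
    now = now or time.time()
    last = SEEN.get(edit_key(username, page_title, rendered_diff))
    if last is not None and now - last < DAY:
        return REPEAT_THRESHOLD
    return NORMAL_THRESHOLD

def record_revert(username, page_title, rendered_diff, now=None):
    SEEN[edit_key(username, page_title, rendered_diff)] = now or time.time()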
- IMO AVBots should be 1RR. It's better than picking arbitrary numbers as a cutoff. Q T C 01:39, 13 June 2010 (UTC)[reply]
- If I'm reading User:DASHBot/Vandalism/Die correctly, it says that any user can change the regex. This can be dangerous. Sole Soul (talk) 19:25, 13 June 2010 (UTC)[reply]
- Yes, but the payoff is that other users can help build the regex base. Additionally, I will request that the page be fully protected, so that only admins may edit it. I believe that admins will be smart enough to not modify a regex if they do not know how. I might be overestimating them though. Tim1357 talk 21:14, 13 June 2010 (UTC)[reply]
- Full protection is good, but then how you could edit it? I suggest moving the page to User:Tim1357/regex.css or something similar. Sole Soul (talk) 21:26, 13 June 2010 (UTC)[reply]
- Are there auto confirmed users that aren't in the white-list? Tim1357 talk 21:14, 13 June 2010 (UTC)[reply]
- I don't know, I don't use Huggle :) Sole Soul (talk) 21:26, 13 June 2010 (UTC)[reply]
- Apparently not, "Huggle whitelists users with edit counts above 500". Autoconfirmed users should not be reverted by a bot. Note: 3 of the 6 false positive edits reported were made by autoconfirmed users. Sole Soul (talk) 22:30, 13 June 2010 (UTC)[reply]
- If it's alright with you, I'd like to stick with the Huggle white list. I spent some time looking, and it appears there is nowhere that the API will let me download a list of autoconfirmed users. In fact, the database does not even have a place where it saves a list of these users. Hopefully the bot is coded well enough so that this will not affect performance. Tim1357 talk 03:32, 14 June 2010 (UTC)[reply]
- Apparently not, "Huggle whitelists users with edit counts above 500". Autoconfirmed users should not be reverted by a bot. Note: 3 of the 6 false positive edits reported were made by autoconfirmed users. Sole Soul (talk) 22:30, 13 June 2010 (UTC)[reply]
- I don't know, I don't use Huggle :) Sole Soul (talk) 21:26, 13 June 2010 (UTC)[reply]
- Are there auto confirmed users that aren't in the white-list? Tim1357 talk 21:14, 13 June 2010 (UTC)[reply]
- ClueBot doesn't revert users with > 50 edits, or IPs with > 500 edits. (X! · talk) · @219 · 04:15, 15 June 2010 (UTC)[reply]
- I agree. The bot will ignore edits by non-IP users that have more than 50 edits. The bot still uses Huggle's whitelist, because that list is helpful to strip bots, admins, and other experienced editors before it downloads the edit. It checks the user's edit count after it downloads and evaluates the edit. Tim1357 talk 16:17, 15 June 2010 (UTC)[reply]
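For illustration, a hedged sketch of checking a user's edit count through the MediaWiki API after the edit has been downloaded; the requests library and the skip logic shown here are assumptions, not DASHBot's actual code.

import requests

API = 'https://en.wikipedia.org/w/api.php'

def edit_count(username):
    # Returns the registered user's edit count; the field is absent for IPs and missing users.
    params = {
        'action': 'query',
        'list': 'users',
        'ususers': username,
        'usprop': 'editcount',
        'format': 'json',
    }
    data = requests.get(API, params=params, timeout=10).json()
    return data['query']['users'][0].get('editcount')

def skip_user(username, is_ip):
    count = None if is_ip else edit_count(username)
    return count is not None and count > 50  # registered users with more than 50 edits are ignored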
- Approved for trial (50 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. MBisanz talk 04:36, 27 June 2010 (UTC)[reply]
Done, along with a few extra monitored sessions. No errors were encountered. I suggest either approval or another long trial period (a week or so). Tim1357 talk 17:07, 4 August 2010 (UTC)[reply]
- The following discussion is an archived debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA. The result of the discussion was Request Expired.
Operator: Gregbard (talk · contribs)
Automatic or Manually assisted:
- Supervised
- Automatic
Programming language(s):
- Perl
Source code available:
Function overview:
- Update Wikipedia:WikiProject_Deletion_sorting/Philosophy.
- Update Index of philosophy pages.
- Identify most wikilinked articles within the project, and output the result to Wikipedia:WikiProject_Philosophy/Most_wikilinked.
- Identify most popular redlinks within the project, and append the result to Wikipedia:WikiProject_Philosophy/Most_wanted redlinks.
Links to relevant discussions (where appropriate): I asked Pollinossiss his opinion, and he thought it was a good idea. No other discussion has been held. However it appears to be working for other projects.
- Listed at WT:PHILO
Edit period(s): To be run automatically on a thrice weekly basis in the case of the index and deletion update, and on a monthly basis in the case of the search for red and bluelinks.
Estimated number of pages affected: 8 project pages within WikiProject Philosophy, plus a log subpage in the case of the index update, and an archive subpage in the case of the deletion sorting.
- Index of philosophy: Index_of_philosophy_articles_(A–C), Index_of_philosophy_articles_(D–H), Index_of_philosophy_articles_(I–Q), and Index_of_philosophy_articles_(R–Z).
- Wikipedia:WikiProject Deletion sorting/Philosophy
- Wikipedia:WikiProject_Philosophy/Most_wikilinked
- Wikipedia:WikiProject_Philosophy/Most_wanted_redlinks
Exclusion compliant (Y/N):
Already has a bot flag (Y/N):
Function details:
- A) Thrice a week, the bot will identify all the articles tagged with Template:WikiProject Philosophy which are not currently listed in the Index of philosophy. It will insert wikilinks to any new articles into the index alphabetically (see the sketch after this list).
- B) Thrice a week, the bot will compare the category Category:Articles for deletion with the articles tagged with Template:WikiProject Philosophy and place a transclusion of the discussion on Wikipedia:WikiProject Deletion sorting/Philosophy.
- C) Once a month, the bot will search through the articles within WP:PHILOSOPHY and identify the most wikilinked terms, and output the result to Wikipedia:WikiProject_Philosophy/Most_wikilinked. Wikilinks to articles within the project will be displayed in bold font.
- D) Once a month, the bot will search through the articles within WP:PHILOSOPHY and identify the most redlinked terms, and output the result to Wikipedia:WikiProject_Philosophy/Most_wanted_redlinks.
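For function (A), a rough sketch of the comparison involved, assuming pywikibot for illustration rather than the planned Perl port of Mathbot; the tracking category name is an assumption about how the project banner is set up.

import pywikibot

site = pywikibot.Site('en', 'wikipedia')

INDEX_PAGES = [
    'Index of philosophy articles (A–C)',
    'Index of philosophy articles (D–H)',
    'Index of philosophy articles (I–Q)',
    'Index of philosophy articles (R–Z)',
]

def tagged_articles():
    # Assumption: the {{WikiProject Philosophy}} banner populates a category along these lines.
    cat = pywikibot.Category(site, 'Category:WikiProject Philosophy articles')
    return {p.title()[len('Talk:'):] for p in cat.members() if p.title().startswith('Talk:')}

def indexed_articles():
    titles = set()
    for name in INDEX_PAGES:
        page = pywikibot.Page(site, name)
        titles |= {p.title() for p in page.linkedPages() if p.namespace() == 0}
    return titles

missing_from_index = sorted(tagged_articles() - indexed_articles())  # to be inserted alphabetically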
Discussion
Four comments:
- A bot cannot operate itself. The responsible editor must be identified above.
- I have corrected that.
- The bot account should only be used to make edits for an approved task or trial, and nondisruptive edits in your or the bot's userspace. This page should be edited using your main account.
- I will comply.
- You are estimating that the bot will affect "14,000" articles, but the function overview mentions three project pages and, vaguely, "Index of philosophy pages". Please clarify, and provide more details as to which pages are "Index of philosophy pages" and what will be done to them.
- I was thinking of future expansion of features; however, you are correct. As the proposal stands now, only those few project pages will be the subject of any edits. The Index of philosophy consists of five pages: Index of philosophy, Index_of_philosophy_articles_(A–C), Index_of_philosophy_articles_(D–H), Index_of_philosophy_articles_(I–Q), and Index_of_philosophy_articles_(R–Z). I would like, in the future, to expand that function to the List_of_philosophers which also is five pages similarly formatted. However I would like to see the index update feature work first.
- "monthly or quarterly" seems too infrequent for Wikipedia:WikiProject Deletion sorting/Philosophy to be useful; it would probably need to be done at least daily. Are there existing bots that do deletion sorting for other projects that can handle this too? I see User:ArticleAlertbot is down at the moment.
Anomie⚔ 12:05, 4 June 2010 (UTC)[reply]
- In the case of the AFD feature, you are probably correct, although I would think that once every few days would suffice. Yes, the ArticleAlertbot is down and that is too bad. However, even when it is up, I still have to manually place AFD proposals on the philosophy page that come to my attention via it. I am certainly open to input, and I thank you for it. Greg Bard 21:02, 4 June 2010 (UTC)[reply]
- I note in (A) you specify that it will add links for newly-templated articles. Will it also remove links for de-templated and deleted articles?
- How do you intend to do (C) and (D)? Will the information be loaded from a database dump, or be generated from the toolserver, or from the API (using which queries?), or from loading each page? Anomie⚔ 16:37, 11 June 2010 (UTC)[reply]
- I will basically be taking User:Mathbot's code and modifying it for use in WP:PHILO. I believe the way Mathbot does it, the operation is by category. If the article is in the particular category, it is written into the list. I am not sure if Mathbot merely recomposes the whole list each time or just acts upon the individual changes. I am at a bit of a disadvantage currently as I have no Unix or Linux account to be working on these things. I have applied at the toolserver, and am currently awaiting approval. I have provided an incomplete list of modified files at User:Philosobot/Source code. Perhaps some functions will be capable of being run from User:Mathbot with slight modification, or perhaps some files could be shared. However, I wouldn't want to put anyone out. I feel I can create a custom bot just for WP:PHILO on my own. Greg Bard 22:28, 13 June 2010 (UTC)[reply]
- Greetings, I am awaiting a response. If there is something more that I need to do, please let me know about it. I am not sure if the account on nightshade is waiting for the bot proposal to be approved or if the bot request is waiting for approval on nightshade. Please advise. Same discussion: <https://jira.toolserver.org/browse/ACCAPP-168> Greg Bard 00:54, 24 June 2010 (UTC)[reply]
A user has requested the attention of the operator. Once the operator has seen this message and replied, please deactivate this tag. (user notified) - Any update on this? Mr.Z-man 14:48, 11 July 2010 (UTC)[reply]
Update 7-11-10
Greetings,
There is no progress on this bot at all. I do not have access to a Unix account, and I have not been able to use Perl from this home computer. I have applied for an account on the toolserver, and I have not been approved yet. I am not sure exactly what the issue is there. I too am awaiting an update. <https://jira.toolserver.org/browse/ACCAPP-168>. Is there any way this process can be expedited? Perhaps there is some alternative free service I can use that someone can direct me to. Perhaps if I had more information about their concerns I would be able to respond to them. Greg Bard (talk) 23:40, 11 July 2010 (UTC)[reply]
- I don't think TS access can be expedited, but you might ask on freenode in #wikimedia-toolserver. Any other news? MBisanz talk 03:32, 9 August 2010 (UTC)[reply]
- A user has requested the attention of the operator. Once the operator has seen this message and replied, please deactivate this tag. Any news? MBisanz talk 05:40, 5 September 2010 (UTC)[reply]
Request Expired. Mr.Z-man 04:03, 26 September 2010 (UTC)[reply]
- The above discussion is preserved as an archive of the debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA.
- The following discussion is an archived debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA. The result of the discussion was Request Expired.
Operator: Surena (talk · contribs)
Automatic or Manually assisted: Automatic
Programming language(s): Python (PythonWikiBot)
Source code available: Standard pywikipedia
Function overview: interwiki
Links to relevant discussions (where appropriate):
Edit period(s): Continuous
Estimated number of pages affected:
Exclusion compliant (Y/N): Y
Already has a bot flag (Y/N): N
Function details:
Discussion
"interwiki, welcome,replace text ,find bad redirect,move category". Welcome bots are not permitted by policy. What are you replacing? We already have a bunch of bots that find bad redirects and move categories, what does yours add? As a user with only 101 edits (including one to this BRFA), do you fully understand policies in place on Wikipedia and for operation of robots? FinalRapture - † ☪ 17:37, 19 June 2010 (UTC)[reply]
- Hi, I am a Persian wiki user and edit on enwiki occasionally. My request means I can do that work, and I need a bot on the English Wikipedia for interwiki now; the reason for my request is only so that English wiki admins don't block my bot. Can you help me with what I need to do for this? ►Surena/Discussion)29 Khordad 1389-- 17:56, 19 June 2010 (UTC)[reply]
- If you would only like to do interwiki please update your request to represent that. FinalRapture - † ☪ 17:59, 19 June 2010 (UTC)[reply]
- I have updated my request and kept only interwiki & move category (can I keep the move category request? If not, tell me so I can remove it)--►Surena/Discussion)29 Khordad 1389-- 18:28, 19 June 2010 (UTC)[reply]
- To move categories you would have to be familiar with our processes for determining consensus for moving categories. While you seem active on fawiki, you have not shown any experience with the English Wikipedia. If you want to run the pywikipedia interwiki.py here, you should be able to be approved for that once you have read WP:Bot policy, particularly the section on Interwiki links. Anomie⚔ 19:19, 19 June 2010 (UTC)[reply]
- I have updated my request and kept only interwiki, and I know how to work with interwiki--►Surena/Discussion)30 Khordad 1389-- 10:50, 20 June 2010 (UTC)[reply]
- Approved for trial (50 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. for interwiki. MBisanz talk 02:31, 23 June 2010 (UTC)[reply]
- {{OperatorAssistanceNeeded}} Any progress? Anomie⚔ 01:17, 17 July 2010 (UTC)[reply]
- Now I am editing with my bot to finish my trial period. What's wrong?--►Surena/Discussion)26 Tir 1389-- 16:57, 17 July 2010 (UTC)[reply]
- Your bot has only made 4 edits in the 24 days since the trial was approved. I was wondering if you had forgotten about it. I also see that one of your edits was to Template:Ancient Egyptian religion footer, which is currently not allowed by the bot policy for interwiki.py. Anomie⚔ 17:49, 17 July 2010 (UTC)[reply]
- A user has requested the attention of the operator. Once the operator has seen this message and replied, please deactivate this tag. I'm going to mark this as expired unless we hear back in a few days. MBisanz talk 05:20, 15 August 2010 (UTC)[reply]
Request Expired. Tempted to mark this as declined due to a lack of understanding of the bot policy, but will mark as expired for now. Can always be declined/approved at a later date. - Kingpin13 (talk) 12:50, 22 August 2010 (UTC)[reply]
- The above discussion is preserved as an archive of the debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA.
- The following discussion is an archived debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA. The result of the discussion was Request Expired.
Operator: FinalRapture (talk · contribs)
Automatic or Manually assisted: Automatic
Programming language(s): PHP
Source code available: Here
Function overview: Adds {{SVG-Logo}} to SVG logos
Links to relevant discussions (where appropriate): ISB already does this
Edit period(s): One time run, ISB will do this to all new uploads
Estimated number of pages affected: About 1000
Exclusion compliant (Y/N): Y
Already has a bot flag (Y/N): Y
Function details: Takes Category:All non-free logos, strips out files that don't have the extension .svg, strips out images that are in Category:Vector images of trademarks as well, and processes each; if the image has a non-free logo template and no {{SVG-Logo}}, the bot will add it underneath the current logo template.
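A rough sketch of that filtering, written with pywikibot for illustration (the actual bot is PHP); matching only the literal {{Non-free logo}} spelling is a simplification of the bot's template check.

import pywikibot

site = pywikibot.Site('en', 'wikipedia')

logos = pywikibot.Category(site, 'Category:All non-free logos')
vector_marks = {p.title() for p in
                pywikibot.Category(site, 'Category:Vector images of trademarks').members()}

for page in logos.members():
    if page.namespace() != 6:  # File: pages only
        continue
    title = page.title()
    if not title.lower().endswith('.svg') or title in vector_marks:
        continue
    text = page.text
    if '{{SVG-Logo' in text or '{{Non-free logo}}' not in text:  # crude template checks
        continue
    # Place {{SVG-Logo}} directly underneath the existing logo template.
    page.text = text.replace('{{Non-free logo}}', '{{Non-free logo}}\n{{SVG-Logo}}', 1)
    page.save(summary='Adding {{SVG-Logo}} to SVG logo')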
Discussion
Approved for trial (50 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. — The Earwig (talk) 19:57, 21 June 2010 (UTC)[reply]
- How you doing on the trial? Tim1357 talk 23:40, 16 July 2010 (UTC)[reply]
- Any news on the trial? MBisanz talk 15:11, 3 August 2010 (UTC)[reply]
- The above discussion is preserved as an archive of the debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA.
- The following discussion is an archived debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA. The result of the discussion was Approved.
Operator: Rannpháirtí anaithnid (talk · contribs)
Automatic or Manually assisted: Automatic, but will be supervised while the process is running.
Programming language(s): JavaScript
Source code available: This is the script. Framework is available here.
Function overview: To roll out the {{Irish place name}} template across Republic of Ireland towns and villages that do not already use it, and to identify Irish towns and villages that do not have the Irish-language name or do not have an English-language translation of the Irish-language name.
Links to relevant discussions (where appropriate): The original proposal and consensus. A more recent double check before logging this request.
Edit period(s): One time run.
Estimated number of pages affected: 1420 pages to be examined, approximately half of which will be edited.
Exclusion compliant (Y/N): No Yes.
Already has a bot flag (Y/N): No.
Function details: The {{Irish place name}} template is a variant of {{lang-ga}} and {{derive}}. It formats the Irish-language version of place names and their translations. It adds articles without translations for their Irish names to Category:Untranslated_Irish_place_names. This template is used on about half of the articles in Category:Towns and villages in the Republic of Ireland by county judging by the current transclusion rates.
The script will replace the first instance of {{derive}}, {{lang-gle}}, {{lang-ga}} or "* in Irish" that occurs within parentheses on each page with {{Irish place name}} for articles in child (but not grandchild) categories of Category:Towns and villages in the Republic of Ireland by county. Invariably this is the Irish-language place name. The process will then convert instances of {{Irish place name|XXX}} followed by "meaning YYY" to {{Irish place name|XXX|YYY}} (a rough sketch of both passes follows the function details).
If the above processes made no change to a page then the page will be added to Category:Articles on towns and villages in the Ireland possibly missing Irish place names.
Once the process is complete that category will be manually examined and unsuitable places will be removed. A follow up request will be (a) to notify the talk pages of articles in Category:Articles on towns and villages in the Ireland possibly missing Irish place names about the {{Irish place name}} template and encourage editors to provide an Irish-language version and its translation; and (b) to remove the article from Category:Articles on towns and villages in the Ireland possibly missing Irish place names.
Consensus was gained to run this process on places across the entire island of Ireland. However given the sensitivities around Northern Ireland topics the request on this occasion is only for places in the Republic of Ireland. Once that process is done attention can be turned to running the same (or similar) process on Northern Ireland articles.
Sand-boxed trials, each taking 10 randomly selected pages from the category, have been run. Minor errors were found in early trials. Mainly these errors were not catching all variants of syntax to be replaced by {{Irish place name}}. None were destructive. The last two trials (2x10 articles at random) passed without error.
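A rough Python stand-in for the two replacement passes described in the function details (the live script is JavaScript); the parenthesis constraint and the plain "XXX in Irish" phrasing are omitted, and the example sentence is hypothetical.

import re

IRISH_NAME = re.compile(r'\{\{(?:derive|lang-gle|lang-ga)\|([^}|]+)\}\}')
MEANING = re.compile(r'\{\{Irish place name\|([^}|]+)\}\},?\s+meaning\s+"?([^",)]+)"?')

def convert(wikitext):
    # Pass 1: swap the first Irish-name template for {{Irish place name}}.
    text, changed = IRISH_NAME.subn(r'{{Irish place name|\1}}', wikitext, count=1)
    # Pass 2: fold a trailing "meaning YYY" into the template's second parameter.
    text = MEANING.sub(r'{{Irish place name|\1|\2}}', text, count=1)
    return text, changed > 0  # False -> candidate for the "possibly missing" category

# Hypothetical example, not a real article:
print(convert("'''Placename''' ({{lang-ga|Sampla}}, meaning \"example\") is a village.")[0])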
Discussion
Note that the bot policy states "The account's name should identify the operator or bot function." I see User:Luasóg is already associated with your bot from a past WP:CHU; if you redirect it to your page instead of to the bot's (i.e. turn it into a doppelgänger account for you instead of for your bot), IMO that would be sufficient. Anomie⚔ 20:32, 29 May 2010 (UTC)[reply]
- Nice advice. Thank you. Done that. --RA (talk) 10:35, 30 May 2010 (UTC)[reply]
- Why is the bot not exclusion compliant? Josh Parris 11:18, 30 May 2010 (UTC)[reply]
- No great reason. I simply didn't think it would be necessary in this context because this would be a one-time run addressing a very specific issue and not the sort of bot that could repeatedly harangue an article by adding incorrect interwiki links or templates etc. I can add it to the script if thought necessary. --RA (talk) 12:33, 30 May 2010 (UTC)[reply]
- I can pretty much guarantee that a second run will be required, and besides which some articles may have {{nobots}} which ought to be respected. Josh Parris 01:29, 31 May 2010 (UTC)[reply]
- Approved for trial (50 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. MBisanz talk 06:49, 13 June 2010 (UTC)[reply]
Done. 50 articles were taken off the top of the list of 1446 overall. This resulted in 20 articles being changed. Each of the changed articles was manually diffed and the changes checked. No errors were found in the changed articles. The relevant output of the script is below.
... Saved Aghade (1) No change for Ballinkillen (2) No change for Ballon, County Carlow (3) Saved Ballymurphy, County Carlow (4) No change for Borris, County Carlow (5) Saved Carlow (6) No change for Clonegal (7) No change for Clonmore, County Carlow (8) Saved Coolkenno (9) Saved Fennagh, County Carlow (10) Saved Hacketstown (11) No change for Kildavin (12) No change for Leighlinbridge (13) Saved Muine Bheag (14) No change for Myshall (15) No change for Nurney, County Carlow (16) Saved Old Leighlin (17) Saved Palatine, County Carlow (18) No change for Rathvilly (19) Saved Royal Oak, County Carlow (20) Saved St Mullin's (21) No change for Tinryland (22) No change for Tullow (23) No change for Arvagh (24) Saved Bailieborough (25) No change for Ballinagh (26) No change for Ballyconnell (27) No change for Ballyhaise (28) No change for Ballyjamesduff (29) Saved Ballymagauran (30) No change for Bawnboy (31) No change for Belturbet (32) No change for Blacklion (33) No change for Butlersbridge (34) No change for Cavan (35) Saved Cootehill (36) No change for Crossdoney (37) No change for Dowra (38) No change for Glangevlin (39) No change for Kilcogy (40) No change for Killeshandra (41) Saved Kilnaleck (42) Saved Kingscourt (43) Saved Mountnugent (44) No change for Mullagh, County Cavan (45) Saved Shercock (46) No change for Stradone, County Cavan (47) No change for Swanlinbar (48) Saved Virginia, County Cavan (49) Saved Ardsallis (50) All done.
--RA (talk) 16:43, 4 July 2010 (UTC)[reply]
- I see the task description says it will replace the first instance of {{derive}}, {{lang-gle}}, {{lang-ga}} or "* in Irish", but in Bailieborough it replaced the first instance of {{lang-gle}} and the first instance of {{derive}} (although it seems to have turned out ok in that case). Anomie⚔ 01:16, 17 July 2010 (UTC)[reply]
- The above discussion is preserved as an archive of the debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA.
- The following discussion is an archived debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA. The result of the discussion was Approved.
Operator: Stwalkerster (talk · contribs)
Automatic or Manually assisted: Technically automatic
Programming language(s): C#
Source code available: http://svn.helpmebot.org.uk:3690/svn/coordbot/trunk/StwalkerCoordBot/ , using my Utility library: [22]
Function overview: Add coordinates to articles that do not have them.
Links to relevant discussions (where appropriate):
Edit period(s): Whenever there's a batch of edits to be done.
Estimated number of pages affected: With any luck, the clearing of this category
Exclusion compliant (Y/N): Y, assuming http://en.wikipedia.org/wiki/Template:Bots#C.23 works.
Already has a bot flag (Y/N): Y
Function details: The bot will run whenever I have work for it to do, and will be launched by giving it a KML file. This file will be an export from Google Earth, containing placemarks, the name of which will map to an article on Wikipedia. The bot will then parse the coordinates from that file, and put them in the right place on the relevant article.
The KML file is an export of manually generated placemarks. I will find the right place, create a placemark in the right place (like I already do), and then the bot will do the tedious work for me, while I get to keep playing on Google Earth. Technically it's automatic because I won't supervise what edits it actually makes, but I'll be telling it what to put on which pages by the nature of the KML file I give it. Stwalkerster [ talk ] 23:05, 12 June 2010 (UTC)[reply]
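A minimal sketch of the KML-parsing half of this task, using only the Python standard library for illustration (the actual bot is C# and also decides where in each article the coordinates belong); the file name is hypothetical.

import xml.etree.ElementTree as ET

KML_NS = {'kml': 'http://www.opengis.net/kml/2.2'}

def placemarks(kml_path):
    # Yield (article title, lat, lon) for each placemark in a Google Earth export.
    tree = ET.parse(kml_path)
    for pm in tree.getroot().iter('{http://www.opengis.net/kml/2.2}Placemark'):
        name = pm.find('kml:name', KML_NS).text.strip()
        coords = pm.find('.//kml:coordinates', KML_NS).text.strip()
        lon, lat = coords.split(',')[:2]  # KML stores lon,lat[,alt]
        yield name, float(lat), float(lon)

def coord_template(lat, lon):
    return '{{coord|%.5f|%.5f|display=title}}' % (lat, lon)

for title, lat, lon in placemarks('export.kml'):
    print(title, coord_template(lat, lon))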
Discussion
I see no reason why not to trial this bot. I personally think that this is more of a helper-script than a bot. Perhaps, if you allow others to use the code, we may use it like a userscript, similar to twinkle. In any case: Approved for trial (30 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. Tim1357 talk 01:29, 13 June 2010 (UTC)[reply]
- It is a helper script, yes, but it is making the edits on its own, on a bot account.
- Re trial: 26 coordinates saved from a file of 29. Automatic report failed to save due to database lag, so I added it manually from the email. I'll get another three coordinates and run that to make it up to 30 edits, when I've sorted [23] and [24] out. Small niggles that are annoying to see surfacing, but shouldn't be too bad :) Stwalkerster [ talk ] 19:33, 13 June 2010 (UTC)[reply]
- {{OperatorAssistanceNeeded}} Do you think you are ready for approval or do you want another trial? MBisanz talk 02:45, 23 June 2010 (UTC)[reply]
- Basically, there are two outstanding bugs that I've not had a chance to have a look at yet. Neither is particularly important: [25] is annoying to me at its worst, and [26] is copying the behaviour of another bot which adds coordinates to articles (not putting them in the infobox). Preferably another trial would be nice, but if you feel that once these two bugs are fixed there won't need to be another trial... well - I think you get what I mean. Stwalkerster [ talk ] 16:37, 23 June 2010 (UTC)[reply]
- The above discussion is preserved as an archive of the debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA.
- The following discussion is an archived debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA. The result of the discussion was Approved.
Operator: Bwilkins (talk · contribs)
Automatic or Manually assisted: Automatic
Programming language(s): Pywikipedia
Source code available: https://svn.toolserver.org/svnroot/josh/g7bot
Function overview: Delete pages in userspace when requested by that user, under strict conditions.
Links to relevant discussions (where appropriate): VPP here
Edit period(s): Continuous
Estimated number of pages affected: 50 per day
Exclusion compliant (Y/N): Y
Already has a bot flag (Y/N): Y
Function details: Similar to how 7SeriesBOT checks G7's now, this task will check the db-user category, and will check if:
- the page does have the correct tag
- it was tagged by the correct user/owner
- check to see if it has never been moved (to prevent moving pages from articlespace then deleting them), and
- will delete the page
Discussion
There's a problem with this, because user request does not apply to talk pages. So I'd be more happy if it checked that the only (substantial?) editor was the user requesting their user space to be deleted. - Kingpin13 (talk) 14:05, 27 March 2010 (UTC)[reply]
- Yes, check if it is the only editor. If there was another editor, even a minor edit from some bot, leave the request for an admin to decide. --JokerXtreme (talk) 14:17, 27 March 2010 (UTC)[reply]
- FYI, I've notified WT:CSD of this request. - Kingpin13 (talk) 14:21, 27 March 2010 (UTC)[reply]
- Great to see something moving on this. From previous discussions of this sort of bot, may I suggest:
- If a page is flagged {{g7}}, {{u1}} or {{db-author}},
- and has only been edited by one account, or by a bot or bots on the whitelist for this bot,
- and has not been moved other than a move from user space to mainspace
- and is not in projectspace
- and is not in templatespace
- then delete.
- You've already got much of that in the spec, the projectspace and templatespace limitations will kick in rarely but one should be cautious there. Can you find a way to screen out ones where the deletion tag appears because someone has tagged a userbox or other transcluded page? ϢereSpielChequers 14:49, 27 March 2010 (UTC)[reply]
- I don't think that transcluded material will trigger the bot. Will it? --JokerXtreme (talk) 15:01, 27 March 2010 (UTC)[reply]
- Well I hope not, but we sometimes get speedies appearing at CAT:SPEEDY because of transclusions, so I'd appreciate assurance that the bot will not be fooled by them. ϢereSpielChequers 15:06, 27 March 2010 (UTC)[reply]
- I guess that it wouldn't be nice if vandals could take advantage of such liabilities. --JokerXtreme (talk) 15:09, 27 March 2010 (UTC)[reply]
- (edit conflict) The function details say that the bot will check that the page is correctly tagged, and isn't just in the category. At present I think the bot will just be deleting under U1...? So project/template space shouldn't matter. - Kingpin13 (talk) 15:10, 27 March 2010 (UTC)[reply]
- One other point - you want to check that the user page in question has only ever been in the userspace of the page creator. If a page is created by Alice, but it was created in Bob's userspace, then deleting it requires human review of what's going on. Most of such deletions would still be uncontroversially mechanical, but the remainder would be things that shouldn't be blindly bot-deleted. — Gavia immer (talk) 15:16, 27 March 2010 (UTC)[reply]
- That should be covered by "not moved", shouldn't it? And of course that would be covered by "only edited by userspace-owner" which has been correctly been requested as an additional requirement above. The bot should only delete pages that someone created in their own userspace without any third-party edits to it. Regards SoWhy 16:17, 27 March 2010 (UTC)[reply]
- It should be covered. I just want to be certain the bot will handle this specific case correctly. — Gavia immer (talk) 17:19, 27 March 2010 (UTC)[reply]
As long as bot is limited to userspace and follows the limitations laid down by WereSpielChequers, especially that concerning edits from other users, I don't see a problem. Eluchil404 (talk) 02:49, 28 March 2010 (UTC)[reply]
Well it seems the only additional check wanted is to check that the only contributor to the page (possibly excluding bots) is the user. Is it possible to have this added to the bot please? - Kingpin13 (talk) 15:33, 1 April 2010 (UTC)[reply]
Any updates on this? Also, is this bot running? I noticed that it was deleting a few pages tagged as U1...? - Kingpin13 (talk) 21:57, 6 April 2010 (UTC)[reply]
- I see that the edit summary is this:"(Only one contributor who requested deletion under WP:CSD#G7)". Wouldn't it be better if the user that requested the deletion is mentioned in the summary? --JokerXtreme (talk) 23:11, 6 April 2010 (UTC)[reply]
- 7SeriesBOT has been running its first task - deleting only G7's under strict conditions - for some time. Nothing related to U1 (a new task) has been added yet, as I do not yet see official approval for this U1 task (that, and Josh is away). (talk→ BWilkins ←track) 09:27, 7 April 2010 (UTC)[reply]
- But the bot has been deleting U1 tagged user pages, on the 5th of this month. Will Josh be doing the coding again then? - Kingpin13 (talk) 09:30, 7 April 2010 (UTC)[reply]
- I haven't had the chance to ask him yet ... but when he returns, I hope that he agrees to. In theory, it's a simple addition (I could probably do it myself, but it would take longer!) (talk→ BWilkins ←track) 12:15, 7 April 2010 (UTC)[reply]
- http://en.wikipedia.org/w/index.php?title=Special:Log&user=7SeriesBOT shows 3 deletes by the bot on that date, all summaries claim G7. Josh Parris 13:42, 10 April 2010 (UTC)[reply]
- Support with the conditions:
- the page is in userspace or usertalkspace and has the tag g7, u1 or redirects placed directly (i.e. not through transclusion) on it
- it has only been edited by the user to which the userpage or usertalkpage belongs
- it has never been moved
As I can find situations where omitting any one of those conditions would need review (edits by bots should exclude the page too, because they may be bot warnings about edits); but if they're all met it seems reasonable to delete without review. Cenarium (talk) 18:45, 9 April 2010 (UTC)[reply]
In the original 7SeriesBOT proposal, it was suggested that a plausible scenario might run like this: User drafts a wonderful piece of prose in User space. User gets very mad at project. User U1 tags wonderful piece of prose and leaves screaming profanities. Bot deletes this as meeting its criteria. Project loses wonderful piece of prose. What safeguards can be erected against this scenario? Josh Parris 13:57, 10 April 2010 (UTC)[reply]
- That it can be undeleted? --JokerXtreme (talk) 14:34, 10 April 2010 (UTC)[reply]
- What can be undeleted? I can't see anything. Can you? When a human deletes a page, they look at its contents and make a judgment of value. Bots are notoriously bad at that. If valuable content is just deleted, it's no longer discoverable, even if it isn't lost. Josh Parris 16:32, 10 April 2010 (UTC)[reply]
- The 1000+ admins can look. The odds are pretty slim, however. (talk→ BWilkins ←track) 16:45, 10 April 2010 (UTC)[reply]
- Odds of someone destroying his work or admins discovering it? I find it highly unlikely that someone would delete his own work. It would be rare enough anyway to allow us to consider that possibility negligible. --JokerXtreme (talk) 17:27, 10 April 2010 (UTC)[reply]
Is this going anywhere? There seems to be enough support for this task, I need to know if Josh is willing to code it? If not I suggest you (Bwilkins) find another user to code this (I could make you a c# .net program to do this), or I can mark this as expired, and it can be re-opened once there's some code. - Kingpin13 (talk) 09:47, 17 April 2010 (UTC)[reply]
- I was going to let Josh settle in after his wikibreak - he's had some coding to work on with WildBot since his return. If I read correctly, we'll need to add:
- a separate section for checking the parameters as approved above (including a variety of ways that U1 actually gets tagged)
- a second deletion message noting "deleted according to U1"
- I'll coordinate with Josh (talk→ BWilkins ←track) 11:11, 17 April 2010 (UTC)[reply]
Source Code
Some code
import wikipedia  # pywikipedia framework module (assumed import; the original snippet omits it)

def get_all_bots(self):
    '''Loads a list of all flagged bots, and saves it to self.bots'''
    params = {
        'action': 'query',
        'list': 'allusers',
        'augroup': 'bot',
    }
    # Ask the API for every account in the "bot" group.
    data = wikipedia.query.GetData(params, self.site)['query']['allusers']
    all_bots = [p['name'] for p in data]
    del data
    self.bots = all_bots

def owner_is_only_contributor(self, page):
    '''Given a wikipedia.Page object, determines if the page is in the proper namespace, and if there is one or zero non-bot contributors.'''
    verdict = False
    if int(page.namespace()) in [1, 2, 3, 5, 7, 9, 11, 13, 15, 101, 109]:  # a userpage, or any talk page
        contributors = list(page.contributingUsers())  # all users who have edited the page
        contributors = [con for con in contributors if con not in self.bots]  # remove all flagged bots
        if len(contributors) <= 1:  # one or zero (non-bot) contributors
            verdict = True
    return verdict
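Purely illustrative, not part of Josh's code: how the helper above might be combined with the other conditions discussed in this request. tagged_by_owner and never_moved are hypothetical callables standing in for the tag and move-log checks.

def eligible_for_u1(self, page, tagged_by_owner, never_moved):
    # Delete only when every condition from the discussion above is met;
    # anything else is left for a human administrator to review.
    return (self.owner_is_only_contributor(page)
            and tagged_by_owner(page)
            and never_moved(page))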
I have contacted Josh, who is able to provide some assistance - even considering his busy schedule. As it's an add-on to an existing bot, if the new task itself is approved, then the implementation is likely the easy part overall. (talk→ BWilkins ←track) 09:29, 26 April 2010 (UTC)[reply]
- This seems like an unopposable idea. There are edge cases where it might be abused, but they would be admin-fixable anyway. Rich Farmbrough, 11:02, 1 May 2010 (UTC).[reply]
I've done a first-pass, untested write up, but Internet issues mean I can't as-yet upload this. Josh Parris 01:02, 7 May 2010 (UTC)[reply]
- Now uploaded. Head nod to Tim1357, whose snippets were mercilessly stolen. Josh Parris 06:38, 7 May 2010 (UTC)[reply]
Trial
Run it for a week in Trial mode; there's nothing stopping you as bots can edit their own user space freely, especially for testing purposes. Josh Parris 11:19, 13 May 2010 (UTC)[reply]
- He's been running in conjunction with the original version for a couple of days now ... the new code seems to be generally catching the CSD category first ... (talk→ BWilkins ←track) 11:27, 13 May 2010 (UTC)[reply]
- For those interested, the bot's proposed actions can be monitored at User:7SeriesBOT/Dry-Run 2 Josh Parris 11:52, 13 May 2010 (UTC)[reply]
- The trial seems to have stalled. Can you bring us up-to-date where things are? Josh Parris 02:33, 31 May 2010 (UTC)[reply]
Update: It's been running fine today. Original G7bot is currently shutdown. Lots of pages being logged today - would be good to turn this back on to actual "delete" mode soon. One forgets how many pages that this ends up actually affecting until you see the CSD count some days. (talk→ BWilkins ←track) 19:28, 1 June 2010 (UTC)[reply]
Ready?
An analysis of over a week's worth of data, including the new code from mid-May, suggests that it's tagging as it should. Although, by appearances, if one user, WildBot and FrescoBot all edit a talkpage, it suggests that there are multiple contributors - whereas I believe that bot edits are not supposed to be counted. This is minor; for the most part everything looks good, and I think it's ready to go live. (talk→ BWilkins ←track) 09:20, 4 June 2010 (UTC)[reply]
- While you hunt Josh down to fix that bug, I'll give you this:
- Approved for trial (30 deletions). Please provide a link to the relevant contributions and/or diffs when the trial is complete. Tim1357 talk 22:19, 13 June 2010 (UTC)[reply]
- I've set it to delete overnight (my time) ... should be less than 30 by morning. 7SeriesBOT (talk) 23:41, 13 June 2010 (UTC)[reply]
- Having some bizarre connectivity issues, but only on that PC. I have a batch file that does regular pings to en.wikipedia and shows full connectivity, but the bot continually says "retrying in 1 minute". I even switched it back to non-delete mode last night. Has anyone seen anything like this, or is it truly a connection issue? (talk→ BWilkins ←track) 11:00, 24 June 2010 (UTC)[reply]
- The above discussion is preserved as an archive of the debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA.
- The following discussion is an archived debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA. The result of the discussion was Withdrawn by operator.
Operator: DeltaQuad (talk · contribs)
Automatic or Manually assisted: Automatic, with a look over contribs every once in a while.
Programming language(s): Python
Source code available: Pywikipedia, source:
Version 1.0
Version 1.01 Fixing CSD errors, and not moving some CSDs
Version 1.1 Fixing redirect Errors & CSD tagging
Function overview: Will move WP:AFC Submissions to the correct pages for us to review.
Links to relevant discussions (where appropriate): None.
Edit period(s): once every 30 min (pending approval on Toolserver)
Estimated number of pages affected: maybe 20-30 per month
Exclusion compliant (Y/N): No, as bots should be able to edit all WP:AFC subpages; EarwigBot also checks those pages for copyright vios.
Already has a bot flag (Y/N): N
Function details: If the bot finds submissions that contain the extra wording, or that sit in the wrong namespace, it moves them to the proper location. If the move crosses anything besides namespaces 4 & 5, it will also tag the resulting redirect with CSD R2.
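For illustration, a hedged sketch of the move-and-tag step described above, written against the old pywikipedia ("compat") framework. The target prefix, edit summaries and the use of {{db-r2}} are assumptions for the sketch, not the operator's actual code.

import wikipedia  # old pywikipedia ("compat") framework

def move_misplaced_submission(title):
    """Sketch: move a stray AfC submission and tag the leftover cross-namespace redirect."""
    site = wikipedia.getSite()
    page = wikipedia.Page(site, title)
    target = 'Wikipedia talk:Articles for creation/%s' % page.titleWithoutNamespace()
    page.move(target, reason='Moving misplaced AfC submission')
    if page.namespace() not in (4, 5):  # move did not start in Wikipedia/Wikipedia talk space
        redirect = wikipedia.Page(site, title)
        text = redirect.get(get_redirect=True)
        redirect.put(u'{{db-r2}}\n' + text, comment='Tagging cross-namespace redirect (CSD R2)')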
Old links:
Discussion
Should have the code ready for viewing within the next 24 hours. Sorry for the delay. -- /MWOAP|Notify Me\ 16:42, 17 April 2010 (UTC)[reply]
- From what I've seen working in AfC, there aren't that many misplaced pages which aren't already moved to the correct place quickly by anyone watching the category. If this could run off the toolserver continuously or something, it might be more effective. fetchcomms☛ 21:46, 19 April 2010 (UTC)[reply]
{{BAGAssistanceNeeded}} Untouched by BAG since 17th. -- /MWOAP|Notify Me\ 20:47, 26 April 2010 (UTC)[reply]
- Approved for trial (15 pages). Please provide a link to the relevant contributions and/or diffs when the trial is complete.. –xenotalk 20:50, 26 April 2010 (UTC)[reply]
Error Occurred: The bot attempted to move the pages, but it was not confirmed and failed there (it has been confirmed now). Also, I am now adding CSD clearing from the mainspace, as AfC submissions do not need to be CSD'd. Will try some more edits when the next opportunity comes up. This is one of the fails; the other one was deleted. -- /MWOAP|Notify Me\ 19:51, 28 April 2010 (UTC)[reply]
- Done Now, only moves pages that don't have CSD or fall under CSD G11, A7, and A9. See Ver. 1.01 -- /MWOAP|Notify Me\ 20:44, 28 April 2010 (UTC)[reply]
I'm pleased to note that the bot continues with the trial; several edits have happened in the last few days. Josh Parris 09:50, 15 May 2010 (UTC)[reply]
- Working Yep, it is still ongoing. I have found some problems and fixed them at each step of the way, but there have been delays since it is hard to find misplaced submissions while I am on. I should have the modified code uploaded soon. Thank you for waiting. Now, do you want 15 correct edits or 15 edits? I would like to be able to have it edit until it gets it right, which I am close to, and I always sight each change it makes. -- /MWOAP|Notify Me\ 19:00, 16 May 2010 (UTC)[reply]
Done I have uploaded the final version of my code. One thing that I can't avoid is when the bot moves a page from the userspace, it only removes the "User:" part of it. This can be easily dealt with by a reviewer at the end, if it even gets that far. I tried to find a way to program it in, but I couldn't. Otherwise I'd say this thing is ready to go. (I've been over 15 edits and the past three are perfect :) ) -- /MWOAP|Notify Me\ 20:50, 20 May 2010 (UTC)[reply]
- Approved for trial (25 pages). Please provide a link to the relevant contributions and/or diffs when the trial is complete.. Let's see a bit more of an extended trial with the new codebase. –xenotalk 20:53, 20 May 2010 (UTC)[reply]
- Ok, just a heads-up: another 25 edits are going to take a while, as I don't have it running 24-7 and I can only do it at certain times. Toolserver access hasn't been approved yet. -- /MWOAP|Notify Me\ 20:56, 20 May 2010 (UTC)[reply]
- Heads up, I was modifying the code this morning to correct a userspace compatibility issue, and I made a stupid mistake in the code. It created two mistakes, which I got an admin to fix, and I stopped it before it went further. Just thought I would log this. -- /MWOAP.alt|Notify Me\ 15:19, 21 May 2010 (UTC)[reply]
How's this going? Josh Parris 02:28, 31 May 2010 (UTC)[reply]
- Currently, the bot is healthy after making a few changes to stop the bot from moving created pages back. Waiting to test more. -- /MWOAP|Notify Me\ 01:31, 2 June 2010 (UTC)[reply]
- Bot again had an issue. Added some extra safety parameters to the code so it doesn't edit if there is an error. Should be uploaded soon. -- /MWOAP.alt|Notify Me\ 15:13, 2 June 2010 (UTC)[reply]
- The bot keeps getting caught in the page moves with the error: "Unknown Error {u'error': {u'info': u'The modification you tried to make was aborted by an extension hook', u'code': u'hookaborted'}}". What is this? -- /DeltaQuad|Notify Me\ 02:41, 4 June 2010 (UTC)[reply]
- Comment and question: I am not convinced that this bot is a good idea; there are very few misplaced submissions, and they are quite quickly dealt with manually. I think this is a case where manual intervention is preferable - a misplaced submission might need moving to AfC, or it might need other action - discussion with the user, etc. Sometimes they intended to make it live - I've come across such examples. So, to demonstrate the need, could you provide some statistics re. how many misplaced submissions there have been in the past few months, and how many should have / were relocated to AFC? (And please let me know; I don't manage to watchlist things very well) Chzz ► 18:56, 8 June 2010 (UTC)[reply]
- Right now we are receiving multiple userspace pages that need to be moved to the correct location, so I am working on fixing that. The bot is currently being renamed to DeltaQuadBot; I will update the links. I am working with the username code to get it to work now. I assume that edit filter hits can still count as edits. -- /DeltaQuad|Notify Me\ 20:50, 13 June 2010 (UTC)[reply]
- Hey MWOAP, it's been over a month. How's it going? Tim1357 talk 23:27, 16 July 2010 (UTC)[reply]
- Sorry, ya, it's been a while. I have had a move recently, sorry for the delay. Trying to get the bot to work, but it won't even launch right now. Will report back in a few days. -- /DeltaQuad|Notify Me\ 17:03, 21 July 2010 (UTC)[reply]
- This is funny, it now works :P. Anyway, it hit an edit conflict on the last one, so adding a sleep parameter now to avoid. -- /DeltaQuad|Notify Me\ 17:30, 21 July 2010 (UTC)[reply]
- Ok, that's good, any new updates for us? MBisanz talk 14:52, 28 July 2010 (UTC)[reply]
- I have added safety sections which check whether the file already exists in a certain spot, but that is not functioning for some reason. Still an ongoing battle. I'm also having a problem getting /*username*/ out of titles moved from the userspace. -- /DeltaQuad|Notify Me\ 15:07, 28 July 2010 (UTC)[reply]
I noticed the bot is making some dumb mistakes (e.g. [27] where the username is still included in the new title). I agree with Chzz that this bot is probably not needed as it is easy to cope with the small number of misplaced submissions manually. — Martin (MSGJ · talk) 08:10, 9 August 2010 (UTC)[reply]
- I am actually starting to agree with that. The code is so long now just to handle these things, it would need more testing in my userspace, and I think this is just too much work for the productivity that we are going to get. I think we can close this up now, and if I have any more ideas for a bot, I will come back. -- /DeltaQuad|Notify Me\ 11:46, 9 August 2010 (UTC)[reply]
- Withdrawn by operator. MBisanz talk 04:49, 10 August 2010 (UTC)[reply]
- The above discussion is preserved as an archive of the debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA.
Bots that have completed the trial period
- The following discussion is an archived debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA. The result of the discussion was Approved.
Operator: Nettrom (talk · contribs)
Automatic or Manually assisted: Automatic, supervised
Programming language(s): Perl, Python (with pywikibot), SQL
Source code available: Currently not
Function overview: Post suggestions to a limited number of new users in research project
Links to relevant discussions (where appropriate):
Edit period(s): daily, over a period of eight weeks
Estimated number of pages affected: 500
Exclusion compliant (Y/N): Y
Already has a bot flag (Y/N): Y
Function details: SuggestBot already posts suggestions to talk pages of users who have requested them. As part of a research study, we want to post suggestions to a limited number of new users in order to see if we can make them contribute more than they otherwise would. Through statistical analysis of en-WP data we have identified some criteria which we think identify the more prolific users, and we now want to test our hypothesis with an experiment. We will not post suggestions to users who have been warned or banned for vandalistic edits or similar actions. Unless the users in the experiment specifically ask for regular suggestions, they will only receive a one-time posting on their user talk page.
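As a rough illustration of the exclusion rule above (and of the warning-template check discussed later in this request), a hedged sketch. The marker strings are assumed examples only; the real selection criteria come from the study's statistical analysis, not from this check.

WARNING_MARKERS = ('uw-vandalism', 'uw-test', 'uw-spam', 'blocked')  # assumed example markers

def eligible_for_one_time_suggestion(talk_page_wikitext):
    """Sketch: return True only if no common warning marker appears on the user's talk page."""
    text = talk_page_wikitext.lower()
    return not any(marker in text for marker in WARNING_MARKERS)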
Discussion
I'd just like to point out that SuggestBot has already been the subject of two published research articles. The first is "SuggestBot: Using Intelligent Task Routing to Help People Find Work in Wikipedia" from 2007, and the second is "The Diffusion of a Task Recommendation System to Facilitate Contributions to an Online Community." from 2009. Nettrom (talk) 20:37, 11 June 2010 (UTC)[reply]
- This seems to be directly in line with the proposals at WP:Research. If nothing else, the people working on that proposal should be invited to participate in this BRFA. Please do so. Were I unaware of that proposal, I would say "go get strong consensus at WP:VPR, advertised at WP:CENT, WP:AN, and anywhere else you can think of, and come back when that's done", which IMO would be a rather higher hurdle. Anomie⚔ 21:32, 11 June 2010 (UTC)[reply]
- I have made a post to WP:Research and invited anyone interested to participate, thanks for the suggestion! The other option does sound like a slightly difficult hill to climb, yes. Nettrom (talk) 01:35, 12 June 2010 (UTC)[reply]
- Can you please provide some bibliographic details for those articles (direct links, preferably)? ElKevbo (talk) 22:13, 19 June 2010 (UTC)[reply]
- Sorry for not seeing this sooner. You can find an abstract of the first article through our research lab's home page, SuggestBot: (…) -- GroupLens Research, and also through ACM's Digital Library: [28] The second article was published in Journal of Computer-Mediated Communication, and an abstract should be available through [29]. Let me know if there's additional questions. Cheers, Nettrom (talk) 19:34, 25 June 2010 (UTC)[reply]
More importantly, I'd also like to point out that I have made a similar request on Norwegian Wikipedia, as we would like to run a parallel study across both language versions of Wikipedia, also using SuggestBot's Norwegian sister bot AnbefalingsBot. If you read Norwegian, the request and its discussion is here (if needed I can summarise it). Cross-language Wikipedia research like this is currently rare, which is one of the motivations for this study. Nettrom (talk) 13:26, 12 June 2010 (UTC)[reply]
- Does the bot care if the user has been previously welcomed or not? Note that some welcome messages are also warnings. Gigs (talk) 22:40, 13 June 2010 (UTC)[reply]
- In the process of selecting users, we will check for the common warning templates on their user talk page, and if any is present they'll be excluded from the study. Apart from that I've made a note to make manual checks of their user talk page before we post suggestions, as the number of users is small enough to allow for that. Are you also wondering if our suggestions will incorporate a welcoming message to the users? I've thought of that, and am considering having a group that gets it, and one that doesn't, to see if it has any effect, but the presence of other users and their welcoming messages might make it difficult, so I'm not totally sure. Nettrom (talk) 02:09, 14 June 2010 (UTC)[reply]
- Well, yes welcoming first might be good. Just be aware that some of the welcome templates are also warnings, like {{welcome-coi}}. Gigs (talk) 02:58, 14 June 2010 (UTC)[reply]
This request sounds reasonable and low-risk for editors. I have some general concerns about this process but I'll post them elsewhere so as not to derail this discussion. ElKevbo (talk) 22:13, 19 June 2010 (UTC)[reply]
- Approved for trial (50 edits or 5 days). Please provide a link to the relevant contributions and/or diffs when the trial is complete. MBisanz talk 02:40, 23 June 2010 (UTC)[reply]
- Since the whole task is only 200-250 pages, and suggest-bot is well established, I would suggest approving the whole task. That way, base-lining dates and so forth will be easier for the researchers. Rich Farmbrough, 07:02, 23 June 2010 (UTC).[reply]
- After having discussed the study again with my advisers, it appears we might have to scale up our sample size, perhaps doubling it. While that might appear to be a huge number of users, I'd like to note that it's roughly 1.7% of our identified population size, and about 1% of active registered users, so we'll be leaving most Wikipedians to themselves. I'll have more exact numbers after next week's meeting. I'd be happy to do a 50-user trial to demonstrate that we'll be targeting the right users and staying away from disruptive behaviour, though. Cheers, Nettrom (talk) 19:46, 25 June 2010 (UTC)[reply]
Update 2010-06-29: After having revisited our statistics and calculations for the size of the study, we have found it necessary to increase the sample to about 500 users, and to increase the experiment period to eight weeks (we originally planned for four). Revisiting the numbers I posted on June 25, the time extension cuts the percentages in half, so we'll be reaching less than one percent of either user base, meaning we're still leaving most Wikipedians alone.
Unless there are concerns raised, I will commence a 50-user trial as soon as possible. We're currently fixing a data gathering bug in SuggestBot and should have that resolved in a couple of days. Nettrom (talk) 18:13, 29 June 2010 (UTC)[reply]
Suggestions: One of the concerns raised about the process outlined in Wikipedia:Subject Recruitment Approvals Group and Wikipedia:Research was the potential annoyance factor of making uninvited requests for assistance to a volunteer group who are often already quite stretched. Using SuggestBot as a means of recruiting research subjects is a good idea, as people who receive SuggestBot messages have already opted in for suggestions on how they might further assist Wikipedia. I see this as a good way forward.
- As a means of recruiting more subjects, might it be possible for SuggestBot to also contact users who opt in for research?
- Additionally, could SuggestBot, as a one-off when the trial is successfully completed and the project is given the green light, include in its message to all users a link for people to opt in to future research projects? This would have the benefit of picking up users who read the talk pages of users who receive SuggestBot. SilkTork *YES! 17:50, 1 July 2010 (UTC)[reply]
- For this study we aim to discover if our suggestions have an effect, and have therefore made preparations to make sure that is the only thing we're looking for. Adding a message asking users to opt in will only make things more difficult for us, for instance by triggering undesired side effects, and therefore I do not wish to do that.
- SuggestBot is built with a specific purpose in mind, while the recruitment process you are describing is of a more general nature. I am not sure SuggestBot is the right bot to do that job, and would instead recommend discussing that kind of problem over on Wikipedia Talk:Research or Wikipedia Talk:Subject Recruitment Approvals Group. Cheers, Nettrom (talk) 20:13, 6 July 2010 (UTC)[reply]
Trial complete.
Update 2010-07-07: We've now completed the 50-user trial. See SuggestBot's contribution history, edits starting from 16:05 on July 2 to 11:23 on July 7, with the exception of User talk:Sillybillypiggy, User talk:Birdy god, and User talk:White Shadows, which were all regular requests for suggestions posted on User:SuggestBot/Requests and thus use a different template. Cheers, Nettrom (talk) 16:42, 7 July 2010 (UTC)[reply]
Update 2010-07-19: I'd appreciate some response on this request from a BAG member at this point. Before the trial started we also approached the Welcoming committee members to notify them of our intentions. Neither there nor here (nor from the WP:Research crowd) have we seen any strong opposition from the community, and if there are any issues that the BAG members think haven't been discussed or some actions we haven't taken, I'd appreciate getting to know about them so I can work on it. Cheers, Nettrom (talk) 15:48, 19 July 2010 (UTC)[reply]
- The following discussion is an archived debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA. The result of the discussion was Approved.
Operator: VernoWhitney (talk · contribs)
Automatic or Manually assisted: Automatic
Programming language(s): Python
Source code available: Not written yet
Function overview: Clerical tasks for Wikipedia:WikiProject Copyright Cleanup.
Links to relevant discussions (where appropriate):
Edit period(s): Daily
Estimated number of pages affected: Less than 10 per day
Exclusion compliant (Y/N): Y
Already has a bot flag (Y/N): N
Function details: I am interested in automating a decent variety of tasks focused on helping out with the copyright area, but they are all relatively minor:
- transclude new pages created by CorenSearchBot to WP:SCV (a rough sketch of this task follows the list)
- add {{adminbacklog}} and {{backlog}} tags to WP:CP and WP:SCV respectively if/when they get backed up
- act as a backup for DumbBOT and Zorglbot for their tasks at WP:CP if/when they are down (this involves moving transclusions and loading new daily pages)
- notify authors that their pages have been blanked (by {{subst:copyvio}}) in case they aren't notified by the taggers, so that the pages don't get relisted for an extra week without any action being taken on them
- add pages newly tagged with {{close paraphrase}} to the appropriate daily page at WP:CP
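For illustration, a hedged sketch of task 1 above, using the old pywikipedia ("compat") framework. The transclusion syntax and edit summary are assumptions based on the daily-page naming mentioned in the discussion below, not VWBot's actual code.

import datetime
import wikipedia  # old pywikipedia ("compat") framework

def transclude_todays_scv_listing():
    """Sketch: add today's CorenSearchBot daily page to the main WP:SCV page."""
    site = wikipedia.getSite()
    day = datetime.date.today().strftime('%Y-%m-%d')  # e.g. 2010-05-24
    scv = wikipedia.Page(site, 'Wikipedia:Suspected copyright violations')
    text = scv.get()
    transclusion = '{{Wikipedia:Suspected copyright violations/%s}}' % day
    if transclusion not in text:
        scv.put(text.rstrip() + '\n' + transclusion,
                comment='Transcluding the %s listing' % day)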
Discussion
If you haven't done so already, please notify WikiProject Copyright Cleanup of this BRfA, and request some input from them. Regarding function 1, I presume you mean pages tagged by the bot (not created)? Doesn't CorenSearchBot do this itself? If not, I feel that it should be something CSB does, rather than having another bot running around after it. Regarding task 4, how do you identify if the user has already been warned? Do you link to the nominator? Will you be using a custom notification message for the author? - Kingpin13 (talk) 12:27, 24 May 2010 (UTC)[reply]
- I am a member of Copyright Cleanup, and I've gone ahead and posted a link and request for input on the talk page. Regarding #1, I actually do mean created: it makes a new recording page every day (e.g., Wikipedia:Suspected copyright violations/2010-05-24), and for the past couple of months I've just been transcluding them into the main WP:SCV page (like this), since that's how they were doing it before I joined on. No, CSBot doesn't do it itself, and it could be something that it does, but DumbBOT already transcludes the page onto the daily WP:CP page (here), so I guess I just assumed it was something to be done by other bots, leaving CorenSearchBot to its main task of the actual searching and tagging.
- Regarding #4, this is obviously the most complicated part of the bot, and it may take some time for me to figure out the best practices and algorithm for it. After blanking, a page (or possibly just sections of the page) looks like this, which provides the convenient "{{subst:Nothanks-web|pg=American Council of Engineering Companies|url=http://www.acec.org/about/index.cfm}} ~~~~" code to notify the uploaders, which when placed looks like this. I believe the bot message should be largely the same, albeit with an additional note at the end that the message was left by a bot and instructions to contact me if the bot is in error or leaving a duplicate notice or the like. I hadn't considered the need to link to the nominator, but now that you mention it I probably should, in case the uploader has any questions about why the article/section was blanked. I was going to have it figure out which new articles were transcluding Template:copyviocore from the previous day and then check who uploaded the blanked material (when I do this by hand, unless it's a brand new article, I use WikiBlame/Revision history search to find when the offending text was added and identify the contributor, although I'm unsure if there's an easier way to do it with a bot). Once the contributor is found, it would check their talk page for a notification regarding the page in question and, if necessary, that day's talk page history to ensure they haven't received and deleted the notification.
- I will say that there are also some other still-clerical duties that didn't occur to me when filing (such as closing a CCI which is marked as completed, or backing up DumbBOT's other CP function of listing articles that are tagged {{copyvio}} but not listed at WP:CP), and I'm sure others that haven't occurred to me yet - would I need to file another BRfA for additional things of that nature? VernoWhitney (talk) 14:27, 24 May 2010 (UTC)[reply]
- I suppose it's a little late now, but should I have filed five separate BRfAs instead of this single one? VernoWhitney (talk) 17:50, 25 May 2010 (UTC)[reply]
- After reading about some problems with another bot which left notice templates, I'm thinking that completely automating #4 may not be a good idea, and instead it should probably just make a list in its own userspace (or a notation elsewhere, depending on where others at WP:COPYCLEAN would like it) for manual double-checking in case a personal notice was left. VernoWhitney (talk) 16:56, 26 May 2010 (UTC)[reply]
- Okay, so will it actually be waiting for CSBot to create the subpage, or will it just transclude the next subpage every day?
It's a good idea to include a link to the nominator, as this makes it easier for the user being warned and also reduces the number of users asking the bot for help. I wouldn't worry too much about CSDWarnBot; the main problems with that were the message, and that it wasn't actually waiting a set amount of time between the tagging of the article and the warning of the user. I've actually since created a bot which does the same task (User:SDPatrolBot II) and haven't had any serious complaints. It's not the task of warning users on behalf of others that there was a problem with, rather just the way in which that bot was carrying out the task. However, it's your call :).
It did occur to me it might be easier to have them as separate BRfAs, but don't worry too much about it; we can simply trial them separately (code permitting?), so it's not a large problem. As to future tasks, they would require further BRfAs, unless they are simply uncontroversial changes/fixes to the current tasks; if you're not sure, you can just ask me or another BAG member in the future. - Kingpin13 (talk) 17:45, 26 May 2010 (UTC)[reply]
- I was thinking of having it wait until CSBot created the subpage, just so it's not putting up a useless redlink, but it would work fine either way. I guess if the timing was the big problem with CSDWarnBot then I'll stick with my original idea for #4; since I'll only be running this once a day (and only a dozen articles are blanked on a busy day), the chance of stepping on someone's toes shouldn't be a large concern. As I haven't coded the bot yet (waiting to be sure I'm not wasting my time) I'll make sure they can be trialed separately, although I imagine they could be fairly drawn-out trials as only task #1 is guaranteed to run every day (assuming CSBot isn't down). VernoWhitney (talk) 18:31, 26 May 2010 (UTC)[reply]
- Regarding #4 notifications, WP:Bots/Requests for approval/Erwin85Bot 8 AfD notification is more comparable: it runs once a day for a 7-day process. Flatscan (talk) 04:05, 28 May 2010 (UTC)[reply]
- Not sure about having it wait for CSBot, since that would mean (presumably) that you would have to have it running continuously, repeatedly checking if the page has been created; if you're happy with that, then there's no problem (so long as it's waiting a reasonable amount of time between checks). Like you say, either way works. Also, will the bot need to remember pages it has previously listed, so it does not list them again? Hmm, well it would be best to have the bot check that the article was tagged at least 15 minutes or so before warning; if you're getting the nominator anyway, it shouldn't be too difficult to check the time of that revision? Okay, tasks #1 and #2 seem pretty straightforward, you may want to start on the code for those, and let me know once you're ready to trial them. Regarding #2, what will the number of open pages be before the backlogged template is added/removed? Also, you may want to come up with some drafts for the warning message to be used for task #4; if you want it to be used as an on-wiki template, let me know if you also want the template protected - Kingpin13 (talk) 18:47, 26 May 2010 (UTC)[reply]
- I'll just start with it transcluding whether the new daily page exists or not and worry about the timing later. For #1, the bot shouldn't have to keep track of pages at all, just to know what day it is, as there's no reason for them to be un-transcluded until the day is over and every entry has been dealt with. For #2, I was thinking 4 days of backlog warrants the tag (3 days can pile up with trickier cases or a weekend, but 4 is rarer). As far as a delay for #4, I was thinking of having it start its scanning of new tags around 00:00 UTC (since everything's broken down by day), but then wait to do at least some of the edits until around 00:20 UTC, once Zorglbot and DumbBOT should have completed their work (at 00:00 and 00:15 respectively), so I'll put notifications into that second batch too. I'll start working on a notification message in my sandbox. I'm not sure how much detail you're looking for, so feel free to stop me if I'm rambling or ask for more details if I'm not giving enough. VernoWhitney (talk) 19:43, 26 May 2010 (UTC)[reply]
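As an illustration of the "4 days" threshold discussed above, a hedged sketch; how open days are counted and where the tag is placed are assumptions made for the sketch, not the bot's actual logic.

import re
import wikipedia  # old pywikipedia ("compat") framework

def tag_scv_if_backlogged(threshold=4):
    """Sketch: add {{backlog}} to WP:SCV once the number of open daily listings reaches the threshold."""
    site = wikipedia.getSite()
    scv = wikipedia.Page(site, 'Wikipedia:Suspected copyright violations')
    text = scv.get()
    open_days = re.findall(r'\{\{Wikipedia:Suspected copyright violations/\d{4}-\d{2}-\d{2}\}\}', text)
    if len(open_days) >= threshold and '{{backlog}}' not in text:
        scv.put(u'{{backlog}}\n' + text,
                comment='%d daily listings open; tagging as backlogged' % len(open_days))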
As an admin who is part of WP:Copyclean, I think every task here, except perhaps for #2, is sorely needed. 2 is merely in the "nice to have" category as far as I'm concerned. MLauba (Talk) 16:33, 27 May 2010 (UTC)[reply]
- Ditto in all respects. I've added a more detailed note at Wikipedia talk:WikiProject Copyright Cleanup#Bot thoughts. --Moonriddengirl (talk) 17:04, 27 May 2010 (UTC)[reply]
Okay, I now have task #1 programmed and ready for trial. VernoWhitney (talk) 02:39, 9 June 2010 (UTC)[reply]
- And task #2 ready for trial. VernoWhitney (talk) 00:12, 11 June 2010 (UTC)[reply]
- Brilliant. Approved for trial (7 days). Please provide a link to the relevant contributions and/or diffs when the trial is complete. For tasks 1 and 2, that may not be long enough for the backlog, but lets see how it goes. - Kingpin13 (talk) 15:42, 11 June 2010 (UTC)[reply]
- For convenient timing, CorenSearchBot has been out of commission recently, so there will at least be a "backlog" at WP:SCV of links to non-existent pages. VernoWhitney (talk) 20:06, 11 June 2010 (UTC)[reply]
- Trial complete. 7-day trial completed for tasks #1 and #2. The first two days were run manually, after which I set up the last 5 for completely automatic. VernoWhitney (talk) 14:42, 18 June 2010 (UTC)[reply]
Brilliant, the trial all looked fairly straightforward. Firstly, apologies for the late reply; I've not had as much time for Wikipedia recently. Moving back to task #3, have you decided how to identify if the creator has been warned yet? Also, could you provide a link to your sandbox please? - Kingpin13 (talk) 07:38, 21 June 2010 (UTC)[reply]
- No worries. Since task #3 is just acting as a backup to other bot tasks, I'm assuming you're asking about #4 and I think I've figured out how to do it, but I'm still working on fully implementing it yet and seeing if it correctly determines who needs to be notified in a dry run. My sandbox is at User:VernoWhitney/Sandbox2. VernoWhitney (talk) 11:59, 21 June 2010 (UTC)[reply]
- Yeah, sorry about that, meant #4 (spotted that I had it wrong, and then forgot to fix it!) Okay, this is pretty much the same as SDPatrolBot II, and there doesn't seem to be much opposition from users. Let me know once you want to do a (user-space if possible) trial of task 4. If it is a user-space trial, would it be possible to list some pages which the bot thinks the creators' of have already been notified? Just so we get a larger number of pages to search through, if not that's fine, - Kingpin13 (talk) 13:44, 21 June 2010 (UTC)[reply]
- Yeah, as is it's currently getting a list of all of the blanked pages (see User:VWBot/Trial), having it just make a note for its conclusions as to whether they've been notified (or don't need to be in certain cases), won't be a problem. VernoWhitney (talk) 13:59, 21 June 2010 (UTC)[reply]
- Great, let's do a user space trial for this, probably for a few days. If you could, please list all the parameters the bot would put in the template. Oh by the way, I meant to mention that I like the look of the template you are using, comparing to the standard one. It's a bit long, but can't really be helped with a copyright notice. Also, if you need protection for a template, just ask me. Approved for trial (7 or so days, userspace only). Please provide a link to the relevant contributions and/or diffs when the trial is complete. (for task #4) - Kingpin13 (talk) 13:30, 23 June 2010 (UTC)[reply]
- Trial complete. 7-day userspace trial completed for task #4. I also have the code ready for task #5 and most of #3. #4 has been a work in progress, so I don't know if you want to see more of it or not, but it's been seven days. VernoWhitney (talk) 03:12, 30 June 2010 (UTC)[reply]
Section break
Alright, let's see what's going on here:
- Task 1: Overall, seems fine. I'm going to assume that this was a bug and you fixed it, caused by starting the bot at the wrong time? Perhaps the bot should check to make sure it isn't re-adding a day that's already listed. In any event, I don't suspect this should be a problem. Perhaps, if another user transcludes the date before the bot does, but I don't think that will happen.
- Task 2: Seems fine. The only reported backlog was a mistake, however, due to the pages not having been created by CSBot. I don't suspect that this will be a problem, though. Unless the bot goes down and it's adding pages that don't exist, there shouldn't be any trouble. A few other things: what happens if another user (unintentionally) removes the {{nobacklog}} template? Does the bot have a fail-safe? Again, probably not important. Also, any chance we can combine this and this into one edit? Not important, as before, and I'd suspect that it would be a little difficult to program, but would be a nice feature.
- Task 3: Not exactly sure how to trial this, maybe in the bot's userspace?
- Task 4: Hm, now... this is a little more interesting. Apparently you're going to go with a WONTFIX on the CCI bug? Alright. Assuming we skip that one, the rest of the results are mostly okay, but I'm still a little worried about trialing this outside of the userspace. For example, AB Aurigae from 29 June could be especially problematic (worse than not notifying a contributor) because it is notifying someone who wrote the article but did not introduce the violation, which would be very frustrating if it happened to me. How do you feel about this?
- Task 5: I think it's safe to try this one out. Approved for trial (5 days). Please provide a link to the relevant contributions and/or diffs when the trial is complete.
There you go. — The Earwig (talk) 17:03, 1 July 2010 (UTC)[reply]
- Okay, replying in order:
- Task 1: Yes, that edit was a mistake: I intended it to be another userspace test but didn't change the page-to-edit. I have since added a check for an existing transclusion.
- Task 2: That edit was actually working as intended, since there were 4 days transcluded. Yeah, it'll look odd if CSBot goes down again and nobody's removing empty dates, but really I was just letting them pile up so that I could see VWBot actually edit it. If someone removes the {{nobacklog}} tag, I have it set up to just add a new {{backlog}} tag before {{/doc}} since that shouldn't be removed. I'm also trying to find the right wording on some html comments to ask people to not remove the tag in the first place. It actually shouldn't be a problem to combine those edits, I just haven't yet.
- Task 3: I could set most of it up for Userspace just replacing WP:CP/... with User:VWBot/... . There is one CP task that DumbBOT is supposed to be doing that it isn't at the moment which is "list articles that are tagged {{copyvio}} but not listed". It worked at least through March, but it hasn't been recently and Tizio hasn't yet answered the question about it which I placed on his talk page a week ago, so that part at least could be done live.
- Task 4: Yeah, trying to track which CCIs have been recently closed and which subpages they may have and which articles they link to seems like a very messy project to me, so at least for now the workaround is to just not close CCIs for one day to let VWBot see the backlinks. As far as that particular mistagging goes, the copyvio is two sentences out of the 24 in the article (not including references and such) which makes it tricky. I already have the algorithm only looking inside the blanked portion of the article, so I think pagescraping the source(s) may be the only option to improve accuracy there. Any objection to me continuing this task in userspace so I can see if I can increase the accuracy?
- That's fine, feel free to continue testing it. — The Earwig (talk) 18:24, 1 July 2010 (UTC)[reply]
- Task 5: Great. Will start tonight. VernoWhitney (talk) 18:19, 1 July 2010 (UTC)[reply]
- Trial complete. 5-day trial completed for task #5. It didn't run every night (no new taggings some days); the first run began with a misstep, which was fixed; the second run missed the source parameter in the tag; but last night's run worked fine (I fixed my regex). Task #4 is still running in userspace, but it's not ready for the real world yet. VernoWhitney (talk) 13:01, 6 July 2010 (UTC)[reply]
- Feel free to just consider Task #4 withdrawn for now (although I'd like to revisit it in the future). It's going to take me a while to work out the kinks in it so I'll just keep running it in userspace so I can follow-up manually until I feel it's ready for a live trial. VernoWhitney (talk) 15:51, 8 July 2010 (UTC)[reply]
{{BAGAssistanceNeeded}} On the off-chance someone is looking at this, I'm now trialing #3 in userspace, since #1, 2 and 5 have already completed live trials and 4 has been withdrawn for now. VernoWhitney (talk) 13:02, 5 August 2010 (UTC)[reply]
- Verno, just want to say I'm really sorry about the slow progress of this. I'm now back from camping, and don't have further plans to be away much over the rest of summer, so please feel free to badger me as much as you like :D. I'll try and take a look through the trial edits for #5, and #3. Thanks, - Kingpin13 (talk) 08:57, 7 August 2010 (UTC)[reply]
- That's fine. I've been coding this from scratch, so it's taken me a while to get things up and running too. Since it's been making edits for a bunch of different tasks, let me know if you have trouble sorting them out. VernoWhitney (talk) 11:35, 7 August 2010 (UTC)[reply]
- Right, the trial edits for task five look fairly straightforward. Now task three looks a bit more complicated; perhaps you could just outline the different tasks it involves? And if possible, link to a userspace trial edit for each one? - Kingpin13 (talk) 05:14, 11 August 2010 (UTC)[reply]
- Task 3 is a backup for Zorglbot and DumbBOT for the tasks they've been approved for at WP:CP. As such I planned on running most of this task at 00:20 or later, since their edits should all be finished by 00:15. This total task includes:
- Creating a new daily page (Zorglbot) as at User:VWBot/2010 August 7 or, if one is already created but not populated with {{subst:Cppage}} (DumbBOT), populating it (a rough sketch of this step follows below). This second situation has been run once, here (the date's wrong because I ran it a day late) with the live version for comparison here.
- Updating WP:CP/NewListings (Zorglbot), moving in the new day and moving out the old day as here. This edit should be unnecessary, but must be done in order for Zorglbot to work correctly.
- Updating the main WP:CP page with the old day which can now be processed (Zorglbot) as here.
- Listing those articles which have been blanked but not listed on the just-finished daily page (DumbBOT). As DumbBOT is currently not doing this task (it's been broken since March or so if I recall correctly), I have been including this in my userspace listing alongside Task 5 as here so that I could copy/paste the edit by hand as here, where the first article listed is a newly blanked page and the 2nd-4th are Task 5's close paraphrases.
- I've been running all of this manually and tweaking it regularly, which means the timing is completely inconsistent and has led to a couple of errors, such as not creating User:VWBot/2010 August 11 on time (I didn't copy/paste that line of code into my python console) and this duplicate entry when my code to make sure that it only acts as a backup was broken. That particular bug has now been fixed. VernoWhitney (talk) 13:56, 11 August 2010 (UTC)[reply]
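For illustration, a rough sketch of the first sub-task above (creating the daily WP:CP page), against the old pywikipedia ("compat") framework. The title format follows the "2010 August 7" example; the edit summary is an assumption, and the populate-if-empty branch is omitted.

import datetime
import wikipedia  # old pywikipedia ("compat") framework

def ensure_daily_cp_page():
    """Sketch: create today's copyright problems daily page with {{subst:Cppage}} if it is missing."""
    site = wikipedia.getSite()
    today = datetime.date.today()
    title = 'Wikipedia:Copyright problems/%d %s %d' % (today.year, today.strftime('%B'), today.day)
    page = wikipedia.Page(site, title)
    if not page.exists():
        page.put(u'{{subst:Cppage}}', comment='Creating daily listing page (backup for Zorglbot)')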
- Great, thanks very much for sorting it like that, much easier to understand :). I feel that this is pretty much ready for approval, there's been a few problems in the trials, but everything seems to have been addressed, and if anything else pops up it can presumably be sorted on the job. So long as you're happy with it..? I think I'll mark it as approved. Cheers, - Kingpin13 (talk) 16:55, 16 August 2010 (UTC)[reply]
- Yeah, I think all of the bugs have been sorted out of the tasks that are currently up for approval (and I'll keep babysitting it just to make sure), so I'm comfortable running it live. VernoWhitney (talk) 17:03, 16 August 2010 (UTC)[reply]
Approved. Very well. Seems to be making good edits, has the support of the community, and a great op ;). Good luck :) - Kingpin13 (talk) 17:07, 16 August 2010 (UTC)[reply]
- The above discussion is preserved as an archive of the debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA.
- The following discussion is an archived debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA. The result of the discussion was Approved.
Operator: Xqt (talk · contribs)
Automatic or Manually assisted: automatic
Programming language(s): Python
Source code available: based on noreferences.py
Function overview: Goes over multiple pages of namespace:0 listed at Category:Pages with missing references list, searches for pages where <references /> is missing although a <ref> tag is present, and in that case adds a new references section. This also works if <ref>-tag is included via a template
Links to relevant discussions (where appropriate):
Edit period(s): daily
Estimated number of pages affected: > 9'000
Exclusion compliant (Y/N): Yes
Already has a bot flag (Y/N): Yes
Function details: This bot has been running on de-wiki for several months. In addition to the basic noreferences.py, it looks for well-known templates which contain a <ref> tag and checks whether this tag is already used on the actual page. After that it looks for one of the following sections in which to place the <references /> tag: References, Footnotes, Notes. If no such section exists, it places a References section before one of the following: Further reading, External links, See also, Notes.
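For illustration, a much-simplified sketch of the placement logic described above; the real implementation is pywikipedia's noreferences.py, and the regular expressions and fallback behaviour here are assumptions made for the sketch.

import re

REF_SECTIONS = ['References', 'Footnotes', 'Notes']  # place the tag inside one of these
PLACE_BEFORE = ['Further reading', 'External links', 'See also', 'Notes']  # or create a section before one of these

def add_references_tag(text):
    """Sketch: return the page text with a <references /> tag added as described above."""
    if re.search(r'<references\s*/?>', text, re.IGNORECASE):
        return text  # a references tag is already present
    for name in REF_SECTIONS:
        match = re.search(r'^==+\s*%s\s*==+\s*\n' % re.escape(name), text, re.IGNORECASE | re.MULTILINE)
        if match:
            return text[:match.end()] + '<references />\n' + text[match.end():]
    new_section = '== References ==\n<references />\n\n'
    for name in PLACE_BEFORE:
        match = re.search(r'^==+\s*%s\s*==+' % re.escape(name), text, re.IGNORECASE | re.MULTILINE)
        if match:
            return text[:match.start()] + new_section + text[match.start():]
    return text.rstrip() + '\n\n' + new_section  # fall back to the end of the page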
Discussion
- what makes this any different than say Wikipedia:Bots/Requests for approval/Mobius Bot 3? βcommand 21:41, 19 May 2010 (UTC)[reply]
- Xqbot uses the same behaviour as noreferences: it does not create a new section if one of the sections given above already exists; if it needs a new section, it places it in the right order before one of the sections described above, instead of always at the bottom of the text. It also recognizes all those templates which include a <references /> tag, not only {{reflist}}, and it places <references /> instead of the {{reflist}} template (except where the latter would be required). It is able to detect ref tags by the page content displayed on the screen instead of only one given template part, as Mobius Bot does, which is not a sure way of placing a ref tag. And lastly, it finds references tags in a wider variety of forms. Here are some samples of it doing this job on de-wiki. -Xqt (talk) 22:46, 19 May 2010 (UTC)[reply]
Approved for trial (30 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. Josh Parris 08:27, 20 May 2010 (UTC)[reply]
- Why did you approve this for trial without having the operator address the issues I'm raising? βcommand 17:18, 20 May 2010 (UTC)[reply]
- The operator is proposing using a standard pywikipedia bot; one trial doesn't preclude another, so if the discussion results in a customization we can run another trial, but if no customization is required the trial is out of the way allowing earlier approval. Basically, my view is that a small trial doesn't prejudice the outcome. Successful conclusion of the trial doesn't equate to bot approval. I didn't intend to disrespect or belittle your questions by way of this, and I hope the operator wouldn't interpret my behaviour as indicating your questions didn't demand answers; I certainly wouldn't close without them being addressed to your satisfaction. Josh Parris 06:57, 21 May 2010 (UTC)[reply]
- Instead you allow the bot to blindly damage wikipedia. before trialing a bot you should have all the major bugs worked out. That obviously is not the case here. As you saw in the other BRFA editing non-template caused errors often covers up more serious issues. βcommand 14:42, 21 May 2010 (UTC)[reply]
- This bot is not an anti-vandal fighter. A good solution would be waiting a delay time before a vulnerable page is changed. My script is based on the pwb script created nearly 3 years ago, and I have been running it on de-wiki for 1 1/2 years. This means I get reports if any behaviour is wrong and am able to fix it in the meantime. A trial period is also a period to test the behaviour and optimize it if needed. On the other hand, it would help me improve it if you could give me a sample of the cases where, in your opinion, errors are lurking. Thanks. -Xqt (talk) 19:42, 21 May 2010 (UTC)[reply]
- [30] [31] [32] [33] [34] [35] [36] and I can go on. βcommand 20:04, 21 May 2010 (UTC)[reply]
- This is a good point and a generic problem; any bot which repairs articles may be repairing an article that was damaged by vandalism that should have been reverted instead. Would an appropriate exclusion be articles with the last non-bot edit by a non-autoconfirmed user (ie, IP or <10 edits)? Josh Parris 07:15, 22 May 2010 (UTC)[reply]
BetaCommand, xqt, what is the state of play here? Josh Parris 02:18, 31 May 2010 (UTC)[reply]
- I will get to this soon. Sorry for the delay; I had some bigger projects in real life away from home. The proposed exclusion seems good enough to implement. -Xqt (talk) 15:47, 3 June 2010 (UTC)[reply]
- Here is a sample with 25 edits -Xqt (talk) 16:57, 3 June 2010 (UTC)[reply]
- Looks good Xqt, but I want some sort of response to Betacommand's concern before I approve it. Tim1357 talk
- There were some vandalized pages. Changing these pages would not confuse the <references /> tags, but it makes anti-vandal fighting more difficult and is not necessary. Therefore this bot just ignores IP edits and gives the RC patrol the chance to revert the last IP action. Also, this bot should not run at high frequency; imho one or two times daily is enough. This gives additional delay to react. During the testing phase I found this a sufficient measure for the given problems. The code is published as mentioned above and has been under code review for two weeks here. The remaining pages after a bot run, like [37][38][39][40], have to be fixed by hand. -Xqt (talk) 07:25, 15 June 2010 (UTC)[reply]
Why are you using <references /> over say {{reflist}}, which seems to be the pretty standard way these days? Peachey88 (Talk Page · Contribs) 08:20, 14 July 2010 (UTC)[reply]
- It's pretty trivial; they do the same thing and I have seen both used extensively. Betacommand, if you are alright with Xqt's response I'm ready to approve this. Tim1357 talk 23:35, 16 July 2010 (UTC)[reply]
- I'd rather change it from skipping IP edits to a time delay, say 12 hours, but other than that it's ok. ΔT The only constant 23:49, 16 July 2010 (UTC)[reply]
- Like User:Tim1357 says: <references /> does the same as the template, and the pywikibot framework has no option to substitute <references /> with a given template. But it is not a problem to change this behaviour. Requests should be made at the pywikibot framework request tracker (or at one of my talk pages). btw: the following templates which include a <references /> tag are recognized by the bot: [u'Reflist', u'Refs', u'FootnotesSmall', u'Reference', u'Ref-list', u'Reference list', u'References-small', u'Reflink', u'Footnotes', u'FootnotesSmall']. Xqt (talk) 17:42, 27 July 2010 (UTC)[reply]
- The above discussion is preserved as an archive of the debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA.
Approved requests
Bots that have been approved for operations after a successful BRFA will be listed here for informational purposes. No other approval action is required for these bots. Recently approved requests can be found here (edit), while old requests can be found in the archives.
- PrimeBOT (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 45) Approved 13:47, 29 May 2024 (UTC) (bot has flag)
- Numberguy6Bot (BRFA · contribs · actions log · block log · flag log · user rights) Approved 13:18, 26 May 2024 (UTC) (bot has flag)
- BsoykaBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 3) Approved 13:18, 26 May 2024 (UTC) (bot has flag)
- SDZeroBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 13) Approved 13:08, 26 May 2024 (UTC) (bot has flag)
- CopyPatrolBot (BRFA · contribs · actions log · block log · flag log · user rights) Approved 12:59, 18 May 2024 (UTC) (bot has flag)
- Qwerfjkl (bot) (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 29) Approved 11:35, 3 May 2024 (UTC) (bot has flag)
- ButlerBlogBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 4) Approved 11:35, 3 May 2024 (UTC) (bot has flag)
- PrimeBOT (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 42) Approved 11:39, 1 April 2024 (UTC) (bot has flag)
- PrimeBOT (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 43b) Approved 20:42, 31 March 2024 (UTC) (bot has flag)
- PrimeBOT (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 44) Approved 20:26, 31 March 2024 (UTC) (bot has flag)
- BattyBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 82) Approved 12:48, 30 March 2024 (UTC) (bot has flag)
- Qwerfjkl (bot) (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 28) Approved 07:32, 29 March 2024 (UTC) (bot has flag)
- Qwerfjkl (bot) (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 25) Approved 12:45, 13 March 2024 (UTC) (bot has flag)
- AnomieBOT (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 81) Approved 21:04, 10 March 2024 (UTC) (bot has flag)
- FrostlySnowman (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 11) Approved 09:40, 17 February 2024 (UTC) (bot has flag)
- AnomieBOT (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 82) Approved 19:58, 4 February 2024 (UTC) (bot has flag)
- PrimeBOT (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 43) Approved 13:45, 4 February 2024 (UTC) (bot has flag)
- SDZeroBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 12) Approved 16:43, 1 February 2024 (UTC) (bot has flag)
- The Sky Bot (BRFA · contribs · actions log · block log · flag log · user rights) Approved 13:02, 25 January 2024 (UTC) (bot has flag)
- DeadbeefBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 3) Approved 19:46, 23 January 2024 (UTC) (bot has flag)
- Qwerfjkl (bot) (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 26) Approved 14:24, 6 January 2024 (UTC) (bot has flag)
- BsoykaBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 2) Approved 13:35, 1 January 2024 (UTC) (bot has flag)
- ButlerBlogBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 3) Approved 13:35, 1 January 2024 (UTC) (bot has flag)
- Cewbot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 12) Approved 13:38, 31 December 2023 (UTC) (bot has flag)
- BattyBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 79) Approved 08:41, 31 December 2023 (UTC) (bot has flag)
- KiranBOT (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 8) Approved 16:54, 28 December 2023 (UTC) (bot has flag)
- BattyBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 80) Approved 14:13, 17 December 2023 (UTC) (bot has flag)
- ButlerBlogBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 2) Approved 14:13, 17 December 2023 (UTC) (bot has flag)
- Qwerfjkl (bot) (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 27) Approved 19:35, 14 December 2023 (UTC) (bot has flag)
- BaranBOT (BRFA · contribs · actions log · block log · flag log · user rights) Approved 13:06, 14 December 2023 (UTC) (bot has flag)
Denied requests
Bots that have been denied for operations will be listed here for informational purposes for at least 7 days before being archived. No other action is required for these bots. Older requests can be found in the Archive.
- SagaCookBot (BRFA · contribs · actions log · block log · flag log · user rights) Bot denied 16:33, 6 July 2010 (UTC)
- CountBot (BRFA · contribs · actions log · block log · flag log · user rights) Bot denied 00:27, 20 June 2010 (UTC)
- TotalDramaBot (BRFA · contribs · actions log · block log · flag log · user rights) Bot denied 21:49, 18 June 2010 (UTC)
- ReplyBot (BRFA · contribs · actions log · block log · flag log · user rights) Bot denied 21:23, 6 June 2010 (UTC)
- Infobox Bot (BRFA · contribs · actions log · block log · flag log · user rights) Bot denied 03:27, 29 May 2010 (UTC)
- TFAProtectorBot (BRFA · contribs · actions log · block log · flag log · user rights) Bot denied 21:43, 27 May 2010 (UTC)
- Fixatypobot (BRFA · contribs · actions log · block log · flag log · user rights) Bot denied 10:25, 15 May 2010 (UTC)
- ValhallaBot (BRFA · contribs · actions log · block log · flag log · user rights) Bot denied 20:40, 3 May 2010 (UTC)
- HBC AIV helperbot 10 (BRFA · contribs · actions log · block log · flag log · user rights) Bot denied 20:34, 3 May 2010 (UTC)
- ChinaRailwayENGED (BRFA · contribs · actions log · block log · flag log · user rights) Bot denied 08:47, 10 April 2010 (UTC)
- Velociraptorbot (BRFA · contribs · actions log · block log · flag log · user rights) Bot denied 00:02, 1 March 2010 (UTC)
- Seobot (BRFA · contribs · actions log · block log · flag log · user rights) Bot denied 03:31, 20 February 2010 (UTC)
- GeneGoBot (BRFA · contribs · actions log · block log · flag log · user rights) Bot denied 08:20, 23 January 2010 (UTC)
- Template Maintenance Bot (BRFA · contribs · actions log · block log · flag log · user rights) Bot denied 22:29, 28 December 2009 (UTC)
- IronBot (BRFA · contribs · actions log · block log · flag log · user rights) Bot denied 02:18, 19 December 2009 (UTC)
- Andrea105Bot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 5) Bot denied 03:54, 14 December 2009 (UTC)
- Andrea105Bot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 4) Bot denied 03:54, 14 December 2009 (UTC)
- Andrea105Bot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 2) Bot denied 03:54, 14 December 2009 (UTC)
- MisterWikiBot (2nd) (BRFA · contribs · actions log · block log · flag log · user rights) Bot denied 20:30, 4 December 2009 (UTC)
- MisterWikiBot (BRFA · contribs · actions log · block log · flag log · user rights) Bot denied 16:48, 4 December 2009 (UTC)
- EmBOTellado (BRFA · contribs · actions log · block log · flag log · user rights) Bot denied 03:48, 2 December 2009 (UTC)
Expired/withdrawn requests
These requests have either expired, as information required from the operator was not provided, or been withdrawn. These tasks are not authorized to run, but such lack of authorization does not necessarily follow from a finding as to merit. A bot that, having been approved for testing, was not tested by an editor, or one for which the results of testing were not posted, for example, would appear here. Bot requests should not be placed here if there is an active discussion ongoing above. Operators whose requests have expired may reactivate their requests at any time. The following list shows recent requests (if any) that have expired, listed here for informational purposes for at least 7 days before being archived. Older requests can be found in the respective archives: Expired, Withdrawn.
- DodoBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 2) Withdrawn by operator 16:39, 6 July 2010 (UTC)
- Mobius Bot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 3) Expired 01:47, 28 June 2010 (UTC)
- WildBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 7) Withdrawn by operator 20:33, 19 June 2010 (UTC)
- ChrisSalij Bot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 2) Withdrawn by operator 16:09, 18 June 2010 (UTC)
- DASHBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 13) Withdrawn by operator 04:58, 15 June 2010 (UTC)
- Mobius Bot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 2) Expired 15:48, 13 June 2010 (UTC)
- SpeakerBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 2) Withdrawn by operator 14:02, 13 June 2010 (UTC)
- CrimsonBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 3) Withdrawn by operator 01:13, 3 June 2010 (UTC)
- AushulzBot (BRFA · contribs · actions log · block log · flag log · user rights) Expired 02:16, 31 May 2010 (UTC)
- CMoonBot (BRFA · contribs · actions log · block log · flag log · user rights) Expired 01:15, 31 May 2010 (UTC)
- SpeakerBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 3) Withdrawn by operator 11:48, 30 May 2010 (UTC)
- SpeakerBot (BRFA · contribs · actions log · block log · flag log · user rights) Withdrawn by operator 03:25, 29 May 2010 (UTC)
- SmackBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: XXIV) Expired 11:11, 25 May 2010 (UTC)
- usyd-schwa-querybot (BRFA · contribs · actions log · block log · flag log · user rights) Expired 11:11, 25 May 2010 (UTC)
- AVBOT (BRFA · contribs · actions log · block log · flag log · user rights) Withdrawn by operator 12:19, 24 May 2010 (UTC)
- Ohms Law Bot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 2) Withdrawn by operator 21:14, 19 May 2010 (UTC)
- EditCountBot (BRFA · contribs · actions log · block log · flag log · user rights) Withdrawn by operator 20:18, 13 May 2010 (UTC)
- MPUploadBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 3) Withdrawn by operator 06:00, 10 May 2010 (UTC)
- OwensQueryBot (BRFA · contribs · actions log · block log · flag log · user rights) Withdrawn by operator 20:38, 7 May 2010 (UTC)
- DASHBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 9.5) Withdrawn by operator 00:51, 28 April 2010 (UTC)
- TorNodeBot (BRFA · contribs · actions log · block log · flag log · user rights) Withdrawn by operator 23:37, 25 April 2010 (UTC)
- DASHBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 8) Withdrawn by operator 19:51, 28 March 2010 (UTC)
- EarwigBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 11) Withdrawn by operator 17:37, 22 March 2010 (UTC)
- Full-date unlinking bot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 2) Withdrawn by operator 03:30, 9 March 2010 (UTC)
- Addbot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 22) Expired 03:20, 9 March 2010 (UTC)
- ContentCreationBOT (BRFA · contribs · actions log · block log · flag log · user rights) Expired 03:00, 9 March 2010 (UTC)
- MondalorBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 2) Withdrawn by operator 13:21, 1 March 2010 (UTC)
- Sarukebot (BRFA · contribs · actions log · block log · flag log · user rights) Withdrawn by operator 20:18, 11 February 2010 (UTC)
- SoxBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 20) Withdrawn by operator 06:11, 25 January 2010 (UTC)
- Redirectcreation Bot (BRFA · contribs · actions log · block log · flag log · user rights) Withdrawn by operator 02:32, 25 January 2010 (UTC)
- SvickBOT (BRFA · contribs · actions log · block log · flag log · user rights) Withdrawn by operator 01:28, 22 January 2010 (UTC)
- CobraBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 3) Withdrawn by operator 01:57, 20 January 2010 (UTC)
- MWOAPBot (BRFA · contribs · actions log · block log · flag log · user rights) Withdrawn by operator 21:56, 16 January 2010 (UTC)
- DASHBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 6) Withdrawn by operator 06:34, 8 January 2010 (UTC)
- MWOAPBot (BRFA · contribs · actions log · block log · flag log · user rights) Withdrawn by operator 08:50, 5 January 2010 (UTC)
- Coreva-Bot 2 (BRFA · contribs · actions log · block log · flag log · user rights) Expired 01:28, 3 January 2010 (UTC)
- SmackBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: XXII) Withdrawn by operator 21:43, 29 December 2009 (UTC)
- Chris G Bot 2 (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 2) Withdrawn by operator 11:00, 29 December 2009 (UTC)
- DASHBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 2) Withdrawn by operator 17:09, 22 December 2009 (UTC)
- WaybackBot (BRFA · contribs · actions log · block log · flag log · user rights) Withdrawn by operator 05:02, 10 December 2009 (UTC)
- DrilBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 5) Withdrawn by operator 04:50, 5 December 2009 (UTC)
- SmackBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: XXIII) Withdrawn by operator 14:23, 3 December 2009 (UTC)
- ActiveAdminBot (BRFA · contribs · actions log · block log · flag log · user rights) Withdrawn by operator 01:29, 2 December 2009 (UTC)