Talk:Pywikibot/Archive 1

{{MovedToMediaWiki|Manual talk:{{PAGENAME}}}}
{| style="margin:1em; padding:5em 1em; background:#000; color:#FFF;"
|-
| [[Image:Stop hand 2.svg|50px]]
| This page is for discussion of the associated page "[[{{PAGENAME}}|<span style="color:#AAF;">{{PAGENAME}}</span>]]" ''only''. Questions or comments regarding Pywikipedia itself should be posted to the [mailto:pywikipediabot-users@lists.sourceforge.net <span style="color:#AAF;">Pywikipedia mailing list</span>] ([http://sourceforge.net/mailarchive/forum.php?forum_id=36014 <span style="color:#AAF;">archives</span>]); bug reports should be posted to the [http://sourceforge.net/tracker/?atid=603138&group_id=93107&func=browse <span style="color:#AAF;">Pywikipedia bug tracking system</span>]. For any other comments related to Pywikipedia itself, see the [http://sourceforge.net/projects/pywikipediabot <span style="color:#AAF;">Pywikipedia sourceforge page</span>].
|}

== CVS diff ==
You recommend using "cvs diff". Surely you want to recommend "cvs diff -u"? This gives a much more robust diff format than the default (submitting patches in a format other than unidiff is a common newbie dev mistake). Subversion uses that format by default, and is nicer than CVS in lots of other ways, so you might want to move to that instead :) [[User:Hairy Dude|Hairy Dude]] 08:25, 7 March 2006 (UTC)
:pywikipediabot is a framework for which most of the users have installed CVS tools, and these users are not using CVS for anything else. It is bad enough that we require normal users to use developer tools to run the programs; migrating to Subversion will not happen any time soon! You are completely right that -u should be used to generate patches against CVS. [[User:Rob Hooft|Rob Hooft]] 12:01, 9 April 2006 (UTC)

== CVS instructions ==

I think the CVS instructions need updating. They don't seem to work; did the CVS repository move to another address? --[[User:129.21.121.171|129.21.121.171]] 18:33, 16 May 2006 (UTC)

:Ok, it seems to be working now. I guess the CVS server ''did'' move. --[[User:129.21.121.171|129.21.121.171]] 04:05, 23 May 2006 (UTC)

== Unicode problem ==

I am writing a bot to run on the Hindi Wikipedia, using the Python Wikipediabot framework. I am getting the error "UnicodeDecodeError: 'utf8' codec can't decode byte 0xff in position 0: unexpected code byte" when executing the following code:

<pre>
f = codecs.open('hi-towns.csv', 'r', config.textfile_encoding)
x = f.read().decode('utf8')
</pre>

Any help will be greatly appreciated. Thanks, [[User:Ganeshk|Ganeshk]] 03:54, 16 July 2006 (UTC)
:This was solved. - [[User:Ganeshk|Ganeshk]] 15:15, 16 July 2006 (UTC)
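The thread does not record what the fix was. Judging from the traceback, two likely culprits are the extra <code>.decode('utf8')</code> (<code>codecs.open</code> already returns decoded unicode strings) and the leading 0xff byte, which usually means the file is UTF-16 with a byte-order mark rather than UTF-8. A minimal, untested sketch assuming the latter:
<pre>
import codecs

# codecs.open already decodes, so no further .decode() call is needed.
# A 0xff byte at position 0 usually indicates a UTF-16 byte-order mark,
# so open the file with the matching encoding.
f = codecs.open('hi-towns.csv', 'r', 'utf-16')
x = f.read()
f.close()
</pre>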

== Not updated? ==

TortoiseCVS does not seem to work with Pywikipedia, <s>and the link http://sourceforge.net/cvs/?group_id=93107 seems to be dead</s>. - [[User:Soulkeeper|Soulkeeper]] 19:12, 1 September 2006 (UTC)

== python-xml ==

On another list it was mentioned that the python-xml libraries are needed as well. With Fedora Linux I got them using 'yum install PyXML'. Other systems probably vary. [[User:143.52.82.248|143.52.82.248]] 13:57, 1 November 2006 (UTC)

== doc included in framework ==

I extracted the documentation included in wikipedia.py with this:
<pre>
import wikipedia

# write the module docstring of wikipedia.py to a text file
manuel = open("manuel.txt", "w")
manuel.write(wikipedia.__doc__)
manuel.close()
</pre>


----

Maybe one can include this somewhere?

<pre>

Library to get and put pages on a MediaWiki.

Contents of the library (objects and functions to be used outside, situation
late August 2004)

Classes:
    Page: A MediaWiki page
        __init__ : Page(Site, Title) - the page with title Title on wikimedia site Site
        title : The name of the page, in a form suitable for an interwiki link
        urlname : The name of the page, in a form suitable for a URL
        titleWithoutNamespace : The name of the page, with the namespace part removed
        section : The section of the page (the part of the name after '#')
        sectionFreeTitle : The name without the section part
        aslink : The name of the page in the form [[Title]] or [[lang:Title]]
        site : The wiki this page is in
        encoding : The encoding of the page
        isAutoTitle : If the title is a well known, auto-translatable title
        autoFormat : Returns (dictName, value), where value can be a year, date, etc.,
            and dictName is 'YearBC', 'December', etc.
        isCategory : True if the page is a category, false otherwise
        isImage : True if the page is an image, false otherwise

        get (*) : The text of the page
        exists (*) : True if the page actually exists, false otherwise
        isRedirectPage (*) : True if the page is a redirect, false otherwise
        isEmpty (*) : True if the page has 4 characters or less content, not
            counting interwiki and category links
        interwiki (*) : The interwiki links from the page (list of Pages)
        categories (*) : The categories the page is in (list of Pages)
        linkedPages (*) : The normal pages linked from the page (list of Pages)
        imagelinks (*) : The pictures on the page (list of Pages)
        templates (*) : All templates referenced on the page (list of strings)
        getRedirectTarget (*) : The page the page redirects to
        isDisambig (*) : True if the page is a disambiguation page
        getReferences : List of pages linking to the page
        namespace : The namespace in which the page is
        permalink (*) : The url of the permalink of the current version
        move : Move the page to another title

        put(newtext) : Saves the page
        delete : Deletes the page (requires being logged in)

        (*) : This loads the page if it has not been loaded before; permalink might
            even reload it if it has been loaded before

    Site: a MediaWiki site
        messages : There are new messages on the site
        forceLogin(): Does not continue until the user has logged in to the site
        getUrl(): Retrieve an URL from the site

Special pages:
    Dynamic pages:
        allpages(): Special:Allpages
        newpages(): Special:Newpages
        longpages(): Special:Longpages
        shortpages(): Special:Shortpages
        categories(): Special:Categories

    Cached pages:
        deadendpages(): Special:Deadendpages
        ancientpages(): Special:Ancientpages
        lonelypages(): Special:Lonelypages
        uncategorizedcategories(): Special:Uncategorizedcategories
        uncategorizedpages(): Special:Uncategorizedpages
        unusedcategories(): Special:Unusedcategories

Other functions:
    getall(): Load pages via Special:Export
    setAction(text): Use 'text' instead of "Wikipedia python library" in
        edit summaries
    argHandler(text): Checks whether text is an argument defined on wikipedia.py
        (these are -family, -lang, -log and others)
    translate(xx, dict): dict is a dictionary, giving text depending on language,
        xx is a language. Returns the text in the most applicable language for
        the xx: wiki

    output(text): Prints the text 'text' in the encoding of the user's console.
    input(text): Asks input from the user, printing the text 'text' first.
    showDiff(oldtext, newtext): Prints the differences between oldtext and newtext
        on the screen

    getLanguageLinks(text,xx): get all interlanguage links in wikicode text 'text'
        in the form xx:pagename
    removeLanguageLinks(text): gives the wiki-code 'text' without any interlanguage
        links.
    replaceLanguageLinks(oldtext, new): in the wiki-code 'oldtext' remove the
        language links and replace them by the language links in new, a dictionary
        with the languages as keys and either Pages or titles as values
    getCategoryLinks(text,xx): get all category links in text 'text' (links in the
        form xx:pagename)
    removeCategoryLinks(text,xx): remove all category links in 'text'
    replaceCategoryLinks(oldtext,new): replace the category links in oldtext by
        those in new (new a list of category Pages)
    stopme(): Put this on a bot when it is not, or no longer, communicating with
        the Wiki. It will remove the bot from the list of running processes,
        and thus not slow down other bot threads anymore.

</pre>
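For illustration, here is a minimal, untested sketch of a typical edit using the interface listed above. It assumes a configured user-config.py; <code>getSite()</code> is not in the listing but is the usual helper in the framework for obtaining the default Site.
<pre>
# -*- coding: utf-8 -*-
import wikipedia

site = wikipedia.getSite()                  # default site from user-config.py
page = wikipedia.Page(site, u'Wikipedia:Sandbox')

if page.exists() and not page.isRedirectPage():
    old = page.get()                        # (*) loads the page
    new = old + u'\nTest edit by my bot.'
    wikipedia.showDiff(old, new)            # show what would change
    page.put(new, u'Bot: test edit')        # save with an edit summary

wikipedia.stopme()                          # unregister this bot process
</pre>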

== Error in article ==

''(To work in Python25 http://www.python.org/download/, the files must be downloaded into the Python25 folder).''
I have installed Python25 and downloaded pywikipedia from CVS today; the pywikipedia files are on a different disk from the Python25 folder, and it works. I conclude that what this line claims is false. I am not 100% sure, so I didn't edit it out myself. If anyone is 100% sure, please edit it out. [[User:Theroachman|Theroachman]] 23:47, 14 February 2007 (UTC)

:I've always thought this was false, because I think I have Python 2.5 on the other computer I used to run Pywikipediabot on. But I just verified: I installed Python 2.5 and altered my environment variable so that the system would use Python25 instead of Python24, and Pywikipediabot runs exactly the same as it did with Python 2.4, so this line must be false. I really don't get why anyone would have added this line; it's more unproductive than productive. IMHO you should never put frameworks like Pywikipediabot into the core folder of the programs running them.
:Also, for those who use TortoiseCVS, there is a bit of confusion with the CVSROOT for those who have never used it before; I'm going to tweak that too.
:Also, I have a good tip for Windows users that makes it less of a pain to cd into the bot's folder when it is kept in My Documents. [[User:64.180.238.162|64.180.238.162]]

== using special characters ==

I would like to use template.py, but I run into one problem.

template.py "Tableheader 100" "Tableheader|100%"

Notice the %-sign in the example; it gets lost. Any idea how it could be included?

Ok, found it: %% --[[User:GunterS|GunterS]] 13:05, 3 March 2007 (UTC)
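Presumably the working call then doubles the percent sign, i.e. something like:
<pre>
template.py "Tableheader 100" "Tableheader|100%%"
</pre>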

== Dry run ==

Does anyone know how to run a bot, record the edits that it would have made, but not actually make the edits? (Also known as a dry run.)
:Within this framework, you have to edit the code in wikipedia.py, under the Site class, in the postData method, so that the "con.send(data)" line prints data instead; you'll have to comment out the rest of that method as well. This may cause other problems, but it's the best place (as far as I know) to start. [[User:138.67.78.236|138.67.78.236]]
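A less invasive alternative, sketched here untested and using only names from the library listing above (plus <code>wikipedia.Error</code>, assumed here to be the framework's base exception class), is to override Page.put at the top of the bot script so that would-be edits are printed as diffs instead of being sent:
<pre>
import wikipedia

def dry_put(self, newtext, comment=None, **kwargs):
    # Show what would have been saved instead of sending it to the wiki.
    try:
        old = self.get()
    except wikipedia.Error:
        old = u''
    wikipedia.output(u'Would save %s (summary: %s)' % (self.aslink(), comment))
    wikipedia.showDiff(old, newtext)

wikipedia.Page.put = dry_put    # from now on, no edit reaches the wiki
</pre>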
