User:R. Hillgentleman/doc
doc included in framework
I extracted the documentation included in wikipedia.py with this:
import codecs
import wikipedia

# write the module docstring to manuel.txt (codecs handles any non-ASCII characters)
manuel = codecs.open("manuel.txt", "w", "utf-8")
manuel.write(wikipedia.__doc__)
manuel.close()
Maybe one could include this somewhere?
Library to get and put pages on a MediaWiki.

Contents of the library (objects and functions to be used outside, situation late August 2004):

Classes:
Page: A MediaWiki page
    __init__ : Page(Site, Title) - the page with title Title on MediaWiki site Site
    title : The name of the page, in a form suitable for an interwiki link
    urlname : The name of the page, in a form suitable for a URL
    titleWithoutNamespace : The name of the page, with the namespace part removed
    section : The section of the page (the part of the name after '#')
    sectionFreeTitle : The name without the section part
    aslink : The name of the page in the form [[Title]] or [[lang:Title]]
    site : The wiki this page is in
    encoding : The encoding of the page
    isAutoTitle : If the title is a well known, auto-translatable title
    autoFormat : Returns (dictName, value), where value can be a year, date, etc., and dictName is 'YearBC', 'December', etc.
    isCategory : True if the page is a category, false otherwise
    isImage : True if the page is an image, false otherwise
    get (*) : The text of the page
    exists (*) : True if the page actually exists, false otherwise
    isRedirectPage (*) : True if the page is a redirect, false otherwise
    isEmpty (*) : True if the page has 4 characters or less content, not counting interwiki and category links
    interwiki (*) : The interwiki links from the page (list of Pages)
    categories (*) : The categories the page is in (list of Pages)
    linkedPages (*) : The normal pages linked from the page (list of Pages)
    imagelinks (*) : The pictures on the page (list of Pages)
    templates (*) : All templates referenced on the page (list of strings)
    getRedirectTarget (*) : The page the page redirects to
    isDisambig (*) : True if the page is a disambiguation page
    getReferences : List of pages linking to the page
    namespace : The namespace in which the page is
    permalink (*) : The url of the permalink of the current version
    move : Move the page to another title
    put(newtext) : Saves the page
    delete : Deletes the page (requires being logged in)

    (*) : This loads the page if it has not been loaded before; permalink might even reload it if it has been loaded before

Site: a MediaWiki site
    messages : There are new messages on the site
    forceLogin() : Does not continue until the user has logged in to the site
    getUrl() : Retrieve an URL from the site
    Special pages:
        Dynamic pages:
            allpages() : Special:Allpages
            newpages() : Special:Newpages
            longpages() : Special:Longpages
            shortpages() : Special:Shortpages
            categories() : Special:Categories
        Cached pages:
            deadendpages() : Special:Deadendpages
            ancientpages() : Special:Ancientpages
            lonelypages() : Special:Lonelypages
            uncategorizedcategories() : Special:Uncategorizedcategories
            uncategorizedpages() : Special:Uncategorizedpages
            unusedcategories() : Special:Unusedcategories

Other functions:
    getall() : Load pages via Special:Export
    setAction(text) : Use 'text' instead of "Wikipedia python library" in edit summaries
    argHandler(text) : Checks whether text is an argument defined on wikipedia.py (these are -family, -lang, -log and others)
    translate(xx, dict) : dict is a dictionary, giving text depending on language, xx is a language. Returns the text in the most applicable language for the xx: wiki
    output(text) : Prints the text 'text' in the encoding of the user's console.
    input(text) : Asks input from the user, printing the text 'text' first.
    showDiff(oldtext, newtext) : Prints the differences between oldtext and newtext on the screen
    getLanguageLinks(text, xx) : get all interlanguage links in wikicode text 'text' (links in the form xx:pagename)
    removeLanguageLinks(text) : gives the wiki-code 'text' without any interlanguage links
    replaceLanguageLinks(oldtext, new) : in the wiki-code 'oldtext', remove the language links and replace them by the language links in new, a dictionary with the languages as keys and either Pages or titles as values
    getCategoryLinks(text, xx) : get all category links in text 'text' (links in the form xx:pagename)
    removeCategoryLinks(text, xx) : remove all category links in 'text'
    replaceCategoryLinks(oldtext, new) : replace the category links in oldtext by those in new (new a list of category Pages)
    stopme() : Put this on a bot when it is no longer communicating with the Wiki. It will remove the bot from the list of running processes, and thus not slow down other bot threads anymore.
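For orientation, here is a minimal sketch of how these pieces fit together in a small bot script. It assumes the 2004-era pywikipedia wikipedia module described above; wikipedia.getSite() (the default Site built from the user configuration) is not listed in the excerpt and is an assumption here, and the page title is only an example.

# -*- coding: utf-8 -*-
import wikipedia

try:
    # getSite() is an assumption: the default Site from the user configuration
    site = wikipedia.getSite()
    page = wikipedia.Page(site, u'Wikipedia:Sandbox')

    if page.exists() and not page.isRedirectPage():
        oldtext = page.get()  # (*) loads the page text on first access
        wikipedia.output(u'%s has %d interwiki links'
                         % (page.aslink(), len(page.interwiki())))

        # strip the interlanguage links and show what would change
        newtext = wikipedia.removeLanguageLinks(oldtext)
        wikipedia.showDiff(oldtext, newtext)

        wikipedia.setAction(u'demo: remove interlanguage links')
        page.put(newtext)  # saved with the edit summary set above
finally:
    wikipedia.stopme()  # unregister this bot from the list of running processes

The try/finally around stopme() follows the advice in the docstring: however the script ends, the bot is removed from the list of running processes so it does not slow down other bot threads.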