'''User:MaintenanceBot''' automatically adds missing details to pages. To activate it, simply create a page with an [[SMWObject]]. For maximum effect, add a {{t|Wikipedia}} parameter to point it to the corresponding [[Wikipedia]] page.

==Object class==
'''All Wikispooks pages must have an SMWObject'''. This requirement is fundamental for the MaintenanceBot to upgrade your page. The most common object type is [[person]], in which case write:
:{{t|<nowiki>{{Person}}</nowiki>}}
For a list of all possible objects, see [[SMWObject]].

==Wikipedia parameter==
MaintenanceBot cannot improve pages much unless they also have a {{t|{{!}}Wikipedia}} parameter (which may be abbreviated to "{{t|{{!}}WP}}").
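For example, a person page using this parameter might look as follows (the page name and URL here are hypothetical, for illustration only):

<pre>
{{Person
|Wikipedia=https://en.wikipedia.org/wiki/John_Smith
}}
</pre>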
===Foreign Wikipediae===
The Wikispooks working language is [[English]], and almost all Wikipedia links are to {{t|en.wikipedia.org}}. Less well-known subjects from non-English-speaking countries may have no ''English'' Wikipedia page, but do have one in their native language. In this case, choosing a foreign Wikipedia page (for example {{t|'''de'''.wikipedia.org}}) is acceptable. MaintenanceBot can scrape non-English pages, but with reduced effectiveness.
===Tagged URLs===
Sometimes, Wikipedia has no page for a subject, but does have a (sub-)section about it. In this case, a hashtagged URL should be used ("...wiki/page#section"), but MaintenanceBot does ''not'' use this for scraping.
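For example, a subject covered only in a section of a larger article would be linked like this (the page and section names are hypothetical):

<pre>
{{Person
|Wikipedia=https://en.wikipedia.org/wiki/Some_organisation#Founders
}}
</pre>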
==Further ideas and mistakes==
MaintenanceBot ''does'' make mistakes. Please report these, together with ideas, on the [[User_talk:MaintenanceBot|discussion page]].

==History==
* 2016-11-0? v1.3 - MaintenanceBot is now more reliable.
* 2016-02-07 v1.2 - MaintenanceBot is now active again.
* 2015-12-? - Broken at some point by a server change.
* 2015-07-12 v1.0 - First version of MaintenanceBot launched.

==Software requirements==
MaintenanceBot was written in PHP5 and requires the {{t|php-curl}} and {{t|php-xml}} packages.
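On a Debian-style system, the dependencies could be installed roughly as follows (a sketch only; exact package names vary by distribution and PHP version):

<pre>
sudo apt-get install php5-cli php5-curl php5-xml
php -m | grep -iE 'curl|xml'   # verify the extensions are loaded
</pre>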
[[Category:Wikispooks Bots]]
''Latest revision as of 17:32, 17 January 2019.''