Mediawiki-api, May 2009

mediawiki-api@lists.wikimedia.org
  • 23 participants
  • 13 discussions
prop=revisions for multiple users?
by Charlotte Webb 19 Jun '09

Could we add some way for this query to get the combined edits of two or more users on the same page? I know you can query several pages at once, but I'd want something like &users=Tom|Dick|Harry&titles=The_weather_in_London if possible.

Thanks.
—C.W.
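At present prop=revisions takes a single rvuser, so until a multi-user parameter like the proposed &users= exists, the combined history has to be fetched one user at a time and merged client-side. A minimal Python sketch of building those per-user requests (endpoint and parameter names as documented for prop=revisions; the user list and title are just the example values from the message, and nothing here is run against a live wiki):

```python
from urllib.parse import urlencode

API = "https://en.wikipedia.org/w/api.php"

def revisions_query(user, title):
    # prop=revisions accepts one rvuser at a time, so combining several
    # users means one request per user, merged client-side afterwards.
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvuser": user,
        "rvlimit": "max",
        "rvprop": "ids|timestamp|user",
        "format": "json",
    }
    return API + "?" + urlencode(params)

def per_user_queries(users, title):
    # users is a pipe-separated list, mirroring the proposed &users= syntax.
    return [revisions_query(u, title) for u in users.split("|")]

urls = per_user_queries("Tom|Dick|Harry", "The_weather_in_London")
```

The responses would then be merged and sorted by timestamp on the client.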

05 Jun '09
Hi all,

First post to the list. I've got a bunch of questions, and I hope this is the right place to ask them.

I'm interested in the idea of wiki 'mirroring': updating a second wiki ('B') periodically with content from wiki A. (There's of course some discussion of this on the web, so I'm aware that there's been quite a bit of thinking on this already, but I couldn't quite find the solution I was looking for.)

A first stab at mirroring would be to do a Special:Export on the whole of A, and then do a Special:Import on B. But this becomes impractical for larger wikis: ideally, I just want to update what needs updating.

The best way to do this would probably be something like list=recentchanges (going back to the date of the last transfer). Of course this doesn't work, because recentchanges are periodically purged, so they cannot be used between arbitrary dates. The log doesn't seem to record edits (is this correct?), so it can't be used to get a list of changes between two arbitrary dates.

So, question 1: Is it possible to get a list of all changes (including edits) between two dates (in a single query)?

If one wanted the complete version history, then another way to do this would be to get all revisions since the last transfer was made, i.e. something like:

    action=query&prop=revisions&revids=1450|1451|1452|...&rvprop=content

(then transform the XML to Special:Import format, and upload). Together with a query of the log, this would give you all changes.

But suppose the wiki is very active, or you don't have much bandwidth, or you simply don't want the whole version history, but just the latest versions (since the last transfer). The only way I can see is to do something like this:

  1. Fetch the list of namespaces
  2. Get the list of revisions in each namespace (action=query&prop=revisions&generator=allpages for each namespace)
  3. See what needs updating, and then fetch all the changed pages.

Question 2: Can you see a better way of doing this? Also, why won't generator=allpages work across namespaces? (I guess there may be a reason why that isn't possible to do easily.)

One way would be to try something like:

    action=query&prop=revisions&generator=allpages&rvstart=20090521000000

but this doesn't work.

So, my question 3: Do you know why this doesn't work? I assume there isn't an efficient MySQL query to accomplish this, or are there other reasons?

Finally, I guess I am wondering whether there are people actively interested in discussing issues around wiki mirroring/synchronisation more. If so, what's the best mailing list for this?

Sorry, the post got a bit longer than I expected - thanks for considering this!

All the best,
Bjoern
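The per-namespace loop in step 2 of that scheme can be sketched in Python. (gapnamespace/gaplimit are the generator-prefixed forms of allpages' parameters; the endpoint is a placeholder for wiki A, and this only builds the request URLs rather than talking to a live wiki.)

```python
from urllib.parse import urlencode

API = "https://wiki-a.example/w/api.php"  # placeholder for wiki A

def allpages_revisions_query(namespace):
    # generator=allpages walks one namespace at a time, which is why
    # the sync loop needs one request per namespace from step 1.
    params = {
        "action": "query",
        "generator": "allpages",
        "gapnamespace": namespace,
        "gaplimit": "max",
        "prop": "revisions",
        "rvprop": "ids|timestamp",
        "format": "json",
    }
    return API + "?" + urlencode(params)

def sync_queries(namespace_ids):
    # namespace_ids would come from a meta=siteinfo&siprop=namespaces query.
    return [allpages_revisions_query(ns) for ns in namespace_ids]

queries = sync_queries([0, 1, 2])
```

Comparing the returned timestamps against the date of the last transfer then yields the list of pages to re-fetch (step 3).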

31 May '09
Hi all,

I've just started on a MediaWiki extension that uses the MW API for bulk-editing articles. Unfortunately, for some reason, after making the call to edit the pages, instead of the browser seeing whatever I have written with $wgOut->addHTML(), the browser gets redirected to the last page that was edited.

This is the code I am using to perform the edit, where $p is a result returned from a previous Query API call (to get the list of pages that need editing):

    $req = new FauxRequest( array(
        'action' => 'edit',
        'bot' => true,
        'token' => $p['edittoken'],
        'title' => $p['title'],
        'summary' => $this->strSummary,
        'text' => $newContent,
        'basetimestamp' => $p['starttimestamp']
    ), true );
    $processor = new ApiMain( $req, true );
    $processor->execute();

If I comment out the execute() line then I see my summary Special page, but with execute() present I get pushed onto the last article edited instead (although every edit does go through successfully). The other API call (for querying the page content) works fine; it's only the Edit call that seems to exhibit this behaviour.

Does anyone have any idea what's going on here? I'm running MW 1.14.0.

Many thanks,
Adam.
exporting certain revisions of a page
by Randomcoder 20 May '09

Hello,

I'm trying to use Special:Export to export certain revisions of files. I've tried the code written on these pages:

    http://www.mediawiki.org/wiki/Manual_talk:Parameters_to_Special:Export
    http://www.mediawiki.org/wiki/Manual:Parameters_to_Special:Export

It seems I can only get either the most current one, or all of the revisions. I need to take exactly some particular set of consecutive revisions starting from a date or an id number. Is that possible?

I've also taken a look at the page here describing the API of MediaWiki (http://en.wikipedia.org/w/api.php):

    export - Export the current revisions of all given or generated pages
    exportnowrap - Return the export XML without wrapping it in an XML result (same format as Special:Export). Can only be used with export

This seems to again emphasize that it's not possible. Is this true, that it's not possible to get to some particular revisions inside an article through the API?

Thank you,
Stefan
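While the export parameter itself only covers current revisions, prop=revisions can walk a consecutive range on a single page using rvstartid (or rvstart with a timestamp) together with rvdir=newer and rvlimit. A hedged Python sketch of building such a request (parameter names as documented for prop=revisions; the title and revision id are made-up examples, and only the URL is constructed here):

```python
from urllib.parse import urlencode

API = "https://en.wikipedia.org/w/api.php"

def revision_range_query(title, start_id, limit=50):
    # rvstartid anchors the range at a particular revision id,
    # rvdir=newer walks forward in time from there, and rvlimit
    # caps how many consecutive revisions one request returns.
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvstartid": start_id,
        "rvdir": "newer",
        "rvlimit": limit,
        "rvprop": "ids|timestamp|content",
        "format": "xml",
    }
    return API + "?" + urlencode(params)

url = revision_range_query("Some_article", 12345, limit=10)
```

The resulting XML is not in Special:Export format, so a transform step would still be needed before any Special:Import.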
Help to get uncategory page
by Nan Li 19 May '09

Hi,

Following this page (http://en.wikipedia.org/w/api.php), I can't query uncategorized pages on MediaWiki. Could you give me any help or suggestion?

Thanks.
-Mark
Hope libmediawiki-api-perl gets into Debian
by jidanni@jidanni.org 14 May '09

I filed

    http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=527536
    Debian Bug report logs - #527536
    Request For Packaging: libmediawiki-api-perl -- replacement for libmediawiki-perl

in the hope that somebody puts it into Debian.

11 May '09
This might sound silly since I'm a newbie, but what would be the best way to call the API from a PHP script? This is what I did:

    <?php
    echo exec( "lynx -dump http://en.wikipedia.org/w/api.php?action=opensearch&search=Malaysia&format=…" );
    ?>

Any suggestions?

-- Rezuan Asrah
question = ( to ) ? be : ! be; -- Wm. Shakespeare
http://elusi.blogspot.com
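Shelling out to lynx also carries a quoting trap: the unescaped & characters in that URL make the shell split the exec command. In PHP itself, file_get_contents() or the cURL extension would avoid the subprocess entirely; as a language-neutral illustration of building the same request with proper escaping, a small Python sketch (only the URL is built here, no request is actually sent):

```python
from urllib.parse import urlencode

API = "https://en.wikipedia.org/w/api.php"

def opensearch_url(term):
    # urlencode escapes the search term and joins parameters with &,
    # so no shell is involved and there is nothing to mis-quote.
    return API + "?" + urlencode({
        "action": "opensearch",
        "search": term,
    })

url = opensearch_url("Malaysia")
```

The same idea applies in any language: let an HTTP/URL library assemble the query string instead of pasting it into a shell command.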
svn repository is not updated?
by Rezuan Asrah 10 May '09

Hello,

I tried to checkout the latest source code but the error said:

    svn: PROPFIND request failed on '/viewvc/mediawiki/trunk/phase3/includes/api'
    svn: PROPFIND of '/viewvc/mediawiki/trunk/phase3/includes/api': 301 Moved (http://svn.wikimedia.org)

This was the command:

    svn co http://svn.wikimedia.org/mediawiki/trunk/phase3/includes/api wiki_api

Any idea?

-- Rezuan Asrah
question = ( to ) ? be : ! be; -- Wm. Shakespeare
http://elusi.blogspot.com
svn repository is not updated?
by MinuteElectron 10 May '09


10 May '09
Why not allow arbitrary SQL queries on most of the database tables? Let's see, only a few, like the user table, have much confidential information, and even then only a few columns of it. So api.php could drop its read privileges for (parts of?) that table before running any queries.

Motivation example: the time comes when all websites should check for link lint. OK, so I need a list of external links that are present in my wikis.

    $ echo "SELECT DISTINCT el_to FROM wiki_externallinks ORDER BY el_to;" | mysql -B my_database

gets it all for me with one command.

Can api.php get all the external links, for all namespaces, all in one shot? Can Special:Linksearch get them all either, all in one shot?

The sysop could also customize what tables/columns to restrict, and how many rows to output. Also set the total row output limit too.

No need for only allowing SELECT, as api.php would first drop all privileges other than read-only ones, including the privilege to GRANT its privileges back to itself... No need to even filter against SQL injection attacks (but as I don't even know how to spell SQL, don't quote me on that.)

Anyway, being able to do arbitrary SQL would greatly simplify many api.php queries. Let's see, for the URL perhaps use:

    api.php?sql=SELECT+DISTINCT...

(maybe use no CAPS in the examples to "sell the ease of the idea".)
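For the external-links example specifically, the API already has a module that avoids raw SQL: list=exturlusage enumerates external links wiki-wide. A minimal Python sketch (euprop/eulimit as documented for the module; the endpoint is a placeholder, only the URL is built here, and a large wiki would still need to follow the continuation value in each response to get every row):

```python
from urllib.parse import urlencode

API = "https://my-wiki.example/w/api.php"  # placeholder wiki

def exturlusage_query():
    # list=exturlusage lists external links across the whole wiki,
    # covering the "all namespaces, one shot" case without SQL access.
    params = {
        "action": "query",
        "list": "exturlusage",
        "euprop": "title|url",
        "eulimit": "max",
        "format": "json",
    }
    return API + "?" + urlencode(params)

url = exturlusage_query()
```

This does not answer the general arbitrary-SQL proposal, but it shows the existing module-per-question approach the API takes instead.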