
Mediawiki-api, October 2008

mediawiki-api@lists.wikimedia.org
  • 26 participants
  • 21 discussions
API to access viewcount?
by Jim Tittsler 30 Apr '09

How can I access the 'viewcount' property of a page?
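A minimal sketch of the kind of query involved, assuming a MediaWiki version that still exposes the hit counter through prop=info (the counter field only appears when the wiki keeps page counters enabled, i.e. $wgDisableCounters is off, and it was removed from core in later releases; the endpoint is a placeholder):

import json
import urllib.parse
import urllib.request

API = "http://example.com/w/api.php"  # placeholder endpoint

params = {
    "action": "query",
    "prop": "info",
    "titles": "Main Page",
    "format": "json",
}
with urllib.request.urlopen(API + "?" + urllib.parse.urlencode(params)) as resp:
    data = json.load(resp)

for page in data["query"]["pages"].values():
    # "counter" is only present when the wiki has hit counters enabled
    print(page["title"], page.get("counter", "no view counter exposed"))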
API Implementation - Critique
by Mark Henderson 20 Feb '09

I need some help - I've implemented the API using CodeIgniter for a client's site and everything works on the site except for the wiki pages, which are sluggish.

Here is my controller, which lists the function calls being made and in what order: http://pastebin.com/m4021c1a7

And here is my model, where the actual API calls are made: http://pastebin.com/dd7c504c

The api.php file exists on the same server that is making the calls, so it should be instant, but the pages are very slow, and only when using the MW API.

Any help or insight anybody could give would be MUCH appreciated.

- Mark
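One common cause of this kind of slowness is making a separate HTTP round trip per page. Without the pasted code this is only a guess at the pattern, but a hedged sketch of batching several lookups into one request (in Python rather than the thread's PHP; endpoint and titles are placeholders):

import json
import urllib.parse
import urllib.request

API = "http://example.com/w/api.php"  # hypothetical same-server endpoint

def fetch_pages(titles):
    """Fetch revision content for up to 50 titles in a single request."""
    params = {
        "action": "query",
        "prop": "revisions",
        "rvprop": "content",
        "titles": "|".join(titles),  # the API accepts multiple titles joined by "|"
        "format": "json",
    }
    with urllib.request.urlopen(API + "?" + urllib.parse.urlencode(params)) as resp:
        return json.load(resp)["query"]["pages"]

# One request instead of three:
pages = fetch_pages(["Page_A", "Page_B", "Page_C"])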
List of all authors via API
by Johannes Beigel 01 Nov '08

We're heavily using the MediaWiki API in our open-source project mwlib (http://code.pediapress.com/), so first of all: thanks to you all for implementing this functionality in MediaWiki!

Maybe you're following the discussion initiated by Erik Möller on Foundation-l about appropriate attribution. As a consensus is yet to be found, we plan to include all authors (minus minor edits, minus bots) after each article in documents (PDFs, ODFs) rendered from article collections.

Currently we're using an API query with prop=revisions, requesting rvprop=user|ids|flags. Afterwards we filter out minor edits, anonymous/IP edits and bot edits (via regular expression on username and comment) and combine edits by the same author. To retrieve the data for all revisions of heavily edited articles (e.g. [[en:Physics]]), this requires lots of API requests with rvlimit=500.

Is there a way (or a plan to implement one) to retrieve the list of unique contributors for a given article (from a given revision down to the first one)? Ideally this would accept parameters for the mentioned filtering. I guess inside MediaWiki this can be handled very efficiently (using appropriate database queries) and would eliminate the need to transfer lots of redundant data over the socket.

-- Johannes Beigel
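A sketch of the pagination Johannes describes: walking all revisions with rvlimit=500 and filtering client-side. The bot check here is a crude placeholder, and the continuation handling assumes the query-continue format of this era:

import json
import re
import urllib.parse
import urllib.request

API = "http://en.wikipedia.org/w/api.php"
BOT_RE = re.compile(r"bot\b", re.IGNORECASE)  # crude stand-in for real bot detection

def unique_contributors(title):
    authors = set()
    params = {
        "action": "query",
        "prop": "revisions",
        "rvprop": "user|ids|flags",
        "rvlimit": "500",
        "titles": title,
        "format": "json",
    }
    while True:
        url = API + "?" + urllib.parse.urlencode(params)
        with urllib.request.urlopen(url) as resp:
            data = json.load(resp)
        for page in data["query"]["pages"].values():
            for rev in page.get("revisions", []):
                if "minor" in rev or "anon" in rev:  # flags appear as keys
                    continue
                if BOT_RE.search(rev.get("user", "")):
                    continue
                authors.add(rev["user"])
        cont = data.get("query-continue", {}).get("revisions")
        if not cont:
            return authors
        params.update(cont)  # e.g. rvstartid for the next batch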
Amazon File Repo
by Daniel Friesen 31 Oct '08

Before I decide to work on it sometime in the future: anyone else interested in creating a LocalFileRepo for Amazon's API?

Unless someone corrects me, the best method of dealing with Amazon S3 for storing images would be to make use of S3's API, rather than mounting buckets onto the filesystem. The former should be more reliable (^_^ trying to use a mountpoint will probably drive someone up the wall like NFS does for brion), and using the API should also be more reliable for handling multiple buckets, since as I recall the Amazon docs say that buckets can only hold up to 5 GB each.

Though considering the large things to deal with for multiple buckets, and the fact that the best methods will probably also involve some URL redirect handling to keep the standard URLs, it might be best as an extension rather than put into core.

-- ~Daniel Friesen (Dantman, Nadir-Seen-Fire)~
Profile/Portfolio: http://nadir-seen-fire.com
-The Nadir-Point Group (http://nadir-point.com)
--It's Wiki-Tools subgroup (http://wiki-tools.com)
--The ElectronicMe project (http://electronic-me.org)
-Wikia ACG on Wikia.com (http://wikia.com/wiki/Wikia_ACG)
--Animepedia (http://anime.wikia.com)
--Narutopedia (http://naruto.wikia.com)
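A sketch of the "talk to S3's API instead of mounting a bucket" idea, using the boto3 library purely for illustration (it postdates this thread, and the function names here are hypothetical, not part of any MediaWiki FileRepo interface):

import boto3

s3 = boto3.client("s3")

def store_file(bucket, key, local_path):
    # A FileRepo-style backend would upload via the API rather than
    # writing through an NFS/FUSE-style mountpoint.
    s3.upload_file(local_path, bucket, key)

def public_url(bucket, key):
    # Stock S3 URL; an extension could rewrite these to keep standard
    # MediaWiki-style image URLs, as Daniel suggests.
    return f"https://{bucket}.s3.amazonaws.com/{key}"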
Getting strange characters from action=parse
by Rainer Terhart 29 Oct '08

Hello,

I don't know if I am right here, but I got a strange parse response from the wiki parser today. Maybe someone can have a look at it. (I am from Germany, and I used the wiki's api.php to parse the entry "Baum".)

Here is the request I used: http://de.wikipedia.org/w/api.php?action=parse&prop=text&format=xml&page=ba…

This should be the last parsed text: "<p><span id="interwiki-he-fa" class="FA"></span></p>"

Actually, api.php adds more text at the end of the response: <p><a href="/w/index.php?title=Af:Boom&....................class="new" title="Zh-yue:&#27193; (Seite nicht vorhanden)">zh-yue:&#27193;</a></p>

In the browser, this is shown as very strange HTML text. Did I do something wrong? This only happens with page=Baum.

Greetings,
Raenaet
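One way to investigate, sketched below: request the parsed HTML and the interlanguage links as separate properties, which makes it easy to see whether language links leaked into the HTML body. The prop=text|langlinks combination is a standard action=parse feature, though the exact output shape may vary by version:

import json
import urllib.parse
import urllib.request

API = "http://de.wikipedia.org/w/api.php"

params = {
    "action": "parse",
    "page": "Baum",
    "prop": "text|langlinks",
    "format": "json",
}
with urllib.request.urlopen(API + "?" + urllib.parse.urlencode(params)) as resp:
    data = json.load(resp)["parse"]

html = data["text"]["*"]               # the rendered article HTML
langlinks = data.get("langlinks", [])  # e.g. [{"lang": "en", "*": "Tree"}, ...]
print(len(html), "chars of HTML,", len(langlinks), "language links")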
Hi all,

I haven't found an answer to this elsewhere, so I'm posing the question here. Is it possible to use a newer version of the MediaWiki API (perhaps by copying api.php from a newer version) with an older MW installation? I am on MW 1.12 and would like to use the edit feature introduced in MW 1.13, but without the effort of upgrading our entire wiki. Is this possible?

Thanks for your help.
Matthew
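Whatever the answer, it is worth checking what the target wiki actually supports before relying on action=edit. A sketch that reads the MediaWiki version from meta=siteinfo (the endpoint is a placeholder):

import json
import urllib.parse
import urllib.request

API = "http://example.com/w/api.php"  # hypothetical wiki

params = {"action": "query", "meta": "siteinfo", "format": "json"}
with urllib.request.urlopen(API + "?" + urllib.parse.urlencode(params)) as resp:
    info = json.load(resp)["query"]["general"]

print(info["generator"])  # e.g. "MediaWiki 1.12.0" -- no action=edit there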
As of r42471 [1], prop=revisions&rvprop=content will no longer throw an error when too many titles or revisions are specified, but will throw a warning and ignore the superfluous titles/revisions. The warning message is identical to the one issued when too many values are specified for the titles or revids parameter.

This change was made to fix bug 16074 [2], which occurred when a generator with gXXlimit=max (or any sufficiently high limit, really) was used to feed prop=revisions&rvprop=content, which would then throw an error because it was fed too many titles or revisions. However, the generator is not aware that prop=revisions threw away most of its results, and will set a query-continue as if this didn't happen. If this query-continue value is used by the client, a (potentially large) number of results will be skipped. When continuing such a request (i.e. one with a generator feeding prop=revisions&rvprop=content with a high or maximum limit), you have to set gXXlimit to a sufficiently low value first, so prop=revisions doesn't receive too many results and doesn't throw stuff away. The right number can be found in the text of the warning message, which is always something like "Too many values supplied for parameter 'titles': the limit is 50" (note that both 'titles' and the number 50 may vary).

Finally, it should be noted that this behavior can only occur with prop=revisions&rvprop=content and only when a generator is used to feed it. All other modules and all uses of prop=revisions not involving both rvprop=content and a generator are not affected.

Roan Kattouw (Catrope)

[1] http://www.mediawiki.org/wiki/Special:Code/MediaWiki/42471
[2] https://bugzilla.wikimedia.org/show_bug.cgi?id=16074
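A sketch of the client-side handling Roan describes: if the revisions module warns that it truncated the generator's output, restart with the limit quoted in the warning text instead of trusting query-continue. The assumption here is that the warning appears under a "revisions" key in the JSON output:

import json
import re
import urllib.parse
import urllib.request

API = "http://en.wikipedia.org/w/api.php"
LIMIT_RE = re.compile(r"the limit is (\d+)")

def fetch(params):
    url = API + "?" + urllib.parse.urlencode(params)
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

params = {
    "action": "query",
    "generator": "allpages",
    "gaplimit": "max",          # generator limit -- may overfeed prop=revisions
    "prop": "revisions",
    "rvprop": "content",
    "format": "json",
}
data = fetch(params)

warning = data.get("warnings", {}).get("revisions", {}).get("*", "")
m = LIMIT_RE.search(warning)
if m:
    # Restart with a generator limit the revisions module can swallow,
    # so no results are silently skipped on continuation.
    params["gaplimit"] = m.group(1)
    data = fetch(params)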
editing and timestamp confusion
by Jools Smyth 24 Oct '08

I am slightly confused about editing a page and avoiding a conflict using the API.

The documentation says: "To edit a page, an edit token is required. This token is the same for all pages, but changes at every login. If you want to protect against edit conflicts (which is wise), you also need to get the timestamp of the last revision. You can obtain these as follows:"

I had implemented this literally, so that on submitting the edit, I first get the timestamp of the very last revision and then submit my changes. But I assume I really need to: get the page contents and store the timestamp, make some changes to the page, and then, when I edit, pass my stored timestamp back to the API. Is this correct? If so, maybe the documentation would be better to say "you also need the timestamp of the revision your edits are based on"?

Best Regards,
Jools
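A sketch of the corrected flow, assuming the MW 1.13+ action=edit API: record the timestamp of the revision the edits are based on, and send that back as basetimestamp (endpoint, page, and token are placeholders; authentication is omitted):

import json
import urllib.parse
import urllib.request

API = "http://example.com/w/api.php"  # hypothetical wiki; auth omitted

# 1. Fetch the page content AND remember the revision timestamp.
params = {
    "action": "query",
    "prop": "revisions",
    "rvprop": "content|timestamp",
    "titles": "Sandbox",
    "format": "json",
}
with urllib.request.urlopen(API + "?" + urllib.parse.urlencode(params)) as resp:
    page = next(iter(json.load(resp)["query"]["pages"].values()))
rev = page["revisions"][0]
base_timestamp = rev["timestamp"]   # the revision our edit is based on
text = rev["*"] + "\n\nMy change."

# 2. ... time passes; someone else may edit the page ...

# 3. Submit the edit with basetimestamp from step 1 -- NOT a freshly
#    fetched timestamp, which would mask any intervening edit.
edit = urllib.parse.urlencode({
    "action": "edit",
    "title": "Sandbox",
    "text": text,
    "basetimestamp": base_timestamp,
    "token": "EDIT_TOKEN_HERE",  # placeholder; obtain via the token API
    "format": "json",
}).encode()
urllib.request.urlopen(API, data=edit)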
Accessing Template Fields
by Brendan Crosser-McGay 17 Oct '08

I looked through the API docs I found online, and I didn't see any straightforward way to access template fields on a page through any obvious means. As an example, I want to access a page, pull in values from a number of different pre-defined wiki-page-template fields. Is there any way to do this, or am I stuck doing regex on a big blob of text?

Thanks,
Brendan
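Absent a structured API for template parameters, the regex fallback Brendan mentions looks roughly like this; the template and field names are made up, and nested templates will break it:

import re

wikitext = """
{{Infobox person
| name   = Ada Lovelace
| born   = 1815
}}
"""

def template_fields(text, template):
    # Grab everything between {{Template ... }} (flat templates only).
    m = re.search(r"\{\{" + re.escape(template) + r"(.*?)\}\}", text, re.DOTALL)
    if not m:
        return {}
    fields = {}
    for line in m.group(1).splitlines():
        kv = re.match(r"\s*\|\s*([^=|]+?)\s*=\s*(.*)", line)
        if kv:
            fields[kv.group(1)] = kv.group(2).strip()
    return fields

print(template_fields(wikitext, "Infobox person"))
# {'name': 'Ada Lovelace', 'born': '1815'}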
Hi all,

How can I use longitude and latitude to get the nearby articles from the wiki?

-- Kind Regards, Neil
Skype: anim510
Twitter: anim510
Email: lvjiajun(a)nibirutech.com
Email: anim510(a)163.com
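No such query existed at the time of this thread; the GeoData extension later added list=geosearch, which does exactly this on wikis that install it. A sketch (coordinates are arbitrary):

import json
import urllib.parse
import urllib.request

API = "https://en.wikipedia.org/w/api.php"

params = {
    "action": "query",
    "list": "geosearch",
    "gscoord": "37.7891|-122.4015",  # lat|lon
    "gsradius": "1000",              # metres
    "gslimit": "10",
    "format": "json",
}
with urllib.request.urlopen(API + "?" + urllib.parse.urlencode(params)) as resp:
    for hit in json.load(resp)["query"]["geosearch"]:
        print(hit["title"], hit["dist"])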