
Mediawiki-api, February 2009

mediawiki-api@lists.wikimedia.org
  • 23 participants
  • 19 discussions
API to access viewcount?
by Jim Tittsler 30 Apr '09

30 Apr '09
How can I access the 'viewcount' property of a page?
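A minimal sketch of how this could have been requested at the time, assuming the inprop=viewcount option of prop=info (this option was tied to the page view counter that MediaWiki later removed in 1.25; the page title is only an example):

```python
from urllib.parse import urlencode

def viewcount_params(title):
    """Build api.php parameters to request a page's view counter.

    Assumes the (since-removed) inprop=viewcount option of prop=info.
    """
    return {
        "action": "query",
        "titles": title,
        "prop": "info",
        "inprop": "viewcount",
        "format": "json",
    }

# The resulting request URL (the network call itself is not shown):
url = "https://en.wikipedia.org/w/api.php?" + urlencode(viewcount_params("Main Page"))
```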

25 Feb '09
Action=parse can take multiple titles, and you can get other page metadata in addition to just HTML output. Not to mention you can bundle it into one request with action=parse|query.

-Chad

On Feb 23, 2009 3:14 PM, "Michael Dale" <mdale(a)wikimedia.org> wrote:

it would be really nice if we could get html output from the api query... this would avoid issuing dozens of action=parse requests separately.

It appears to be mentioned pretty regularly... does anyone know if a bug to that end has been filed... I will plop one in there if none exists (did not find any in my quick search)

--michael

Bryan Tong Minh wrote:
> On Fri, Feb 20, 2009 at 6:45 PM, marco tanzi <tanzi.marco(a)gmail.com> wrote:
>> I received a correct json object, but the content of the revision is
>> full of data I do not need like {{....}} [[...]] etc. I would like to
>> get only the clean description, only text (like the one visible from
>> the wiki website).
>
> Run the parsed text (action=parse) through an HTML parser that strips
> all the tags.
>
> Bryan

_______________________________________________
Mediawiki-api mailing list
Mediawiki-api(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-api
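As a sketch of Chad's suggestion, a single action=parse request can return the rendered HTML together with some page metadata in one round trip (the page name and prop values are illustrative):

```python
from urllib.parse import urlencode

def parse_params(page):
    """Build api.php parameters for action=parse, returning rendered
    HTML plus metadata (links, categories) in one request."""
    return {
        "action": "parse",
        "page": page,
        "prop": "text|links|categories",
        "format": "json",
    }

# The resulting request URL (the network call itself is not shown):
url = "https://en.wikipedia.org/w/api.php?" + urlencode(parse_params("San Francisco"))
```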
Geo Coordinates
by Julio Endara 24 Feb '09

24 Feb '09
Is it possible to use the API to query by geographical coordinates? E.g. articles near a given LONG/LAT? If so, could you point me to any examples?

Thanks,
Julio
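No geographic query existed in the core API at the time of this thread; as a hedged sketch, the list=geosearch module provided by the later GeoData extension answers exactly this kind of question (coordinates and radius below are illustrative):

```python
from urllib.parse import urlencode

def geosearch_params(lat, lon, radius_m=10000):
    """Build api.php parameters for list=geosearch (GeoData extension,
    added well after this thread): pages near a coordinate pair."""
    return {
        "action": "query",
        "list": "geosearch",
        "gscoord": f"{lat}|{lon}",
        "gsradius": radius_m,
        "format": "json",
    }

# The resulting request URL (the network call itself is not shown):
url = "https://en.wikipedia.org/w/api.php?" + urlencode(geosearch_params(37.7749, -122.4194))
```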
parse wikipedia
by marco tanzi 23 Feb '09

23 Feb '09
Hi folks,

I am trying to work with the Wikipedia API and I am having some little problems :-)

I can fetch the main description of the topic I am looking for using:
http://en.wikipedia.org/w/api.php?action=query&prop=revisions&rvprop=conten…

I received a correct JSON object, but the content of the revision is full of data I do not need, like {{....}}, [[...]], etc. I would like to get only the clean description, only text (like the one visible on the wiki website).

How can I do that? Is there some parser to clean my JSON object?

Hope someone could help me out!

Kind regards,
Marco
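Bryan's reply in the thread above suggests requesting rendered output (action=parse) and running it through an HTML parser that strips the tags; a minimal sketch of such a stripper with the Python standard library:

```python
from html.parser import HTMLParser

class TagStripper(HTMLParser):
    """Collect only the text content of an HTML fragment."""
    def __init__(self):
        super().__init__()
        self.parts = []

    def handle_data(self, data):
        self.parts.append(data)

def strip_tags(html):
    """Return the plain text of `html`, with all tags removed."""
    stripper = TagStripper()
    stripper.feed(html)
    return "".join(stripper.parts)
```

For example, `strip_tags('<p>The <b>U2</b> band</p>')` returns `'The U2 band'`.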

22 Feb '09
Hello!

I'm developing a new pywikipedia bot that will parse the lonely pages and images (especially images), but I don't trust the HTML code too much, so I prefer to use the API instead. Could you please tell me how to get a list of lonely images and/or pages? I've tried to find it on my own but I haven't succeeded. If it doesn't exist, is there someone so polite as to add this feature?

Thanks for the help,
Filnik
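For the record, later MediaWiki versions expose special-page reports through the list=querypage module; a hedged sketch (this module did not exist when the question was asked, and the qppage value follows the special page name, here Special:LonelyPages):

```python
from urllib.parse import urlencode

def lonelypages_params(limit=50):
    """Build api.php parameters for list=querypage with the
    Lonelypages report (orphaned pages)."""
    return {
        "action": "query",
        "list": "querypage",
        "qppage": "Lonelypages",
        "qplimit": limit,
        "format": "json",
    }

# The resulting request URL (the network call itself is not shown):
url = "https://en.wikipedia.org/w/api.php?" + urlencode(lonelypages_params())
```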
wikipedia - description and image
by marco tanzi 22 Feb '09

22 Feb '09
Hi guys,

I am writing a Ruby application to retrieve Wikipedia data: the main description and the main image (the one in the box on the left side). As a parameter I have the curid of the wiki page, so I call the wiki API to get the data; now the problems start:

- Main description:
I call the following link to retrieve the JSON object with the data of the main description:
http://en.wikipedia.org/w/api.php?action=query&pageids=52780&prop=revisions…
The object is well formed, but the text is in wikitext format. How is it possible to convert it into plain text (without {{ }}, [ ] and <ref>)? Is it possible to get plain text directly?

- Main image (if present):
My second problem is to find the right image to show after a search. I have tried to fetch the main image of a wiki page using the following link:
http://en.wikipedia.org/w/api.php?action=query&pageids=52780&prop=images&fo…
But the object I receive contains all the images of the page without specifying where these images are used. How is it possible to know exactly which image is used in the left box of the wiki page?

Can anyone help me?

Kind regards,
Marco
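The prop=images list indeed returns every image on the page with no notion of a "main" one. A hedged sketch of how this is answered on today's API, using prop=pageimages from the PageImages extension (which appeared years after this thread) to single out one representative image:

```python
from urllib.parse import urlencode

def main_image_params(pageid):
    """Build api.php parameters for prop=pageimages (PageImages
    extension): the page's single representative image."""
    return {
        "action": "query",
        "pageids": pageid,
        "prop": "pageimages",
        "piprop": "thumbnail|name",
        "format": "json",
    }

# The resulting request URL (the network call itself is not shown):
url = "https://en.wikipedia.org/w/api.php?" + urlencode(main_image_params(52780))
```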
Wikipedia - main description and image
by marco tanzi 20 Feb '09

20 Feb '09
Hi folks,

I would like to know if it is possible to retrieve from a Wikipedia page only the main description and, if it is available, the image. For example, I would like to get only the main description and the image of the U2 band from http://en.wikipedia.org/wiki/index.html?curid=52780 . How can I do this? I looked at the Wikipedia API (http://en.wikipedia.org/w/api.php ) but I haven't found anything that fits my needs. It would be great if there were a web service that returned an XML/JSON object with this data.

Hope to hear from you soon!

Regards,
Marco
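A hedged sketch of a single request that would answer both halves of this question on today's API, assuming the TextExtracts and PageImages extensions (neither existed in 2009): prop=extracts with exintro and explaintext returns only the lead section as plain text, and prop=pageimages adds a representative image:

```python
from urllib.parse import urlencode

def intro_and_image_params(pageid):
    """Build api.php parameters combining prop=extracts (plain-text
    lead section) and prop=pageimages (representative image)."""
    return {
        "action": "query",
        "pageids": pageid,
        "prop": "extracts|pageimages",
        "exintro": 1,
        "explaintext": 1,
        "format": "json",
    }

# The resulting request URL (the network call itself is not shown):
url = "https://en.wikipedia.org/w/api.php?" + urlencode(intro_and_image_params(52780))
```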
API Implementation - Critique
by Mark Henderson 20 Feb '09

20 Feb '09
I need some help. I've implemented the API using CodeIgniter for a client's site and everything works except for the wiki pages, which are sluggish.

Here is my controller, which lists the function calls being made and in what order:
http://pastebin.com/m4021c1a7

And here is my model, where the actual API calls are made:
http://pastebin.com/dd7c504c

The api.php file exists on the same server that is making the calls, so it should be fast, but the pages are very slow, and only when using the MW API.

Any help or insight anybody could give would be MUCH appreciated.

- Mark
generator exturlusage & prop extlinks
by Nicolas Dumazet 20 Feb '09

20 Feb '09
Hello!

Referring to
http://fr.wikipedia.org/w/api.php?action=query&generator=exturlusage&geuque…:

I would expect an extlinks element nested in each page element, since each page has at least a *.yu link... Here, only a single <page> element provides the extlinks element.

Am I missing something big here, or is this a nice bug?

Thanks,
-- Nicolas Dumazet — NicDumZ [ nɪk.d̪ymz ]
In r46845 [1], the issue raised in bug 11430 [2] a year and a half ago was finally addressed: when the API was asked to produce huge amounts of data (for instance the content of 500 revisions at 280 KB each), it would run out of memory trying to store and process it. To prevent this from happening, the amount of data the API can return is now limited.

This means that the behavior of requests that used to run out of memory has changed: they will return fewer results than the limit, even though there are more results available (they'll still set query-continue right, though). For instance, the aforementioned request would return about 300 revisions and set a query-continue for the rest.

Roan Kattouw (Catrope)

[1] http://www.mediawiki.org/wiki/Special:Code/MediaWiki/46845
[2] https://bugzilla.wikimedia.org/show_bug.cgi?id=11430

_______________________________________________
Mediawiki-api-announce mailing list
Mediawiki-api-announce(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-api-announce
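The client-side consequence of this change: since a request may now return fewer results than asked for, clients should always follow query-continue until it disappears rather than assume one request is complete. A sketch of that loop, using the old query-continue response shape described in the announcement; the `fetch` callable is an assumption standing in for a real HTTP request:

```python
def continued_query(fetch, params):
    """Yield each API response, feeding the query-continue values back
    into the request until the server stops returning them.

    `fetch` is any callable mapping a params dict to a decoded API
    response (a real client would wrap urllib or similar here).
    """
    params = dict(params)  # don't mutate the caller's dict
    while True:
        result = fetch(params)
        yield result
        cont = result.get("query-continue")
        if not cont:
            break
        # Merge each module's continuation parameters into the next request.
        for module_params in cont.values():
            params.update(module_params)
```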
