
Mediawiki-api September 2012

mediawiki-api@lists.wikimedia.org
  • 8 participants
  • 4 discussions

20 Jan '26
Hi there,

I'm using the API to extract the raw wiki text from my pages, using the "?action=query&titles=Main_Page&export&exportnowrap" syntax. That works perfectly.

Now I would like to get the templates expanded out in the result, so I use "?action=query&titles=Main_Page&prop=revisions&rvlimit=1&rvprop=content&rvexpandtemplates", which does the job as expected, but it also strips out the comments. My problem is that the comments are meaningful to me (I use them to help process the wiki text in subsequent steps).

Is there a way to expand templates with the API, but leave the comments intact?

Thanks,
Kevin
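One possible route (not confirmed in this thread, and version-dependent) is action=expandtemplates, which in later MediaWiki releases accepts an includecomments parameter that keeps HTML comments in the expanded output. A minimal sketch, assuming a current en.wikipedia.org endpoint and that the raw wikitext has already been fetched:

```python
import json
import urllib.parse
import urllib.request

# Example endpoint; any MediaWiki install exposes api.php the same way.
API = "https://en.wikipedia.org/w/api.php"

def build_expand_params(wikitext: str, title: str = "Main Page") -> dict:
    """Parameters for action=expandtemplates. In later MediaWiki
    versions, includecomments=1 asks the expander to keep
    <!-- ... --> comments in the expanded wikitext."""
    return {
        "action": "expandtemplates",
        "title": title,           # context page for {{PAGENAME}} etc.
        "text": wikitext,         # raw wikitext fetched beforehand
        "prop": "wikitext",
        "includecomments": "1",
        "format": "json",
    }

def expand(wikitext: str) -> str:
    url = API + "?" + urllib.parse.urlencode(build_expand_params(wikitext))
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    return data["expandtemplates"]["wikitext"]

if __name__ == "__main__":
    # The comment should survive expansion when includecomments is set.
    print(expand("<!-- keep me -->{{CURRENTYEAR}}"))
```

Whether includecomments exists on a given wiki depends on its MediaWiki version; the 2012-era API this thread discusses may not have had it.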
Need to extract abstract of a wikipedia page
by aditya srinivas 23 Nov '23

Hello,

I am writing a Java program to extract the abstract of a Wikipedia page given its title. I have done some research and found out that the abstract will be in rvsection=0. So, for example, if I want the abstract of the "Eiffel Tower" wiki page, I query the API in the following way:

http://en.wikipedia.org/w/api.php?action=query&prop=revisions&titles=Eiffel…

and parse the XML data we get back, taking the wikitext in the <rev xml:space="preserve"> tag, which represents the abstract of the page. But this wikitext also contains the infobox data, which I do not need. I would like to know if there is any way I can remove the infobox data and get only the wikitext related to the page's abstract, or if there is an alternative method by which I can get the abstract of the page directly.

Looking forward to your help. Thanks in advance,
Aditya Uppu
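One alternative (not mentioned in the thread) is the TextExtracts extension, which is installed on Wikipedia but is not part of core MediaWiki: prop=extracts with exintro returns only the lead section, already stripped of infoboxes and markup. A sketch, assuming the en.wikipedia.org endpoint:

```python
import json
import urllib.parse
import urllib.request

API = "https://en.wikipedia.org/w/api.php"

def build_abstract_params(title: str) -> dict:
    """prop=extracts comes from the TextExtracts extension (present on
    Wikipedia, but an install-time option on other wikis)."""
    return {
        "action": "query",
        "prop": "extracts",
        "exintro": "1",      # only the section before the first heading
        "explaintext": "1",  # plain text instead of HTML
        "titles": title,
        "format": "json",
    }

def fetch_abstract(title: str) -> str:
    url = API + "?" + urllib.parse.urlencode(build_abstract_params(title))
    with urllib.request.urlopen(url) as resp:
        pages = json.load(resp)["query"]["pages"]
    # "pages" is keyed by page ID; take the single entry.
    return next(iter(pages.values()))["extract"]

if __name__ == "__main__":
    print(fetch_abstract("Eiffel Tower")[:200])
```

If the wiki lacks TextExtracts, the fallback is the approach the poster describes: fetch rvsection=0 and strip the leading {{Infobox ...}} template from the wikitext yourself.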
Parse action - number of characters limit
by Łukasz Czyż 27 Sep '12

Hello,

I want to parse some variable-length wiki texts using the 'parse' action. My problem is that the API returns an error if the wiki text length exceeds some value, say 1000 characters. In such cases the API returns a well-formatted HTML page informing about an unexpected error and encouraging me to try again later. It doesn't look like expected behavior, rather like some fast-coded check. The API documentation doesn't say anything about an upper length limit.

Is there a way to deal with that? I wouldn't like a solution like "splitting the text into parts and parsing them separately", because it can be really problematic in some cases, e.g. with very long lists ('#' markers).

Łukasz
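The failure pattern described here is consistent with the request URL exceeding the web server's GET length limit (typically a few kilobytes) rather than any documented API limit. Sending the wikitext in the body of a POST request sidesteps that. A sketch with the Python stdlib, assuming a current en.wikipedia.org endpoint (parameter names reflect today's API, not necessarily the 2012 one):

```python
import json
import urllib.parse
import urllib.request

API = "https://en.wikipedia.org/w/api.php"

def build_parse_request(wikitext: str) -> urllib.request.Request:
    """Put the wikitext in the POST body so its length is not limited
    by the server's cap on URL length."""
    body = urllib.parse.urlencode({
        "action": "parse",
        "text": wikitext,
        "contentmodel": "wikitext",
        "format": "json",
    }).encode("utf-8")
    # Passing data= makes urllib issue a POST instead of a GET.
    return urllib.request.Request(API, data=body)

def parse(wikitext: str) -> str:
    with urllib.request.urlopen(build_parse_request(wikitext)) as resp:
        return json.load(resp)["parse"]["text"]["*"]

if __name__ == "__main__":
    # A long '#'-list that would blow past a GET URL limit.
    print(parse("# one\n# two\n" * 500)[:200])
```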
extracting external link portion of a wiki page
by Gunaratna, Dalkandura Arachchige Kalpa Shashika Silva 18 Sep '12

Hi,

I have been using the MediaWiki API to get the categories a page belongs to, using 'query' for 'action' and 'category' for 'prop'. Now I want to get the external link part of a page. For example, the Amoxicillin Wikipedia page http://en.wikipedia.org/wiki/Amoxicillin has a table-like structure in the external links section at the bottom of the page, with links to many other concepts related to Amoxicillin. I want to retrieve the links of those related concepts.

Currently, I am looking for a way to get these details. I am not sure yet whether I can do it through the API or whether I just have to process the page to get them. If there is a way to get these details through the MediaWiki API, please respond.

Thank you very much in advance.

Regards,
Kalpa
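For external URLs specifically, prop=extlinks should cover this. One caveat worth noting: the table-like boxes at the bottom of an article are usually navigation templates whose entries are internal wiki links, which prop=links returns instead. A sketch of the extlinks query, assuming the en.wikipedia.org endpoint:

```python
import json
import urllib.parse
import urllib.request

API = "https://en.wikipedia.org/w/api.php"

def build_extlinks_params(title: str) -> dict:
    """prop=extlinks lists every external URL on the page; swap in
    prop=links (plformat) to get internal links from navigation boxes."""
    return {
        "action": "query",
        "prop": "extlinks",
        "titles": title,
        "ellimit": "max",   # as many links per request as allowed
        "format": "json",
    }

def fetch_extlinks(title: str) -> list:
    url = API + "?" + urllib.parse.urlencode(build_extlinks_params(title))
    with urllib.request.urlopen(url) as resp:
        pages = json.load(resp)["query"]["pages"]
    page = next(iter(pages.values()))
    # Each entry is {"*": "<url>"}; a page with no external links
    # simply has no "extlinks" key.
    return [link["*"] for link in page.get("extlinks", [])]

if __name__ == "__main__":
    print(fetch_extlinks("Amoxicillin")[:5])
```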
