
Mediawiki-api March 2015

mediawiki-api@lists.wikimedia.org
  • 18 participants
  • 12 discussions
MediaWiki API in specific usage
by Support Technique 19 Feb '26

Hello,
I'm developing a Joomla! 3 plugin based on the MediaWiki API. The goal is to import all data (titles and descriptions or definitions) from a wiki database into a Joomla website database, because the Joomla! website will ultimately be used as a dictionary. Now, can you tell me how to proceed, please? Unfortunately, the available tutorials on MediaWiki are not very clear about this use case. Thanks for answering as soon as you can!
--
Logo Internis Group <http://www.internis-group.com/>
*Modeste EKAMBI*
Senior Software Engineer - Internis Group
Expert agency for web and IT projects - Custom development - Web and mobile development, applications and websites
Skype: internis.group
Phone: 699-789-772 / 679-998-516
Mail: support(a)internis-group.com
Blog: http://blog.internis-group.com
Site: http://www.internis-group.com <http://internis-group.com>
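
[Editor's note: for illustration, a minimal Python sketch (using the third-party requests library) of the kind of bulk export the MediaWiki web API supports. It walks every page with generator=allpages and pulls the plain-text intro via prop=extracts, which assumes the TextExtracts extension is installed on the source wiki. The Joomla-side import is left out, and the API URL is a placeholder.]

    import requests

    API = "https://example.org/w/api.php"  # placeholder: the source wiki's api.php

    def iter_pages():
        """Yield (title, intro text) for every page, following API continuation."""
        params = {
            "action": "query",
            "format": "json",
            "generator": "allpages",
            "gaplimit": 20,        # extracts serves at most 20 pages per request
            "prop": "extracts",
            "exintro": 1,          # lead section only
            "explaintext": 1,      # plain text instead of HTML
            "exlimit": "max",
        }
        while True:
            data = requests.get(API, params=params).json()
            for page in data.get("query", {}).get("pages", {}).values():
                yield page["title"], page.get("extract", "")
            if "continue" not in data:
                break
            params.update(data["continue"])

    for title, text in iter_pages():
        print(title, "->", text[:60])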

20 Jan '26
Hi there,
I'm using the API to extract the raw wikitext from my pages, using the "?action=query&titles=Main_Page&export&exportnowrap" syntax. That works perfectly. Now I would like to get the templates expanded in the result, so I use "?action=query&titles=Main_Page&prop=revisions&rvlimit=1&rvprop=content&rvexpandtemplates", which does the job, as expected, but it also strips out the comments. My problem is that the comments are meaningful to me (I use them to help process the wikitext in subsequent steps). Is there a way to expand templates with the API but leave the comments intact?
Thanks,
Kevin
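
[Editor's note: one avenue worth checking, assuming the wiki runs MediaWiki 1.24 or later: action=expandtemplates accepts an includecomments parameter. A Python sketch with the requests library, first fetching the raw wikitext and then expanding it with comments kept:]

    import requests

    API = "https://en.wikipedia.org/w/api.php"

    # Step 1: fetch the page's raw wikitext (comments still present).
    rev = requests.get(API, params={
        "action": "query", "format": "json",
        "titles": "Main_Page",
        "prop": "revisions", "rvprop": "content", "rvlimit": 1,
    }).json()
    page = next(iter(rev["query"]["pages"].values()))
    wikitext = page["revisions"][0]["*"]

    # Step 2: expand templates, asking the parser to keep HTML comments.
    exp = requests.post(API, data={
        "action": "expandtemplates", "format": "json",
        "text": wikitext,
        "title": "Main_Page",     # context title for {{PAGENAME}} and friends
        "prop": "wikitext",
        "includecomments": 1,     # keep <!-- ... --> in the output
    }).json()
    print(exp["expandtemplates"]["wikitext"][:200])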
Need to extract abstract of a wikipedia page
by aditya srinivas 23 Nov '23

Hello,
I am writing a Java program to extract the abstract of a Wikipedia page given its title. I have done some research and found out that the abstract will be in rvsection=0. So, for example, if I want the abstract of the 'Eiffel Tower' wiki page, I query the API in the following way: http://en.wikipedia.org/w/api.php?action=query&prop=revisions&titles=Eiffel… and parse the XML data we get back, taking the wikitext in the tag <rev xml:space="preserve">, which represents the abstract of the Wikipedia page. But this wikitext also contains the infobox data, which I do not need. I would like to know if there is any way I can remove the infobox data and get only the wikitext related to the page's abstract, or if there is an alternative method by which I can get the abstract of the page directly. Looking forward to your help.
Thanks in advance,
Aditya Uppu
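
[Editor's note: an illustrative alternative, assuming the TextExtracts extension (deployed on Wikipedia): prop=extracts with exintro returns the rendered lead section as plain text, with the infobox already stripped. A Python sketch with the requests library; the same query works from Java as a plain HTTP GET:]

    import requests

    resp = requests.get("https://en.wikipedia.org/w/api.php", params={
        "action": "query", "format": "json",
        "titles": "Eiffel Tower",
        "prop": "extracts",
        "exintro": 1,        # only the text before the first section heading
        "explaintext": 1,    # plain text, no HTML tags
    })
    page = next(iter(resp.json()["query"]["pages"].values()))
    print(page["extract"])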
When list=allusers is used with auactiveusers, a property 'recenteditcount' is returned in the result. In bug 67301 [1] it was pointed out that this property includes various other logged actions, and so should really be named something like "recentactions".

Gerrit change 130093, [2] merged today, adds the "recentactions" result property. "recenteditcount" is also returned for backwards compatibility, but will be removed at some point during the MediaWiki 1.25 development cycle.

Any clients using this property should be updated to use the new property name. The new property will be available on WMF wikis with 1.24wmf12; see https://www.mediawiki.org/wiki/MediaWiki_1.24/Roadmap for the schedule.

[1]: https://bugzilla.wikimedia.org/show_bug.cgi?id=67301
[2]: https://gerrit.wikimedia.org/r/#/c/130093/

--
Brad Jorsch (Anomie)
Software Engineer
Wikimedia Foundation
_______________________________________________
Mediawiki-api-announce mailing list
Mediawiki-api-announce(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-api-announce
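
[Editor's note: a minimal Python sketch (requests library) of the migration the announcement asks for: prefer the new recentactions property and fall back to recenteditcount on wikis not yet updated:]

    import requests

    resp = requests.get("https://en.wikipedia.org/w/api.php", params={
        "action": "query", "format": "json",
        "list": "allusers",
        "auactiveusers": 1,
        "aulimit": 10,
    })
    for user in resp.json()["query"]["allusers"]:
        # Prefer the new property; fall back for wikis still on older code.
        actions = user.get("recentactions", user.get("recenteditcount"))
        print(user["name"], actions)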
is it bug?
by ArtGiray . 29 Mar '15

I want to get a list of users that are "patroller" and "blocked".
My query: https://en.wikipedia.org/w/api.php?rawcontinue&action=query&list=allusers&augroup=patroller&aurights=block&aulimit=50&auprop=blockinfo
But the API returns users that are not blocked. Is aurights=block not working properly? Is it a bug in the API, or am I using it wrong?
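
[Editor's note: aurights selects users who hold a right (here, the right to block others), not users who are currently blocked, which would explain the result. If that reading is correct, blocked users can instead be identified client-side from the blockinfo fields. A Python sketch (requests library), keeping the group name from the query above:]

    import requests

    resp = requests.get("https://en.wikipedia.org/w/api.php", params={
        "action": "query", "format": "json",
        "list": "allusers",
        "augroup": "patroller",
        "auprop": "blockinfo",   # blocked accounts carry block* fields
        "aulimit": 50,
    })
    users = resp.json()["query"]["allusers"]
    blocked = [u for u in users if "blockid" in u]
    for user in blocked:
        print(user["name"], "blocked by", user["blockedby"])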
Hi,
Preface: I understand very little about how ResourceLoader works. I was trying to use the mw.Api module to do some API calls, following the general example of the documentation:

    var api = new mw.Api();
    api.get( {
        action: 'query',
        meta: 'userinfo'
    } ).done( function ( data ) {
        console.log( data );
    } );

However, I was getting an error in the JS console:

    TypeError: undefined is not a constructor (evaluating 'new mw.Api()')

For our dev server and on my local instance of MediaWiki (on a Mac), $wgResourceLoaderDebug is set to true so we can debug our individual JS files. For kicks, I set this value to false in my page URL (e.g. "http://mywiki/index.php/MyPage?debug=false"), allowing ResourceLoader to do its magic: the error went away, giving me the data I wanted. When I flipped it back to true, the error returned. Is this known behavior, a bug? Or could I have some other strange configuration which is causing this problem?
Thanks,
--
Jason Ji
jason.y.ji(a)gmail.com
Wikipedia API request time
by avi shavit 22 Mar '15

Hello, I am very new to the Wikimedia API and have only just started working with it. I am writing code to access a Wikipedia page via the API: retrieve the text with prop=extracts, retrieve wikitext via prop=revisions, and retrieve 7 links of some importance, for each of which I query the page image (pageimages) and its URL (imageinfo). I am performing these requests sequentially in accordance with the description given in the API docs. I have found that the requests, especially the first two for the plain text and wikitext, are taking much longer than expected to complete. Retrieving plain text takes around 700 milliseconds, and wikitext around 800-900 ms. Some of the problem is probably related to the response size, but querying for the image name and then the URL, which both return a single result, also takes around 600 milliseconds. This means that my code takes around 8-9 seconds to finish running, which is far too long. I will be combining requests with multiple titles; however, I wanted to know if there is any other possible solution.
Thanks!
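
[Editor's note: a Python sketch (requests library) of the batching the poster mentions: the query module accepts several prop modules and multiple pipe-separated titles in one request, collapsing most of the sequential round trips. Note that extracts caps such a batch at 20 pages per request:]

    import requests

    resp = requests.get("https://en.wikipedia.org/w/api.php", params={
        "action": "query", "format": "json",
        # Pipe-separated titles: one round trip instead of one per page.
        "titles": "Eiffel Tower|Louvre|Arc de Triomphe",
        "prop": "extracts|revisions|pageimages",
        "exintro": 1, "explaintext": 1, "exlimit": "max",
        "rvprop": "content",                  # latest revision's wikitext
        "piprop": "thumbnail", "pithumbsize": 300,
    })
    for page in resp.json()["query"]["pages"].values():
        thumb = page.get("thumbnail", {}).get("source")
        print(page["title"], len(page.get("extract", "")), thumb)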
Fwd: Api new user
by ArtGiray . 18 Mar '15

Hi,
How can I get "new users" from the Wikipedia API? What request should I write?
https://en.wikipedia.org/w/api.php
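
[Editor's note: one possible query (a sketch, not necessarily the only route): the log events list filtered to account creations. Python with the requests library:]

    import requests

    resp = requests.get("https://en.wikipedia.org/w/api.php", params={
        "action": "query", "format": "json",
        "list": "logevents",
        "letype": "newusers",   # account-creation log entries
        "lelimit": 10,
    })
    for event in resp.json()["query"]["logevents"]:
        # "title" is the user page of the newly created account.
        print(event["timestamp"], event["title"])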

15 Mar '15
Hello all,

I am happy to announce the beta release of the Wikimedia REST Content API at https://rest.wikimedia.org/

Each domain has its own API documentation, which is auto-generated from Swagger API specs. For example, here is the link for the English Wikipedia: https://rest.wikimedia.org/en.wikipedia.org/v1/?doc

At present, this API provides convenient and low-latency access to article HTML, page metadata and content conversions between HTML and wikitext. After extensive testing we are confident that these endpoints are ready for production use, but have marked them as 'unstable' until we have also validated this with production users. You can start writing applications that depend on it now, if you aren't afraid of possible minor changes before transitioning to 'stable' status. For the definition of the terms 'stable' and 'unstable' see https://www.mediawiki.org/wiki/API_versioning .

While general and not specific to VisualEditor, the selection of endpoints reflects this release's focus on speeding up VisualEditor. By storing private Parsoid round-trip information separately, we were able to reduce the HTML size by about 40%. This in turn reduces network transfer and processing times, which will make loading and saving with VisualEditor faster. We are also switching from a cache to actual storage, which will eliminate slow VisualEditor loads caused by cache misses. Other users of Parsoid HTML like Flow, HTML dumps, the OCG PDF renderer or Content translation will benefit similarly.

But we are not done yet. In the medium term, we plan to further reduce the HTML size by separating out all read-write metadata. This should allow us to use Parsoid HTML with its semantic markup <https://www.mediawiki.org/wiki/Parsoid/MediaWiki_DOM_spec> directly for both views and editing without increasing the HTML size over the current output. Combined with performance work in VisualEditor, this has the potential to make switching to visual editing instantaneous and free of any scrolling.

We are also investigating a sub-page-level edit API for micro-contributions and very fast VisualEditor saves. HTML saves don't necessarily have to wait for the page to re-render from wikitext, which means that we can potentially make them faster than wikitext saves. For this to work we'll need to minimize network transfer and processing time on both client and server.

More generally, this API is intended to be the beginning of a multi-purpose content API. Its implementation (RESTBase <http://www.mediawiki.org/wiki/RESTBase>) is driven by a declarative Swagger API specification, which helps to make it straightforward to extend the API with new entry points. The same API spec is also used to auto-generate the aforementioned sandbox environment, complete with handy "try it" buttons. So, please give it a try and let us know what you think!

This API is currently unmetered; we recommend that users not perform more than 200 requests per second and may implement limitations if necessary.

I also want to use this opportunity to thank all contributors who made this possible:

- Marko Obrovac, Eric Evans, James Douglas and Hardik Juneja on the Services team worked hard to build RESTBase, and to make it as extensible and clean as it is now.
- Filippo Giunchedi, Alex Kosiaris, Andrew Otto, Faidon Liambotis, Rob Halsell and Mark Bergsma helped to procure and set up the Cassandra storage cluster backing this API.
- The Parsoid team with Subbu Sastry, Arlo Breault, C. Scott Ananian and Marc Ordinas i Llopis is solving the extremely difficult task of converting between wikitext and HTML, and built a new API that lets us retrieve and pass in metadata separately.
- On the MediaWiki core team, Brad Jorsch quickly created a minimal authorization API that will let us support private wikis, and Aaron Schulz, Alex Monk and Ori Livneh built and extended the VirtualRestService that lets VisualEditor and MediaWiki in general easily access external services.

We welcome your feedback here: https://www.mediawiki.org/wiki/Talk:RESTBase - and in Phabricator <https://phabricator.wikimedia.org/maniphest/task/create/?projects=RESTBase&…:>.

Sincerely --

Gabriel Wicke
Principal Software Engineer, Wikimedia Foundation
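
[Editor's note: a minimal sketch of fetching article HTML from the new API, in Python with the requests library. The /page/html/{title} path follows the per-domain documentation linked above; consult that documentation for the authoritative route:]

    import requests

    url = "https://rest.wikimedia.org/en.wikipedia.org/v1/page/html/Eiffel_Tower"
    resp = requests.get(url)
    resp.raise_for_status()
    print(resp.text[:200])   # Parsoid HTML with semantic markup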

10 Mar '15
From: shiyue zhang [mailto:byryuer@gmail.com]
Sent: 10 March 2015 22:04
To: mediawiki-api(a)lists.wikimedia.org
Subject: why I can't see deleted revision information?

Hello, I'm a new user of Wikipedia. My username is Yuer3677. I'm a senior student at BUPT, China. Since I am working on my graduating thesis, which is related to Wikipedia, I really need to get the deleted revision information. I tried to use action=query and list=deletedrevs, but the result is:

    <?xml version="1.0"?>
    <api servedby="mw1192">
      <error code="drvpermissiondenied" info="You don't have permission to view deleted revision information" xml:space="preserve">See https://en.wikipedia.org/w/api.php for API usage</error>
    </api>

So, I want to know why I can't view the deleted revision information. Since I really need the data to do my research, what can I do to get the information? Please help me. Thanks a lot!
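
[Editor's note: a Python sketch (requests library) reproducing the failure mode: list=deletedrevs is restricted to accounts holding the 'deletedhistory' right (administrators on Wikimedia wikis), so an unauthenticated request surfaces the drvpermissiondenied error quoted above:]

    import requests

    resp = requests.get("https://en.wikipedia.org/w/api.php", params={
        "action": "query", "format": "json",
        "list": "deletedrevs",
        "titles": "Main Page",
    })
    data = resp.json()
    if "error" in data:
        # Without the 'deletedhistory' right this is the expected outcome.
        print(data["error"]["code"], "-", data["error"]["info"])
    else:
        print(data["query"]["deletedrevs"])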