
Mediawiki-api, February 2016

mediawiki-api@lists.wikimedia.org
  • 5 participants
  • 10 discussions
MediaWiki API in specific usage
by Support Technique 19 Feb '26

Hello,

I'm developing a Joomla! 3 plugin based on the MediaWiki API. The goal is to import all data (titles and descriptions or definitions) from a wiki database into a Joomla! website database, because the Joomla! site will ultimately be used as a dictionary. Can you tell me how to proceed, please? Unfortunately, the available MediaWiki tutorials are not very clear on this usage situation. Thanks for answering as soon as you can!

--
*Modeste EKAMBI*
Senior Software Engineer - Internis Group
Expert agency for web and IT projects - Custom development - Web and mobile development of applications and websites
Skype: internis.group
Phone: 699-789-772 / 679-998-516
Mail: support(a)internis-group.com
Blog: http://blog.internis-group.com
Site: http://www.internis-group.com
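One common way to do this kind of import (an assumption on my part, not something the thread confirms) is to page through titles with the action API's allpages generator and pull each page's intro text via the TextExtracts extension, then insert the rows into the target database. A minimal Python sketch of the request-building and response-parsing steps; the function names are hypothetical:

```python
# Sketch: build action=query requests that page through all articles and
# collect their intro text, for import into another database.
# Assumes the source wiki has the TextExtracts extension installed.

def build_allpages_query(gap_continue=None, limit=50):
    """Parameters for one page of titles plus intro extracts."""
    params = {
        "action": "query",
        "format": "json",
        "generator": "allpages",
        "gaplimit": limit,
        "prop": "extracts",
        "exintro": 1,        # intro section only
        "explaintext": 1,    # plain text, no HTML
    }
    if gap_continue:
        params["gapcontinue"] = gap_continue  # continuation from prior response
    return params

def rows_from_response(response):
    """Turn one API response into (title, extract) rows for a DB insert."""
    pages = response.get("query", {}).get("pages", {})
    return [(p["title"], p.get("extract", "")) for p in pages.values()]
```

Each response carries a `continue` block whose `gapcontinue` value feeds the next call until the whole wiki has been walked.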

20 Jan '26
Hi there,

I'm using the API to extract the raw wiki text from my pages, using the "?action=query&titles=Main_Page&export&exportnowrap" syntax. That works perfectly. Now I would like to get the templates expanded out in the result, so I use "?action=query&titles=Main_Page&prop=revisions&rvlimit=1&rvprop=content&rvexpandtemplates", which does the job, as expected, but it also strips out the comments. My problem is that the comments are meaningful to me (I use them to help process the wiki text in subsequent steps).

Is there a way to expand templates with the API, but leave the comments intact?

Thanks,
Kevin
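One avenue worth trying (my suggestion, not something confirmed in the thread): the dedicated action=expandtemplates module accepts an includecomments flag that keeps HTML comments in the expanded output, unlike rvexpandtemplates. A sketch of the request parameters:

```python
# Sketch: expand templates while keeping HTML comments, using
# action=expandtemplates instead of rvexpandtemplates.
# The includecomments flag is the key difference (based on the documented
# expandtemplates parameters; verify against your wiki's api.php help).

def build_expand_params(wikitext):
    return {
        "action": "expandtemplates",
        "format": "json",
        "prop": "wikitext",
        "text": wikitext,
        "includecomments": 1,  # keep <!-- ... --> in the result
    }
```

The page's raw wikitext (fetched with the query you already have) would be passed as the `text` parameter.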
Need to extract abstract of a wikipedia page
by aditya srinivas 23 Nov '23

Hello,

I am writing a Java program to extract the abstract of a Wikipedia page given its title. I have done some research and found out that the abstract will be in rvsection=0. So, for example, if I want the abstract of the "Eiffel Tower" wiki page, I query the API in the following way: http://en.wikipedia.org/w/api.php?action=query&prop=revisions&titles=Eiffel… and parse the XML data we get back, taking the wikitext in the <rev xml:space="preserve"> tag, which represents the abstract of the page. But this wikitext also contains the infobox data, which I do not need. I would like to know if there is any way I can remove the infobox data and get only the wikitext related to the page's abstract, or any alternative method by which I can get the abstract of the page directly.

Looking forward to your help. Thanks in advance,
Aditya Uppu
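Rather than stripping the infobox out of raw wikitext, the TextExtracts extension (deployed on Wikipedia) can return just the plain-text intro directly. A sketch, in Python for brevity even though the thread's program is in Java; the helper names are mine:

```python
# Sketch: fetch only the plain-text intro ("abstract") of a page with the
# TextExtracts extension, avoiding infobox wikitext entirely.

def build_abstract_params(title):
    return {
        "action": "query",
        "format": "json",
        "prop": "extracts",
        "exintro": 1,       # first section only, same scope as rvsection=0
        "explaintext": 1,   # strip markup, so no infobox or templates
        "titles": title,
    }

def abstract_from_response(response):
    """Pull the extract text out of the (single-page) query response."""
    pages = response.get("query", {}).get("pages", {})
    for page in pages.values():
        return page.get("extract", "")
    return ""
```

The same parameters work identically from Java with any HTTP client, since it is just a GET against api.php.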
When list=allusers is used with auactiveusers, a property 'recenteditcount' is returned in the result. In bug 67301 [1] it was pointed out that this property is including various other logged actions, and so should really be named something like "recentactions".

Gerrit change 130093, [2] merged today, adds the "recentactions" result property. "recenteditcount" is also returned for backwards compatibility, but will be removed at some point during the MediaWiki 1.25 development cycle.

Any clients using this property should be updated to use the new property name. The new property will be available on WMF wikis with 1.24wmf12; see https://www.mediawiki.org/wiki/MediaWiki_1.24/Roadmap for the schedule.

[1]: https://bugzilla.wikimedia.org/show_bug.cgi?id=67301
[2]: https://gerrit.wikimedia.org/r/#/c/130093/

--
Brad Jorsch (Anomie)
Software Engineer
Wikimedia Foundation
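During the transition window a client can stay compatible with wikis on either side of the rename by preferring the new key and falling back to the deprecated one. A small sketch:

```python
# Sketch: read the activity count from one allusers/auactiveusers result row,
# preferring the new "recentactions" key and falling back to the deprecated
# "recenteditcount" on wikis that have not been updated yet.

def recent_actions(user_row):
    if "recentactions" in user_row:
        return user_row["recentactions"]
    return user_row.get("recenteditcount", 0)
```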

18 Apr '16
We have decided to officially retire the rest.wikimedia.org domain in favor of /api/rest_v1/ at each individual project domain. For example,

https://rest.wikimedia.org/en.wikipedia.org/v1/?doc

becomes

https://en.wikipedia.org/api/rest_v1/?doc

Most clients already use the new path, and benefit from better performance from geo-distributed caching, no additional DNS lookups, and sharing of TLS / HTTP2 connections. We intend to shut down the rest.wikimedia.org entry point around March, so please adjust your clients to use /api/rest_v1/ soon.

Thank you for your cooperation,
Gabriel

--
Gabriel Wicke
Principal Engineer, Wikimedia Foundation
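The migration is mechanical: the first path segment of the legacy URL is the project domain, and the `/{domain}/v1/` prefix becomes `https://{domain}/api/rest_v1/`. A sketch of a rewrite helper a client could run over its configured endpoints (the function name is mine):

```python
from urllib.parse import urlsplit

# Sketch: rewrite a legacy rest.wikimedia.org URL to the per-project
# /api/rest_v1/ form announced above. URLs already in the new form pass
# through unchanged.

def rewrite_rest_url(url):
    parts = urlsplit(url)
    if parts.netloc != "rest.wikimedia.org":
        return url  # already per-project
    # Legacy path shape: /{project-domain}/v1/{rest-of-path}
    segments = parts.path.lstrip("/").split("/", 2)
    project = segments[0]
    rest = segments[2] if len(segments) > 2 else ""
    query = f"?{parts.query}" if parts.query else ""
    return f"https://{project}/api/rest_v1/{rest}{query}"
```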
Hi,

we are considering a policy for REST API end point result format versioning and negotiation. The background and considerations are spelled out in a task and mw.org page:

https://phabricator.wikimedia.org/T124365
https://www.mediawiki.org/wiki/Talk:API_versioning

Based on the discussion so far, we have come up with the following candidate solution:

1) Clearly advise clients to explicitly request the expected mime type with an Accept header. Support older mime types (with on-the-fly transformations) until usage has fallen below a very low percentage, with an explicit sunset announcement.

2) Always return the latest content type if no explicit Accept header was specified.

We are interested in hearing your thoughts on this. Once we have reached rough consensus on the way forward, we intend to apply the newly minted policy to an evolution of the Parsoid HTML format, which will move the data-mw attribute to a separate metadata blob.

Gabriel Wicke
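In practice, option 1 means a client pins a format version by sending an Accept header carrying a versioned profile parameter on the media type. A sketch of what that request side could look like; the profile URI and version number here are illustrative placeholders, not a published value:

```python
# Sketch: pin an expected HTML format version via the Accept header, per
# candidate policy (1) above. The profile URI/version is a placeholder.

PROFILE = "https://www.mediawiki.org/wiki/Specs/HTML/1.2.0"

def accept_headers(profile=PROFILE):
    """Headers for a request that pins a specific result format."""
    return {
        "Accept": f'text/html; charset=utf-8; profile="{profile}"',
    }
```

A client that omits the header would get the latest format, per candidate policy (2).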
Bulk data loading
by william peñaloza 22 Feb '16

Good morning,

My question is whether MediaWiki supports bulk loading of data, and how to do it from an external database without having to edit each page on the wiki itself, as is usually done. Thanks for your attention; I will be awaiting your reply.

--
[Standard MISENA/SENA email disclaimer omitted.]
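One common route for this (my suggestion; the list did not confirm an approach) is to read rows from the external database and push each one through the action=edit module using a bot account's CSRF token. A sketch of the per-row request; the token acquisition (action=query&meta=tokens) is not shown:

```python
# Sketch: build one action=edit request per row of an external database,
# for a bulk import driven by a bot account. The CSRF token must come from
# a prior action=query&meta=tokens call (not shown here).

def build_edit_params(title, text, csrf_token):
    return {
        "action": "edit",
        "format": "json",
        "title": title,
        "text": text,
        "bot": 1,             # mark as a bot edit if the account has the right
        "token": csrf_token,  # CSRF token; conventionally sent last
    }
```

For very large imports, MediaWiki's server-side maintenance scripts (e.g. XML import) may be more appropriate than the web API, since they bypass per-request overhead.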
I've created a wiki page providing an overview of the objectives, current status, and obstacles for the experimental Wiktionary popups, which you can find here: https://www.mediawiki.org/wiki/Wikimedia_Apps/Wiktionary_definition_popups_…

-m.

On Sun, Feb 14, 2016 at 2:08 PM, Dmitry Brant <dbrant(a)wikimedia.org> wrote:
> Hi Nemo,
>
> As Gabriel notes, the Wiktionary endpoint is still very much experimental, and is subject to change. One of the ongoing goals for the Android app is to integrate more rich content into the browsing experience. One such feature is to allow the user to highlight words in an article and see a quick popup definition of the word from Wiktionary [1]. To facilitate this action, we set up a RESTBase endpoint for fetching the desired term from Wiktionary [2].
>
> This feature is currently only available in the Wikipedia Beta app, and is restricted only to English Wiktionary. Further work on this endpoint will depend on the level of user engagement with the feature, once it's rolled out to the main Wikipedia app. So, once again, even though we're building the endpoint with the hope that it would be used by other consumers besides the Android app (and expanded to all languages), at the moment it's by no means ready for general consumption.
>
> We do have a wiki page [3] with some more details on the service endpoints that are used by the apps, which you, as well as the Wiktionary community, are welcome to comment on.
>
> -Dmitry
>
> [1] https://phabricator.wikimedia.org/T115484
> [2] https://phabricator.wikimedia.org/T119235
> [3] https://www.mediawiki.org/wiki/Wikimedia_Apps/Team/RESTBase_services_for_ap…
>
> On Sun, Feb 14, 2016 at 12:18 PM, Gabriel Wicke <gwicke(a)wikimedia.org> wrote:
>> Federico,
>>
>> as indicated by the classification as "experimental" [1], the definition end point [2] is at a very early point of its development. The mobile app team has added preliminary support for extracting definitions in the content service [3] using Parsoid's template metadata, and is using this end point to power a "define this word" feature in the next version of the Android app. You can preview the feature in the beta Android app when browsing English Wikipedia by selecting a word, and then hitting the 'definition' icon next to 'copy'.
>>
>> In this first iteration, only English Wiktionary is supported. Generalizing the service and API end point to provide definitions using more or all Wiktionaries will require more work and planning. In the next iteration, I would expect a focus on enabling collaborative definition and maintenance of extraction rules, as well as broader involvement of Wiktionary communities in the planning process. The timing for the next iteration depends partly on the mobile app team's priorities, so I will defer to the team to comment on this.
>>
>> To summarize: We are aiming to gradually develop this into a generally useful, stable and well-documented API entry point for word definitions. The experimental end point published right now is just the beginning, and you are very much invited to help shape the way forward.
>>
>> Gabriel
>>
>> [1]: https://www.mediawiki.org/wiki/API_versioning#Experimental
>> [2]: https://en.wiktionary.org/api/rest_v1/?doc#!/Page_content/get_page_definiti…
>> [3]: https://github.com/wikimedia/mediawiki-services-mobileapps/blob/master/lib/…
>>
>> --
>> Gabriel Wicke
>> Principal Engineer, Wikimedia Foundation
>
> --
> Dmitry Brant
> Software Engineer / Product Owner (Android)
> Wikimedia Foundation
> https://www.mediawiki.org/wiki/Wikimedia_mobile_engineering
Federico,

as indicated by the classification as "experimental" [1], the definition end point [2] is at a very early point of its development. The mobile app team has added preliminary support for extracting definitions in the content service [3] using Parsoid's template metadata, and is using this end point to power a "define this word" feature in the next version of the Android app. You can preview the feature in the beta Android app when browsing English Wikipedia by selecting a word, and then hitting the 'definition' icon next to 'copy'.

In this first iteration, only English Wiktionary is supported. Generalizing the service and API end point to provide definitions using more or all Wiktionaries will require more work and planning. In the next iteration, I would expect a focus on enabling collaborative definition and maintenance of extraction rules, as well as broader involvement of Wiktionary communities in the planning process. The timing for the next iteration depends partly on the mobile app team's priorities, so I will defer to the team to comment on this.

To summarize: We are aiming to gradually develop this into a generally useful, stable and well-documented API entry point for word definitions. The experimental end point published right now is just the beginning, and you are very much invited to help shape the way forward.

Gabriel

[1]: https://www.mediawiki.org/wiki/API_versioning#Experimental
[2]: https://en.wiktionary.org/api/rest_v1/?doc#!/Page_content/get_page_definiti…
[3]: https://github.com/wikimedia/mediawiki-services-mobileapps/blob/master/lib/…

--
Gabriel Wicke
Principal Engineer, Wikimedia Foundation
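For readers wanting to experiment with the end point themselves: the REST API exposes definitions per term, so the only real subtlety on the client side is URL-encoding the term. A sketch (the helper name is mine; English Wiktionary only, per the message above):

```python
from urllib.parse import quote

# Sketch: build the request URL for the experimental Wiktionary definition
# end point. Only English Wiktionary is supported in this first iteration.

def definition_url(term):
    return ("https://en.wiktionary.org/api/rest_v1/page/definition/"
            + quote(term, safe=""))
```

A GET on the resulting URL returns JSON definitions grouped by part of speech; since the end point is experimental, the response shape may change.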
Hi Luigi,

On Fri, Jan 29, 2016 at 12:31 PM, Luigi Assom <itsawesome.yes(a)gmail.com> wrote:
> - how to extract _ID from ETag in headers:
> GET /page/title/{title}

the page id is indeed not directly exposed in the HTML response. However, the revision number is exposed as part of the ETag. This can then be used to request revision metadata including the page id at https://en.wikipedia.org/api/rest_v1/?doc#!/Page_content/get_page_revision_…. This is admittedly not very convenient, so I created https://phabricator.wikimedia.org/T125453 for generally improved page id support in the REST API.

> - how to ensure
> GET /page/title/{title with different char encoding or old titles are always resolved to last canonical version}

The storage backing this end point is automatically kept up to date with edits and dependency changes. Edits in particular should be reflected within a few seconds.

>> If you refer to
>> https://en.wikipedia.org/api/rest_v1/?doc#!/Page_content/get_page_graph_png…,
>> this is an end point exposing rendered graph images for
>> https://www.mediawiki.org/wiki/Extension:Graph (as linked in the end point documentation).
>
> Oh very interesting!
> So basically html markup can be extended?
> Would it be possible to share json objects as html5 markup and embed them in wiki pages?

The graph extension is using the regular MediaWiki tag extension mechanism: https://www.mediawiki.org/wiki/Manual:Tag_extensions
Graphs are indeed defined using JSON within this tag.

> I want to avoid to update my graph just because titles change: entities are always the same.

Makes sense. The current API is optimized for the common case of access by title, but we will consider adding access by page ID as well.

> I still don't know what Parsoid is.

Parsoid is the service providing semantic HTML and a bi-directional conversion between that & wikitext: https://www.mediawiki.org/wiki/Parsoid

Gabriel
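The revision number mentioned above can be pulled straight out of the ETag header, which in these responses takes the shape "{revision}/{tid}", possibly wrapped in quotes and a weak-validator `W/` prefix. A defensive parser sketch (the sample ETag value in the comment is illustrative, not a real one):

```python
# Sketch: extract the revision id from a REST API ETag header value, e.g.
# 'W/"701384131/154d7bca-c264-11e5-8c2f-1b51b33b59fc"' (illustrative value).

def revision_from_etag(etag):
    value = etag.strip()
    if value.startswith("W/"):   # strip HTTP weak-validator prefix
        value = value[2:]
    value = value.strip('"')     # strip surrounding quotes
    revision, _, _tid = value.partition("/")
    return int(revision)
```

That revision id can then be fed to the revision-metadata end point Gabriel links, which includes the page id.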
