
Wikitech-l March 2023

wikitech-l@lists.wikimedia.org
  • 51 participants
  • 50 discussions
Database dumps
by Byrial Jensen 17 Apr '25

Until some weeks ago, http://dumps.wikimedia.org/backup-index.html used to show 4 dumps in progress at the same time. That meant that new database dumps normally were available within about 3 weeks for all databases, except for enwiki and maybe dewiki, where the dump process took longer due to size.

However, the 4 dump processes at a time became 3 some weeks ago. And after massive failures on June 4, only one dump has been in progress at a time. So at the current speed it will take several months to get through all dumps.

Is it possible to speed up the process again by running several dump processes at the same time?

Thank you,
Byrial
User-Agent:
by Domas Mituzas 17 Apr '25

Hi!

From now on, a specific per-bot/per-software/per-client User-Agent header is mandatory for contacting Wikimedia sites.

Domas
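For anyone updating a bot or script, here is a minimal sketch of attaching such a header with the Python standard library. The User-Agent string shown (tool name/version plus a contact URL or email) follows common convention and is an example, not an official format requirement; the tool name and addresses are made up.

```python
# Sketch: build API requests with a descriptive User-Agent header,
# identifying the tool and a point of contact (both values are examples).
import urllib.request

USER_AGENT = "ExampleWikiTool/1.0 (https://example.org/tool; admin@example.org)"

def build_request(url: str) -> urllib.request.Request:
    """Attach a descriptive User-Agent header to an API request."""
    return urllib.request.Request(url, headers={"User-Agent": USER_AGENT})

# Usage:
# with urllib.request.urlopen(build_request(
#         "https://en.wikipedia.org/w/api.php?action=query&format=json")) as resp:
#     data = resp.read()
```

Requests sent without an identifying User-Agent may be rejected, so set this on every client rather than relying on a library default.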

17 Apr '25
Hoi,

This is an inquiry from my friend in academia, researching Wikipedia. He would like to know whether there's a way to acquire a list of templates including external links. Here are some examples including external links:

https://ja.wikipedia.org/wiki/Template:JOI/doc
https://ja.wikipedia.org/wiki/Template:Twitter/doc

Such links are stored in externallinks.sql.gz, in an expanded form. When you want to check the increase/decrease of linked domains in chronological order through the edit history, you have to check pages-meta-history1.xml etc. In such a case, traditional links and links by templates are mixed. Therefore, the latter ones (links by templates) should be expanded to traditional link forms.

Sorry if what I am saying does not make sense.

Thanks in advance,
--
Takashi Ota [[U:Takot]]
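As a rough sketch of one piece of this task, raw template wikitext can be scanned for external-link URLs with a regex. This is illustrative only: the sample text and the simple pattern are my assumptions, and real wikitext (nested templates, `<nowiki>`, protocol-relative links) needs a proper parser such as mwparserfromhell.

```python
# Sketch: find external-link URLs in a chunk of raw template wikitext.
# The regex is deliberately naive and will miss or mangle edge cases.
import re

URL_RE = re.compile(r"https?://[^\s\]|}<]+")

def external_links(wikitext: str) -> list[str]:
    """Return external-link URLs found in a chunk of wikitext."""
    return URL_RE.findall(wikitext)

sample = "[https://twitter.com/example Twitter profile]"
print(external_links(sample))  # ['https://twitter.com/example']
```

To answer the original question at scale, one would run something like this over the template namespace pages in a dump and cross-check against externallinks.sql.gz.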
EBNF grammar project status?
by Steve Bennett 01 Apr '25

What's the status of the project to create a grammar for Wikitext in EBNF? There are two pages:

http://meta.wikimedia.org/wiki/Wikitext_Metasyntax
http://www.mediawiki.org/wiki/Markup_spec

Nothing seems to have happened since January this year. Also, the comments on the latter page seem to indicate a lack of a clear goal: is this just a fun project, is it to improve the existing parser, or is it to facilitate a new parser? It's obviously a lot of work, so it needs to be of clear benefit.

Brion requested the grammar IIRC (and there's a comment to that effect at http://bugzilla.wikimedia.org/show_bug.cgi?id=7), so I'm wondering what became of it. Is there still a goal of replacing the parser? Or is there some alternative plan?

Steve
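To illustrate the kind of production such a grammar would need, here is a toy heading rule and a matching check in Python. Both the EBNF-style rule and the regex are my own simplification, not taken from the Wikitext_Metasyntax or Markup_spec pages.

```python
# Toy sketch of one production a Wikitext grammar might contain, roughly:
#   heading = "=" { "=" } , text , "=" { "=" } ;
# Real MediaWiki headings are more permissive; this only accepts lines
# whose leading and trailing '=' runs match exactly.
import re

HEADING_RE = re.compile(r"^(={1,6})\s*(.+?)\s*\1\s*$")

def parse_heading(line: str):
    """Return (level, title) if the line is a well-formed heading, else None."""
    m = HEADING_RE.match(line)
    return (len(m.group(1)), m.group(2)) if m else None

print(parse_heading("== History =="))  # (2, 'History')
print(parse_heading("No heading"))    # None
```

The hard part of the full project is that many Wikitext constructs are context-sensitive, which is exactly why a clean EBNF for the whole language has proved elusive.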
Phabricator upstream shutdown
by Brian Wolff 21 Mar '25

It sounds like Phabricator upstream is going away:

https://admin.phacility.com/phame/post/view/11/phacility_is_winding_down_op…

Just curious, are we planning to continue using it long term or move to something else?

--
Brian
Missing Section Headings
by Marc Riddell 13 Sep '24

Hello,

I have been a WP editor since 2006. I hope you can help me. For some reason I no longer have Section Heading titles showing in the Articles. This is true of all Headings, including the one that carries the Article subject's name. When there is a Table of Contents, it appears fine and, when I click on a particular Section, it goes to that Section, but all that is there is a straight line separating the Sections. There is also no button to edit a Section. If I edit the page and remove the "== ==" markers from the Section Titles, the Title then shows up, but not as a Section Heading. Also, I don't have any Date separators on my Want List. This started 2 days ago. Any thoughts?

Thanks,
Marc Riddell
[[User:Michael David]]
I know it has been annoying a couple of people other than me, so now that I've learned how to make it work I'll share the knowledge here.

tl;dr: Star the repositories. No, seriously. (And yes, you need to star each extension repo separately.)

(Is there a place on mw.org to put this tidbit on?)

------- Forwarded message -------
From: "Brian Levine" <support(a)github.com> (GitHub Staff)
To: matma.rex(a)gmail.com
Cc:
Subject: Re: Commits in mirrored repositories not showing up on my profile
Date: Tue, 09 Jul 2013 06:47:19 +0200

Hi Bartosz

In order to link your commits to your GitHub account, you need to have some association with the repository other than authoring the commit. Usually, having push access gives you that connection. In this case, you don't have push permission, so we don't link you to the commit.

The easy solution here is for you to star the repository. If you star it - along with the other repositories that are giving you this problem - we'll see that you're connected to the repository and you'll get contribution credit for those commits.

Cheers
Brian

-- Matma Rex
Research FAQ gets a facelift
by Dario Taraborelli 25 Jun '24

We just released a new version of Research:FAQ on Meta [1], significantly expanded and updated, to make our processes at WMF more transparent and to meet an explicit FDC request to clarify the role and responsibilities of individual teams involved in research across the organization.

The previous version – written from the perspective of the (now inactive) Research:Committee, and mostly obsolete since the release of WMF's open access policy [2] – can still be found here [3].

Comments and bold edits to the new version of the document are welcome. For any question or concern, you can drop me a line or ping my username on-wiki.

Thanks,
Dario

[1] https://meta.wikimedia.org/wiki/Research:FAQ
[2] https://wikimediafoundation.org/wiki/Open_access_policy
[3] https://meta.wikimedia.org/w/index.php?title=Research:FAQ&oldid=15176953

Dario Taraborelli, Head of Research, Wikimedia Foundation
wikimediafoundation.org • nitens.org • @readermeter <http://twitter.com/readermeter>
Announcing: Patch review board
by Brian Wolff 04 Oct '23

Yesterday there was a conversation about code review on IRC and, among other things, how sometimes patches can get "stuck". I had an idea for a way to improve things. I'm not sure if it is a good idea, but there's only one way to find out.

So without further ado, announcing the Code Review Patch Board: https://www.mediawiki.org/wiki/Code_review/patch_board

In short - each person is allowed to list one of their patches on the board that they would really like to see reviewed. You can only list one patch at a time, and it should be a patch that you have been unable to get a review for, for at least a week, through normal means. See the page for the full list of guidelines.

I encourage people to give it a try. Add a patch you wrote that you cannot get a review for. Or if you have +2 rights, try giving some love to these underloved patches. I would also love to hear feedback on the general idea as well as the current guidelines.

To repeat, the url is: https://www.mediawiki.org/wiki/Code_review/patch_board

Thanks,
bawolff
TL;DR: The legacy Mobile Content Service is going away in July 2023. Please switch to Parsoid or another API before then to ensure service continuity.

Hello World,

I'm writing about a service decommission we hope to complete mid-July 2023. The service to be decommissioned is the legacy Mobile Content Service ("MCS"), which is maintained by the Wikimedia Foundation's Content Transform Team. We will be marking this service as deprecated soon. We hope that with this notice, people will have ample time to update their systems for use of other endpoints such as Parsoid [1] (n.b., MCS uses Parsoid HTML).

The MCS endpoints are the ones with the relative URL path pattern /page/mobile-sections* on the Wikipedias. For examples of the URLs, see the "Mobile" section on the online Swagger (OpenAPI) specification documentation with matching URLs here: https://en.wikipedia.org/api/rest_v1/#/Mobile

== History ==

The Mobile Content Service ("MCS") is the historical aggregate service that originally provided support for the article reading experience on the Wikipedia for Android native app, as well as some other experiences. We have noticed that there are other users of the service. We are not able to determine all of the users, as it's hard to tell with confidence from the web logs.

The Wikimedia Foundation had already transitioned the Wikipedia for Android and iOS apps to the newer Page Content Service ("PCS") several years ago. PCS has some similarities with MCS in terms of its mobility focus, but it also has different request-response signatures in practice. PCS, as with MCS, is intended to primarily satisfy Wikimedia Foundation-maintained user experiences only, and so it is classified with the "unstable" moniker.

== Looking ahead ==

Generally, as noted in the lead, we recommend that folks who use MCS (or PCS, for that matter) switch over to Parsoid for accessing Wikipedia article content programmatically for the most predictable service. The HTML produced by Parsoid has a versioned specification [2] and, because Parsoid is accessed regularly by a number of components across the globe, tends to have fairly well cached responses. However, please note that Parsoid may be subject to stricter rate limits that can apply under certain traffic patterns.

At this point, I do also want to note that in order to keep up with contemporary HTML standards, particularly those favoring accessibility and machine readability enhancements, Parsoid HTML will undergo change as we further converge parsing stacks [3]. Generally, you should expect iteration on the Parsoid HTML spec, and of course, as you may have come to appreciate, the shape of HTML in practice can vary nontrivially wiki-by-wiki as practices across wikis vary.

You may also want to consider Wikimedia Enterprise API options, which range from no-cost to higher-volume paid access options: https://meta.wikimedia.org/wiki/Wikimedia_Enterprise#Access

== Forking okay, but not recommended ==

Because MCS acts as a service aggregate and makes multiple backend API calls, caveats can apply for those subresources - possibility of API changes, deprecation, and the like. We do not recommend a plain fork of MCS code because of the subresource fetch behavior. This said, of course you are welcome to fork in a way compatible with MCS's license.

== Help spread the word ==

Although we are aware of the top two remaining consumers of MCS, we also are not sure who else is accessing MCS and anticipate that some downstream tech may break when MCS is turned off. As we are cross-posting this message, we hope most people who have come to rely upon MCS will see this message. Please feel free to forward this message to contacts if you know they are using MCS.

== Help ==

Although we intend to decommission MCS in July 2023, we would like to share resources if you need some help. We plan to hold office hours in case you would like to meet with us to discuss this or other Content Transform Team matters. We will host these events on Google Meet. We will provide notice of these office hours on the wikitech-l mailing list in the coming weeks and months.

Additionally, if you would like to discuss your MCS transition plans, please visit the Content Transform Team talk page: https://www.mediawiki.org/wiki/Talk:Content_Transform_Team

Finally, some Content Transform Team members will also be at the Wikimedia Hackathon [4] if you would like some in-person support.

Thank you.

Adam Baso (he/him/his/Adam), on behalf of the Content Transform Team
Director of Engineering
Wikimedia Foundation

[1] https://www.mediawiki.org/wiki/Parsoid
[2] https://www.mediawiki.org/wiki/Specs/HTML
[3] https://www.mediawiki.org/wiki/Parsoid/Parser_Unification/Updates
[4] https://www.mediawiki.org/wiki/Wikimedia_Hackathon_2023
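As a rough illustration of the URL change involved in this migration: the MCS path shape comes from the message above (/page/mobile-sections*) and the replacement path from the public REST API's Parsoid HTML endpoint; the percent-encoding of titles is my assumption, and real clients should consult the REST API docs for title normalization rules.

```python
# Sketch: build the legacy MCS URL vs. the Parsoid HTML URL for a page.
from urllib.parse import quote

def mcs_url(title: str, host: str = "en.wikipedia.org") -> str:
    """Legacy Mobile Content Service endpoint (being decommissioned)."""
    return f"https://{host}/api/rest_v1/page/mobile-sections/{quote(title, safe='')}"

def parsoid_html_url(title: str, host: str = "en.wikipedia.org") -> str:
    """Parsoid HTML via the REST API (the recommended replacement)."""
    return f"https://{host}/api/rest_v1/page/html/{quote(title, safe='')}"

print(parsoid_html_url("San Francisco"))
# https://en.wikipedia.org/api/rest_v1/page/html/San%20Francisco
```

Note that the two endpoints return differently shaped responses (MCS returns an aggregate JSON structure, Parsoid returns HTML), so switching is not only a URL change.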