
Wikitech-l August 2020

wikitech-l@lists.wikimedia.org
  • 67 participants
  • 58 discussions
Database dumps
by Byrial Jensen 17 Apr '25

Until some weeks ago, http://dumps.wikimedia.org/backup-index.html used to show 4 dumps in progress at the same time. That meant that new database dumps were normally available within about 3 weeks for all databases except enwiki and maybe dewiki, where the dump process took longer due to size.

However, the 4 dump processes running at a time became 3 some weeks ago. And after massive failures on June 4, only one dump has been in progress at a time. So at the current speed it will take several months to get through all the dumps.

Is it possible to speed up the process again by running several dump processes at the same time?

Thank you,
Byrial
User-Agent:
by Domas Mituzas 17 Apr '25

Hi!

From now on, a specific per-bot/per-software/per-client User-Agent header is mandatory for contacting Wikimedia sites.

Domas
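For clients that currently send no header, setting one from a script might look like the following Python sketch. The exact User-Agent string below is an illustrative assumption following the common "tool/version (contact info)" convention, not an official template.

```python
import urllib.request

# Illustrative User-Agent string: tool name, version, and a way to contact
# the operator. The format is an assumption, not an official Wikimedia template.
USER_AGENT = "MyWikiTool/1.0 (https://example.org/mywikitool; mail@example.org)"

def make_request(url: str) -> urllib.request.Request:
    """Build a request that carries the now-mandatory User-Agent header."""
    return urllib.request.Request(url, headers={"User-Agent": USER_AGENT})

req = make_request("https://en.wikipedia.org/w/api.php?action=query&format=json")
print(req.get_header("User-agent"))
```

The same idea applies to any HTTP library: set the header once on the client or session object rather than per request.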

17 Apr '25
Hoi,

This is an inquiry from my friend in academia, who is researching Wikipedia. He would like to know whether there is a way to acquire a list of templates that include external links. Here are some examples of templates including external links:

https://ja.wikipedia.org/wiki/Template:JOI/doc
https://ja.wikipedia.org/wiki/Template:Twitter/doc

Such links are stored in externallinks.sql.gz in an expanded form. When you want to track the increase or decrease of linked domains in chronological order through the edit history, you have to check pages-meta-history1.xml etc. In such a case, traditional links and links produced by templates are mixed, so the latter (links by templates) should be expanded to traditional link forms.

Sorry if what I am saying does not make sense.

Thanks in advance,
--
Takashi Ota [[U:Takot]]
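As a rough sketch of the extraction step being asked about: the regex below is a deliberate simplification of MediaWiki's actual external-link syntax, and the sample wikitext is invented for illustration, so treat this as a starting point only.

```python
import re

# Simplified pattern for external URLs in raw wikitext; it stops at
# whitespace and at characters that end a link in common wiki markup.
# This is an approximation, not MediaWiki's real tokenizer.
EXTLINK_RE = re.compile(r'https?://[^\s\]|}<>"]+')

def external_links(wikitext: str) -> list[str]:
    """Return external URLs found in a chunk of raw template wikitext."""
    return EXTLINK_RE.findall(wikitext)

sample = "{{#if:x|[https://www.joi.it/ JOI]}} and [[Internal link]]"
print(external_links(sample))  # ['https://www.joi.it/']
```

Running something like this over the wikitext of each Template: page would give a first-pass list of templates that emit external links; handling template parameters inside URLs would need real template expansion.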
EBNF grammar project status?
by Steve Bennett 01 Apr '25

What's the status of the project to create a grammar for Wikitext in EBNF? There are two pages:

http://meta.wikimedia.org/wiki/Wikitext_Metasyntax
http://www.mediawiki.org/wiki/Markup_spec

Nothing seems to have happened since January this year. Also, the comments on the latter page seem to indicate a lack of a clear goal: is this just a fun project, is it to improve the existing parser, or is it to facilitate a new parser? It's obviously a lot of work, so it needs to be of clear benefit.

Brion requested the grammar IIRC (and there's a comment to that effect at http://bugzilla.wikimedia.org/show_bug.cgi?id=7), so I'm wondering what became of it. Is there still a goal of replacing the parser? Or is there some alternative plan?

Steve
Missing Section Headings
by Marc Riddell 13 Sep '24

Hello,

I have been a WP editor since 2006. I hope you can help me. For some reason I no longer have Section Heading titles showing in the Articles. This is true of all Headings, including the one that carries the Article subject's name. When there is a Table of Contents, it appears fine and, when I click on a particular Section, it goes to that Section, but all that is there is a straight line separating the Sections. There is also no button to edit a Section. If I edit the page and remove the "== ==" markers from the Section Titles, the Title then shows up, but not as a Section Heading. Also, I don't have any Date separators on my Watchlist. This started 2 days ago. Any thoughts?

Thanks,
Marc Riddell
[[User:Michael David]]
I know it has been annoying a couple of people other than me, so now that I've learned how to make it work I'll share the knowledge here.

tl;dr: Star the repositories. No, seriously. (And yes, you need to star each extension repo separately.)

(Is there a place on mw.org to put this tidbit on?)

------- Forwarded message -------
From: "Brian Levine" <support(a)github.com> (GitHub Staff)
To: matma.rex(a)gmail.com
Subject: Re: Commits in mirrored repositories not showing up on my profile
Date: Tue, 09 Jul 2013 06:47:19 +0200

Hi Bartosz,

In order to link your commits to your GitHub account, you need to have some association with the repository other than authoring the commit. Usually, having push access gives you that connection. In this case, you don't have push permission, so we don't link you to the commit.

The easy solution here is for you to star the repository. If you star it - along with the other repositories that are giving you this problem - we'll see that you're connected to the repository and you'll get contribution credit for those commits.

Cheers,
Brian

--
Matma Rex
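Since each extension mirror has to be starred separately, doing it from a script may be less tedious than clicking through the web UI. A sketch against the GitHub REST API (PUT /user/starred/{owner}/{repo}) follows; the repository name and token here are placeholders, not real values.

```python
import urllib.request

# Placeholder owner/repo/token values for illustration; substitute the
# actual mirror names and a personal access token with the right scope.
def star_request(owner: str, repo: str, token: str) -> urllib.request.Request:
    """Build (but do not send) a GitHub API request starring one repository."""
    return urllib.request.Request(
        f"https://api.github.com/user/starred/{owner}/{repo}",
        method="PUT",
        headers={"Authorization": f"token {token}", "Content-Length": "0"},
    )

req = star_request("wikimedia", "mediawiki-extensions-Example", "TOKEN")
# To actually send it: urllib.request.urlopen(req)
print(req.get_method(), req.full_url)
```

Looping this over a list of mirror names would star them all in one go.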
Research FAQ gets a facelift
by Dario Taraborelli 25 Jun '24

We just released a new version of Research:FAQ on Meta [1], significantly expanded and updated, to make our processes at WMF more transparent and to meet an explicit FDC request to clarify the roles and responsibilities of individual teams involved in research across the organization.

The previous version – written from the perspective of the (now inactive) Research:Committee, and mostly obsolete since the release of WMF's open access policy [2] – can still be found here [3].

Comments and bold edits to the new version of the document are welcome. For any question or concern, you can drop me a line or ping my username on-wiki.

Thanks,
Dario

[1] https://meta.wikimedia.org/wiki/Research:FAQ
[2] https://wikimediafoundation.org/wiki/Open_access_policy
[3] https://meta.wikimedia.org/w/index.php?title=Research:FAQ&oldid=15176953

Dario Taraborelli
Head of Research, Wikimedia Foundation
wikimediafoundation.org • nitens.org • @readermeter <http://twitter.com/readermeter>
bluejeans
by Jeremy Baron 22 Feb '23

Hi,

On Tue, Mar 1, 2016 at 3:36 PM, David Strine <dstrine(a)wikimedia.org> wrote:
> We will be holding this brownbag in 25 minutes. The Bluejeans link has
> changed:
>
> https://bluejeans.com/396234560

I'm not familiar with Bluejeans and maybe have missed a transition because I wasn't paying enough attention. Is this some kind of experiment? Have all meetings transitioned to this service?

Anyway, my immediate question at the moment is: how do you join without sharing your microphone and camera?

Am I correct in thinking that this is an entirely proprietary stack that's neither gratis nor libre and has no on-premise (not cloud) hosting option? Are we paying for this?

-Jeremy
Hello,

Can someone update the list at https://phabricator.wikimedia.org/P10500, which contains repositories that don't have mediawiki/mediawiki-codesniffer? I found that many repositories in the list are empty, and some aren't available on Gerrit.

So, can someone please update this list of repositories (in mediawiki/extensions) which don't have mediawiki/mediawiki-codesniffer but contain at least one PHP file? Or provide me with a command I can use to update the list myself, so I don't need to request it every time.

Best regards,
Zoran.

P. S.: Happy weekend! :)
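Lacking an official command, a local approximation of the requested check might look like the following sketch. It assumes you have checkouts of the extensions under one directory, and that "uses codesniffer" means composer.json requires mediawiki/mediawiki-codesniffer; the function name is mine.

```python
import json
from pathlib import Path

# Assumption: extensions_dir holds local checkouts of mediawiki/extensions/*.
def missing_codesniffer(extensions_dir: str) -> list[str]:
    """Extensions with at least one PHP file but no codesniffer requirement."""
    missing = []
    for ext in sorted(Path(extensions_dir).iterdir()):
        if not ext.is_dir() or not any(ext.rglob("*.php")):
            continue  # skip plain files, empty repos, and repos without PHP
        reqs = {}
        composer = ext / "composer.json"
        if composer.exists():
            data = json.loads(composer.read_text())
            reqs = {**data.get("require", {}), **data.get("require-dev", {})}
        if "mediawiki/mediawiki-codesniffer" not in reqs:
            missing.append(ext.name)
    return missing
```

Run against a full extensions checkout, this skips the empty repositories automatically, which addresses the other complaint about the current paste.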

04 Nov '20
Hi all!

Since the new Stable Interface Policy [1] has come into effect, there has been some confusion about when and how the deprecation process can be accelerated or bypassed. I started a discussion about this issue on the talk page [2], and now I'm writing this email in the hope of gathering more perspectives.

tl;dr: the key question is: Can we shorten or even entirely skip the deprecation process if we have removed all usages of the obsolete code from public extensions?

If you are affected by the answer to this question, or you otherwise have opinions about it, please read on (ok ok, this mail is massive - at least read the proposed new wording of the policy). I'm especially interested in the opinions of extension developers.

So, let's dive in. On the one hand, the new (and old) policy states:

  Code MUST emit hard deprecation notices for at least one major MediaWiki version before being removed. It is RECOMMENDED to emit hard deprecation notices for at least two major MediaWiki versions. EXCEPTIONS to this are listed in the section "Removal without deprecation" below.

This means that code that starts to emit a deprecation warning in version N can only be removed in version N+1, better even N+2. This effectively recommends that obsolete code be kept around for at least half a year, with a preference for a full year or more. However, we now have this exception in place:

  The deprecation process may be bypassed for code that is unused within the MediaWiki ecosystem. The ecosystem is defined to consist of all actively maintained code residing in repositories owned by the Wikimedia Foundation, and can be searched using the code search tool.

When TechCom added this section [3][4], we were thinking of the case where a method becomes obsolete, but is unused. In that case, why go through all the hassle of deprecation if nobody uses it anyway?

However, what does this mean for obsolete code that *is* used? Can we just go ahead and remove the usages, and then remove the code without deprecation? That seems to be the logical consequence.

The result is a much tighter timeline from soft deprecation to removal, reducing the amount of deprecated code we have to drag along and keep functional. This would be helpful particularly when code was refactored to remove undesirable dependencies, since the dependency will not actually go away until the deprecated code has been removed.

So, if we put in the work to remove usages, can we skip the deprecation process? After all, if the code is truly unused, this would not do any harm, right? And being able to make breaking changes without the need to wait a year for them to become effective would greatly improve the speed at which we can modernize the code base.

However, even skipping soft deprecation and going directly to hard deprecation of the construction of the Revision class raised concerns, see for instance <https://www.mail-archive.com/wikitech-l@lists.wikimedia.org/msg92871.html>. The key concern is that we can only know about usages in repositories in our "ecosystem", a concept introduced into the policy by the section quoted above. I will go into the implications of this further below. But first, let me propose a change to the policy, to clarify when deprecation is or is not needed.

I propose that the policy should read:

  Obsolete code MAY be removed without deprecation if it is unused (or appropriately gated) by any code in the MediaWiki ecosystem. Such removal must be recorded in the release notes as a breaking change without deprecation, and must be announced on the appropriate mailing lists.

  Obsolete code that is still used within the ecosystem MAY be removed if it has been emitting deprecation warnings in AT LEAST one major version release, and a best effort has been made to remove any remaining usages in the MediaWiki ecosystem.

  Obsolete code SHOULD be removed when it has been emitting deprecation warnings for two releases, even if it is still used.

And further:

  The person, team, or organization that deprecates code SHOULD drive the removal of usages in a timely manner. For code not under the control of this person, team, or organization, appropriate changes SHOULD be proposed to the maintainers, and guidance SHOULD be provided when needed.

Compared to the old process, this puts more focus on removing usages of obsolete code. Previously, we'd often just wait and hope that usages of deprecated methods would vanish eventually. Which may take a long time; we still have code in MediaWiki that was deprecated in 1.24. Of course, every now and then someone fixes a bunch of usages of deprecated code, but this is a sporadic occurrence, not designed into the process.

With the change I am proposing, whoever deprecates a function also commits to removing usages of it asap. For extension developers, this means that they will get patches and support, but they may see their code broken if they do not follow up.

Now, my proposal hinges on the idea that we somehow know all relevant code that needs fixing. How can that work?

When TechCom introduced the idea of the "MediaWiki ecosystem" into the policy, our reasoning was that we want to support primarily extension developers who contribute their extensions back to the ecosystem, by making them available to the public. We found it fair to say that if people develop extensions solely for their own use, it is up to them to read the release notes. We do not need to go out of our way to protect them from changes to the code base.

Effectively, with the proposed change to the policy, maintainers of public extensions will get more support keeping their extensions compatible, while maintainers of private extensions will receive less consideration.

It seems desirable and fair to me to allow for "fast track" removal of obsolete code, but only if we create a clear process for making an extension "official". How exactly would an extension developer make sure that we know their extension, and consider it part of the ecosystem? In practice, "known code" is code accessible via codesearch [5]. But how does one get an extension into the codesearch index? There is currently no clear process for this.

Ideally, it would be sufficient to:
* create a page on mediawiki.org using the {{Extension}} infobox,
* set the status to "stable" (or maybe "beta"),
* and link to a public git repository.

It should be simple enough to create a script that feeds these repos into codesearch. A quick look at the Category:Extensions_by_status category tells me that there are about a thousand such extensions.

So, my question to you is: do you support the change I am proposing to the policy? If not, why not? And if you do, why do you think it's helpful?

-- daniel

PS: This proposal has not yet been vetted with TechCom; it's just my personal take. It will become an RFC if needed. This is intended to start a conversation.

[1] https://www.mediawiki.org/wiki/Stable_interface_policy
[2] https://www.mediawiki.org/wiki/Topic:Vrwr9aloe6y1bi2v
[3] https://phabricator.wikimedia.org/T193613
[4] https://phabricator.wikimedia.org/T255803
[5] https://codesearch.wmcloud.org/search/

--
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation
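As a language-neutral illustration of the "hard deprecation" step the policy relies on (in MediaWiki itself this is done in PHP, e.g. via wfDeprecated()), a Python analogue might look like the sketch below; the decorator and the example function are invented for illustration.

```python
import functools
import warnings

# Analogue of hard deprecation: the obsolete entry point keeps working,
# but every call emits a warning for at least one release before removal.
def deprecated(since: str):
    def wrap(func):
        @functools.wraps(func)
        def inner(*args, **kwargs):
            warnings.warn(
                f"{func.__name__} is deprecated since {since}",
                DeprecationWarning,
                stacklevel=2,
            )
            return func(*args, **kwargs)
        return inner
    return wrap

@deprecated(since="1.35")
def get_revision_text(rev_id: int) -> str:
    # Hypothetical obsolete function standing in for, say, Revision-class usage.
    return f"text of revision {rev_id}"
```

Callers keep working during the deprecation window, but each call shows up in logs, which is what makes "remove in release N+1 once codesearch shows no usages" auditable.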