
Wikitech-l June 2013

wikitech-l@lists.wikimedia.org
  • 120 participants
  • 120 discussions
Database dumps
by Byrial Jensen 17 Apr '25
Until some weeks ago, http://dumps.wikimedia.org/backup-index.html used to show 4 dumps in progress at the same time. That meant that new database dumps were normally available within about 3 weeks for all databases except for enwiki and maybe dewiki, where the dump process took longer due to size.

However, the 4 dump processes at a time became 3 some weeks ago. And after massive failures on June 4, only one dump has been in progress at a time. So at the current speed it will take several months to get through all the dumps.

Is it possible to speed up the process again by running several dump processes at the same time?

Thank you,
Byrial
User-Agent:
by Domas Mituzas 17 Apr '25
Hi!

From now on, a specific per-bot/per-software/per-client User-Agent header is mandatory for contacting Wikimedia sites.

Domas
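As an editorial illustration (not part of Domas's message): a minimal Python sketch of a client that sends a descriptive User-Agent when calling a Wikimedia API endpoint. The header string, contact address, and query are made up for the example.

```python
import urllib.request

# A User-Agent naming the tool, its version, and a contact point, as the
# policy above requires. The exact string and URL here are illustrative.
req = urllib.request.Request(
    "https://en.wikipedia.org/w/api.php?action=query&meta=siteinfo&format=json",
    headers={"User-Agent": "ExampleBot/1.0 (contact: user@example.org)"},
)

# urllib stores header names capitalized, so look it up as "User-agent".
print(req.get_header("User-agent"))  # ExampleBot/1.0 (contact: user@example.org)
```

Requests made without such a header (or with a generic library default) are what the announcement says will be refused.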
EBNF grammar project status?
by Steve Bennett 01 Apr '25
What's the status of the project to create a grammar for Wikitext in EBNF? There are two pages:

http://meta.wikimedia.org/wiki/Wikitext_Metasyntax
http://www.mediawiki.org/wiki/Markup_spec

Nothing seems to have happened since January this year. Also, the comments on the latter page seem to indicate a lack of a clear goal: is this just a fun project, is it to improve the existing parser, or is it to facilitate a new parser? It's obviously a lot of work, so it needs to be of clear benefit.

Brion requested the grammar IIRC (and there's a comment to that effect at http://bugzilla.wikimedia.org/show_bug.cgi?id=7), so I'm wondering what became of it. Is there still a goal of replacing the parser? Or is there some alternative plan?

Steve
Missing Section Headings
by Marc Riddell 13 Sep '24
Hello,

I have been a WP editor since 2006. I hope you can help me. For some reason I no longer have Section Heading titles showing in the Articles. This is true of all Headings, including the one that carries the Article subject's name. When there is a Table of Contents, it appears fine and, when I click on a particular Section, it goes to that Section, but all that is there is a straight line separating the Sections. There is also no button to edit a Section. If I edit the page and remove the "== ==" markers from the Section Titles, the Title then shows up, but not as a Section Heading. Also, I don't have any Date separators on my Watch List. This started 2 days ago. Any thoughts?

Thanks,
Marc Riddell
[[User:Michael David]]
MediaWiki 1.19.0beta2
by Sam Reed 09 May '14
I'm happy to announce the availability of the second beta release of the new MediaWiki 1.19 release series. Please try it out and let us know what you think. Don't run it on any wikis that you really care about, unless you are both very brave and very confident in your MediaWiki administration skills.

MediaWiki 1.19 is a large release that contains many new features and bug fixes. This is a summary of the major changes of interest to users. You can consult the RELEASE-NOTES-1.19 file for the full list of changes in this version.

Five security issues were discovered.

It was discovered that the API had a cross-site request forgery (CSRF) vulnerability in the block/unblock modules. It was possible for a user account with the block privileges to block or unblock another user without providing a token. For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=34212

It was discovered that the resource loader can leak certain kinds of private data across domain origin boundaries, by providing the data as an executable JavaScript file. In MediaWiki 1.18 and later, this includes the leaking of CSRF protection tokens. This allows compromise of the wiki's user accounts, say by changing the user's email address and then requesting a password reset. For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=34907

Jan Schejbal of Hatforce.com discovered a cross-site request forgery (CSRF) vulnerability in Special:Upload. Modern browsers (since at least as early as December 2010) are able to post file uploads without user interaction, violating previous security assumptions within MediaWiki. Depending on the wiki's configuration, this vulnerability could lead to further compromise, especially on private wikis where the set of allowed file types is broader than on public wikis. Note that CSRF allows compromise of a wiki from an external website even if the wiki is behind a firewall. For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=35317

George Argyros and Aggelos Kiayias reported that the method used to generate password reset tokens is not sufficiently secure. Instead we use various more secure random number generators, depending on what is available on the platform. Windows users are strongly advised to install either the openssl extension or the mcrypt extension for PHP so that MediaWiki can take advantage of the cryptographic random number facility provided by Windows. Any extension developers using mt_rand() to generate random numbers in contexts where security is required are encouraged to instead make use of the MWCryptRand class introduced with this release. For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=35078

A long-standing bug in the wikitext parser (bug 22555) was discovered to have security implications. In the presence of the popular CharInsert extension, it leads to cross-site scripting (XSS). XSS may be possible with other extensions or perhaps even the MediaWiki core alone, although this is not confirmed at this time. A denial-of-service attack (infinite loop) is also possible regardless of configuration. For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=35315

*********************************************************************
* What's new?
*********************************************************************

MediaWiki 1.19 brings the usual host of various bugfixes and new features. A comprehensive list of what's new is in the release notes.

* Bumped MySQL version requirement to 5.0.2.
* Disable the partial HTML and MathML rendering options for Math, and render as PNG by default. MathML mode was so incomplete most people thought it simply didn't work.
* New skins/common/*.css files usable by skins instead of having to copy piles of generic styles from MonoBook or Vector's css.
* The default user signature now contains a talk link in addition to the user link.
* Searching blocked usernames in block log is now clearer.
* Better timezone recognition in user preferences.
* Extensions can now participate in the extraction of titles from URL paths.
* The command-line installer supports various RDBMSes better.
* The interwiki links table can now be accessed also when the interwiki cache is used (used in the API and the Interwiki extension).

Internationalization
--------------------

* More gender support (for instance in user lists).
* Added languages: Canadian English.
* Language converter improved, e.g. it now works depending on the page content language.
* Time and number-formatting magic words also now depend on the page content language.
* Bidirectional support further improved after 1.18.

Release notes
-------------

Full release notes:
https://gerrit.wikimedia.org/r/gitweb?p=mediawiki/core.git;a=blob_plain;f=RELEASE-NOTES-1.19;hb=1.19.0beta2
https://www.mediawiki.org/wiki/Release_notes/1.19

Coinciding with these security releases, the MediaWiki source code repository has moved from SVN (at https://svn.wikimedia.org/viewvc/mediawiki/trunk/phase3) to Git (https://gerrit.wikimedia.org/gitweb/mediawiki/core.git). So the relevant commits for these releases will not be appearing in our SVN repository. If you use SVN checkouts of MediaWiki for version control, you need to migrate these to Git. If you are using tarballs, there should be no change in the process for you.

Please note that any WMF-deployed extensions have also been migrated to Git, along with some other non-WMF-maintained ones. Please bear with us: some of the Git-related links for this release may not work instantly, but should later on.

To do a simple Git clone, the command is:

git clone https://gerrit.wikimedia.org/r/p/mediawiki/core.git

More information is available at https://www.mediawiki.org/wiki/Git

For more help, please visit the #mediawiki IRC channel on freenode.net (irc://irc.freenode.net/mediawiki) or email the MediaWiki-l mailing list at mediawiki-l(a)lists.wikimedia.org.

**********************************************************************

Download:
http://download.wikimedia.org/mediawiki/1.19/mediawiki-1.19.0beta2.tar.gz

Patch to previous version (1.19.0beta1), without interface text:
http://download.wikimedia.org/mediawiki/1.19/mediawiki-1.19.0beta2.patch.gz

Interface text changes:
http://download.wikimedia.org/mediawiki/1.19/mediawiki-i18n-1.19.0beta2.patch.gz

GPG signatures:
http://download.wikimedia.org/mediawiki/1.19/mediawiki-1.19.0beta2.tar.gz.sig
http://download.wikimedia.org/mediawiki/1.19/mediawiki-1.19.0beta2.patch.gz.sig
http://download.wikimedia.org/mediawiki/1.19/mediawiki-i18n-1.19.0beta2.patch.gz.sig

Public keys:
https://secure.wikimedia.org/keys.html
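An editorial aside on the password-reset-token fix above: the mt_rand() advice is PHP-specific (MediaWiki's answer is the MWCryptRand class), but the underlying principle is language-agnostic: security tokens must come from a cryptographic random source, not a seedable PRNG. A minimal analogy in Python, using the standard secrets module (the function name is made up for the example):

```python
import secrets

def make_reset_token(n_bytes: int = 16) -> str:
    # secrets draws from the OS CSPRNG (os.urandom), unlike the seedable
    # Mersenne Twister behind the random module, whose output can be
    # predicted once an attacker recovers its internal state.
    return secrets.token_hex(n_bytes)

token = make_reset_token()
print(len(token))  # 32 hex characters for 16 random bytes
```

The same reasoning is why the announcement tells Windows admins to install the openssl or mcrypt PHP extension: without them, PHP may lack access to the platform's cryptographic random facility.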

28 Jan '14
Hello all,

I'm delighted to announce that Ken Snider is joining the Wikimedia operations team. He will start as an international contractor working remotely from Toronto, Canada on June 10, and will be visiting SF in the week of June 17. We're currently in the process of seeking work authorization in the United States in the Director of TechOps position.

CT has graciously agreed to support the ops leadership transition full-time through June, and part-time through July. We'll be starting the handover while Ken is working remotely.

A bit more about Ken: Ken was apparently genetically predisposed to become a sysadmin, since he joined one of Canada's first large ISPs, Primus, straight out of school in 1997 and helped build their infrastructure until 2001. He then joined a startup called OpenCOLA in 2001, which was co-founded by Cory Doctorow and developed early P2P precursors to tools like BitTorrent and Steam. It's best known today for the development of an open source (GPL'd) cola recipe which is still in use (more than 150,000 cans sold, if Wikipedia is to be believed).

Ken got involved in one of Cory's pet projects, BoingBoing.net, which some of you may have heard of ;-), and has been their sysadmin since 2003. After a stint from 2001-2005 at DataWire, Ken became Director of Tech Ops at Federated Media, a role he held from 2005-2012. Federated Media is an ad network that was founded to support high-traffic blogs and sites that want to stay independent of large publishers, with a network that supports more than 1B requests/day.

One of the unusual challenges at FM was that the company grew through acquisitions of various blogging and publishing networks. This led to the challenge of integrating very heterogeneous operations and engineering infrastructure, including multiple geographically distributed ops teams and data-center locations. As DTO, Ken led these efforts, such as OS standardization, development of a unified deployment infrastructure, etc. Ken also ensured that the operations group partnered effectively with the various engineering teams developing site features and enhancements.

I want to again take this opportunity to thank CT Woo for his tireless operations leadership since December 2010. I'd also like to thank everyone who's participated in the Director of TechOps search process.

Please join me in welcoming Ken to the Wikimedia Foundation and the community. :-)

All best,
Erik
--
Erik Möller
VP of Engineering and Product Development, Wikimedia Foundation
migrating hooks doc to doxygen?
by Antoine Musso 13 Nov '13
Hello,

Since we introduced hooks in MediaWiki, the documentation has been maintained in a flat file, /docs/hooks.txt. Over the week-end I have converted the content of that file to let Doxygen recognize it. The patchset is: https://gerrit.wikimedia.org/r/#/c/66128/

I have used that patch to generate a temporary documentation, so that everyone can browse the result easily. The produced result is:

A landing page: https://doc.wikimedia.org/mw-hooks/hooks_mainpage.html
The doc overview: https://doc.wikimedia.org/mw-hooks/page_hooks_documentation.html
A list of hooks with their documentation: https://doc.wikimedia.org/mw-hooks/page_hooks_list.html

I think that makes it a bit more accessible to everyone, and Doxygen autolinks to referenced classes.

Some issues I have:
- The hooks are listed alphabetically, when they could be regrouped by theme (like API, SpecialPages, HTML Forms ...).
- The hooks are documented in a separate file (still docs/hooks.txt), when we might want to have the doc near the wfRunHooks() call.

Thoughts?

--
Antoine "hashar" Musso
Hey,

The new version of git-review released today (1.22) includes a patch I wrote that makes it possible to work against a single 'origin' remote. This amounts to a workaround for git-review's tendency to frighten you into thinking you're about to submit more patches than the ones you are working on. It makes git-review more pleasant to work with, in my opinion.

To enable this behavior, you first need to upgrade to the latest version of git-review, by running "pip install -U git-review". Then you need to create a configuration file: either /etc/git-review/git-review.conf (system-wide) or ~/.config/git-review/git-review.conf (user-specific). The file should contain these two lines:

[gerrit]
defaultremote = origin

Once you've made the change, any new Gerrit repos you clone using an authenticated URI will just work. You'll need to perform an additional step to migrate existing repositories. In each repository, run the following commands:

git remote set-url origin $(git config --get remote.gerrit.url)
git remote rm gerrit
git review -s

Hope you find this useful.
Can we help Tor users make legitimate edits?
by Sumana Harihareswara 28 Sep '13
TL;DR: A few ideas follow on how we could possibly help legit editors contribute from behind Tor proxies. I am just conversant enough with the security problems to make unworkable suggestions ;-), so please correct me, critique & suggest solutions, and perhaps volunteer to help.

The current situation: https://en.wikipedia.org/wiki/Wikipedia:Advice_to_users_using_Tor_to_bypass…

We generally don't let anyone edit or upload from behind Tor; the TorBlock extension stops them. One exception: a person can create an account, accumulate lots of good edits, and then ask for an IP block exemption, and then use that account to edit from behind Tor. This is unappealing because then there's still a bunch of in-the-clear editing that has to happen first, and because then site functionaries know that the account is going to be making controversial edits (and could possibly connect it to IPs in the future, right?). And right now there's no way to truly *anonymously* contribute from behind Tor proxies; you have to log in. However, since JavaScript delivery is hard for Tor users, I'm not sure how much editing from Tor -- vandalism or legit -- is actually happening. (I hope for analytics on this and thus added it to https://www.mediawiki.org/wiki/Analytics/Dreams .) We know at least that there are legitimate editors who would prefer to use Tor and can't.

People have been talking about how to improve the situation for some time -- see http://cryptome.info/wiki-no-tor.htm and https://lists.torproject.org/pipermail/tor-dev/2012-October/004116.html . It'd be nice if it could actually move forward. I've floated this problem past Tor and privacy people, and here are a few ideas:

1) Just use the existing mechanisms more leniently. Encourage the communities (Wikimedia & Tor) to use https://en.wikipedia.org/wiki/Wikipedia:Request_an_account (to get an account from behind Tor) and to let more people get IP block exemptions even before they've made any edits (< 30 people have gotten exemptions on en.wp in 2012). Add encouraging "get an exempt account" language to the "you're blocked because you're using Tor" messaging. Then if there's an uptick in vandalism from Tor, they can just tighten up again.

2) Encourage people with closed proxies to re-vitalize https://en.wikipedia.org/wiki/Wikipedia:WOCP . Problem: using closed proxies is okay for people with some threat models but not others.

3) Look at Nymble -- http://freehaven.net/anonbib/#oakland11-formalizing and http://cgi.soic.indiana.edu/~kapadia/nymble/overview.php . It would allow Wikimedia to distance itself from knowing people's identities, but still allow admins to revoke permissions if people acted up. The user shows a real identity, gets a token, and exchanges that token over Tor for an account. If the user abuses the site, Wikimedia site admins can blacklist the user without ever being able to learn who they were or what other edits they did. More: https://cs.uwaterloo.ca/~iang/ Ian Goldberg's, Nick Hopper's, and Apu Kapadia's groups are all working on Nymble or its derivatives. It's not ready for production yet, I bet, but if someone wanted a Big Project....

3a) A token authorization system (perhaps a MediaWiki extension) where the server blindly signs a token, and then the user can use that token to bypass the Tor blocks. (Tyler mentioned he saw this somewhere in a Bugzilla suggestion; I haven't found it.)

4) Allow more users the IP block exemption, possibly even automatically after a certain number of unreverted edits, but with some kind of FlaggedRevs integration; Tor users can edit but their changes have to be reviewed before going live. We could combine this with (3); Nymble administrators or token-issuers could pledge to review edits coming from Tor. But that latter idea sounds like a lot of social infrastructure to set up and maintain.

Thoughts? Are any of you interested in working on this problem? #tor on the OFTC IRC server is full of people who'd be interested in talking about this.

--
Sumana Harihareswara
Engineering Community Manager
Wikimedia Foundation

25 Sep '13
Based on many ideas that were put forth, I would like to seek comments on this ZERO design. This HTML will be rendered for both M and ZERO subdomains if varnish detects that the request is coming from a zero partner. M and ZERO will be identical except for the images: ZERO substitutes images with links to the File:xxx namespace through a redirector.

* All non-local links always point to a redirector. On javascript-capable devices, it will load the carrier configuration and replace the link with a local confirmation dialog box or a direct link. Without javascript, the redirector will either silently 301-redirect or show confirmation HTML. Links to images on ZERO.wiki and all external links are done in a similar way.

* The banner is an ESI link to /w/api.php?action=zero&banner=250-99, which returns an HTML <div> blob of the banner. (Not sure if the banner ID should be part of the URL.)

Expected cache fragmentation for each wiki page:
* per subdomain (M|ZERO)
* if M, per "isZeroCarrier" (TRUE|FALSE); if ZERO, always TRUE.

3 variants is much better than one per carrier ID * 2 per subdomain.

P.S. The redirector is a Special:Zero page, but if speed is an issue, it could be an API call (which seems to load much faster). The API call would redirect to the target, or could either redirect to the special page for confirmation rendering, or output HTML itself (no skin support, but avoids an extra redirect). Might not be worth it, as javascript will be available on most of our target platforms now or soon.
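The fragmentation arithmetic above can be sketched as follows (an editorial illustration, not the actual Varnish logic; the helper name is made up). Because the cache key varies only on the subdomain and a boolean carrier flag, and the flag is forced TRUE on ZERO, each page has at most 3 cached variants no matter how many carrier IDs exist:

```python
def cache_variant(subdomain: str, is_zero_carrier: bool) -> tuple:
    # On the ZERO subdomain the carrier flag is always TRUE, so the only
    # possible variants are (M, False), (M, True), and (ZERO, True).
    if subdomain == "ZERO":
        is_zero_carrier = True
    return (subdomain, is_zero_carrier)

carriers = range(100)  # the number of partner carriers does not matter
variants = {cache_variant(sd, cid % 2 == 0)
            for sd in ("M", "ZERO") for cid in carriers}
print(len(variants))  # 3
```

Keying per carrier ID instead would give one variant per carrier per subdomain (2 * 100 here), which is the fragmentation the design avoids.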