
Wikitech-l March 2013

wikitech-l@lists.wikimedia.org
  • 151 participants
  • 173 discussions
Database dumps
by Byrial Jensen 17 Apr '25

Until some weeks ago, http://dumps.wikimedia.org/backup-index.html used to show 4 dumps in progress at the same time. That meant that new database dumps were normally available within about 3 weeks for all databases except enwiki and maybe dewiki, where the dump process took longer due to size.

However, the 4 simultaneous dump processes became 3 some weeks ago, and after massive failures on June 4, only one dump has been in progress at a time. So at the current speed it will take several months to get through all the dumps.

Is it possible to speed up the process again by running several dump processes at the same time?

Thank you,
Byrial
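A minimal sketch of what "several dump processes at the same time" means operationally, assuming a hypothetical run_dump() job runner (the real dump infrastructure is not described in this message):

```python
from concurrent.futures import ThreadPoolExecutor

def run_dump(db):
    # Stand-in for the real per-database dump job.
    return f"dumped {db}"

def run_dumps(databases, workers=4):
    # Four concurrent workers, as the old backup-index.html behaviour
    # suggested: small wikis no longer queue behind enwiki.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # map() preserves input order even though jobs run concurrently.
        return list(pool.map(run_dump, databases))
```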
User-Agent:
by Domas Mituzas 17 Apr '25

Hi!

From now on, a specific per-bot/per-software/per-client User-Agent header is mandatory for contacting Wikimedia sites.

Domas
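For client authors, a hedged sketch of what complying might look like using Python's standard library; the User-Agent string and URL here are illustrative examples, not a mandated format:

```python
from urllib.request import Request, urlopen

# A descriptive, per-client User-Agent: tool name, version, and contact
# info. This exact string is an example, not an official template.
HEADERS = {"User-Agent": "MyWikiBot/1.0 (https://example.org/mybot; mybot@example.org)"}

def fetch(url):
    # Requests without an identifying User-Agent may be rejected by
    # Wikimedia sites under the policy announced above.
    return urlopen(Request(url, headers=HEADERS))
```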
EBNF grammar project status?
by Steve Bennett 01 Apr '25

What's the status of the project to create a grammar for Wikitext in EBNF? There are two pages:

http://meta.wikimedia.org/wiki/Wikitext_Metasyntax
http://www.mediawiki.org/wiki/Markup_spec

Nothing seems to have happened since January this year. Also, the comments on the latter page seem to indicate a lack of a clear goal: is this just a fun project, is it to improve the existing parser, or is it to facilitate a new parser? It's obviously a lot of work, so it needs to be of clear benefit.

Brion requested the grammar IIRC (and there's a comment to that effect at http://bugzilla.wikimedia.org/show_bug.cgi?id=7), so I'm wondering what became of it. Is there still a goal of replacing the parser? Or is there some alternative plan?

Steve
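As a toy illustration (not drawn from either of the pages above) of the kind of production such a grammar would contain, here is one simplified EBNF-style rule for headings and a Python check for it; real wikitext has many edge cases this ignores:

```python
import re

# Hypothetical, simplified production:
#   heading = "=" , { "=" } , text , "=" , { "=" } ;
# rendered as a regex. The backreference forces balanced markers,
# which real wikitext does not actually require.
HEADING = re.compile(r"^(={1,6})\s*(.+?)\s*\1\s*$")

def parse_heading(line):
    # Return the heading level and text, or None if the line
    # does not match the toy production.
    m = HEADING.match(line)
    if not m:
        return None
    return {"level": len(m.group(1)), "text": m.group(2)}
```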
Missing Section Headings
by Marc Riddell 13 Sep '24

Hello,

I have been a WP editor since 2006. I hope you can help me. For some reason I no longer have Section Heading titles showing in the Articles. This is true of all Headings, including the one that carries the Article subject's name. When there is a Table of Contents, it appears fine, and when I click on a particular Section, it goes to that Section, but all that is there is a straight line separating the Sections. There is also no button to edit a Section. If I edit the page and remove the "== ==" markers from the Section Titles, the Title then shows up, but not as a Section Heading. Also, I don't have any Date separators on my Watchlist. This started 2 days ago. Any thoughts?

Thanks,
Marc Riddell
[[User:Michael David]]
MediaWiki 1.19.0beta2
by Sam Reed 09 May '14

I'm happy to announce the availability of the second beta release of the new MediaWiki 1.19 release series. Please try it out and let us know what you think. Don't run it on any wikis that you really care about, unless you are both very brave and very confident in your MediaWiki administration skills.

MediaWiki 1.19 is a large release that contains many new features and bug fixes. This is a summary of the major changes of interest to users. You can consult the RELEASE-NOTES-1.19 file for the full list of changes in this version.

Five security issues were discovered.

It was discovered that the API had a cross-site request forgery (CSRF) vulnerability in the block/unblock modules. It was possible for a user account with the block privileges to block or unblock another user without providing a token. For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=34212

It was discovered that the resource loader can leak certain kinds of private data across domain origin boundaries, by providing the data as an executable JavaScript file. In MediaWiki 1.18 and later, this includes the leaking of CSRF protection tokens. This allows compromise of the wiki's user accounts, say by changing the user's email address and then requesting a password reset. For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=34907

Jan Schejbal of Hatforce.com discovered a cross-site request forgery (CSRF) vulnerability in Special:Upload. Modern browsers (since at least as early as December 2010) are able to post file uploads without user interaction, violating previous security assumptions within MediaWiki. Depending on the wiki's configuration, this vulnerability could lead to further compromise, especially on private wikis where the set of allowed file types is broader than on public wikis. Note that CSRF allows compromise of a wiki from an external website even if the wiki is behind a firewall. For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=35317

George Argyros and Aggelos Kiayias reported that the method used to generate password reset tokens is not sufficiently secure. Instead we use various more secure random number generators, depending on what is available on the platform. Windows users are strongly advised to install either the openssl extension or the mcrypt extension for PHP so that MediaWiki can take advantage of the cryptographic random number facility provided by Windows. Any extension developers using mt_rand() to generate random numbers in contexts where security is required are encouraged to instead make use of the MWCryptRand class introduced with this release. For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=35078

A long-standing bug in the wikitext parser (bug 22555) was discovered to have security implications. In the presence of the popular CharInsert extension, it leads to cross-site scripting (XSS). XSS may be possible with other extensions or perhaps even the MediaWiki core alone, although this is not confirmed at this time. A denial-of-service attack (infinite loop) is also possible regardless of configuration. For more details, see https://bugzilla.wikimedia.org/show_bug.cgi?id=35315

*********************************************************************
* What's new?
*********************************************************************

MediaWiki 1.19 brings the usual host of various bug fixes and new features. A comprehensive list of what's new is in the release notes.

* Bumped MySQL version requirement to 5.0.2.
* Disabled the partial HTML and MathML rendering options for Math, and render as PNG by default. MathML mode was so incomplete most people thought it simply didn't work.
* New skins/common/*.css files usable by skins instead of having to copy piles of generic styles from MonoBook or Vector's CSS.
* The default user signature now contains a talk link in addition to the user link.
* Searching blocked usernames in the block log is now clearer.
* Better timezone recognition in user preferences.
* Extensions can now participate in the extraction of titles from URL paths.
* The command-line installer supports various RDBMSes better.
* The interwiki links table can now be accessed also when the interwiki cache is used (used in the API and the Interwiki extension).

Internationalization
--------------------
* More gender support (for instance in user lists).
* Added languages: Canadian English.
* Language converter improved, e.g. it now works depending on the page content language.
* Time and number-formatting magic words also now depend on the page content language.
* Bidirectional support further improved after 1.18.

Release notes
-------------
Full release notes:
https://gerrit.wikimedia.org/r/gitweb?p=mediawiki/core.git;a=blob_plain;f=RELEASE-NOTES-1.19;hb=1.19.0beta2
https://www.mediawiki.org/wiki/Release_notes/1.19

Coinciding with these security releases, the MediaWiki source code repository has moved from SVN (at https://svn.wikimedia.org/viewvc/mediawiki/trunk/phase3) to Git (https://gerrit.wikimedia.org/gitweb/mediawiki/core.git), so the relevant commits for these releases will not be appearing in our SVN repository. If you use SVN checkouts of MediaWiki for version control, you need to migrate these to Git. If you are using tarballs, there should be no change in the process for you.

Please note that any WMF-deployed extensions have also been migrated to Git, along with some other non-WMF-maintained ones. Please bear with us; some of the Git-related links for this release may not work instantly, but should later on.

To do a simple Git clone, the command is:

git clone https://gerrit.wikimedia.org/r/p/mediawiki/core.git

More information is available at https://www.mediawiki.org/wiki/Git

For more help, please visit the #mediawiki IRC channel on freenode.net (irc://irc.freenode.net/mediawiki) or email the MediaWiki-l mailing list at mediawiki-l(a)lists.wikimedia.org.

**********************************************************************

Download:
http://download.wikimedia.org/mediawiki/1.19/mediawiki-1.19.0beta2.tar.gz

Patch to previous version (1.19.0beta1), without interface text:
http://download.wikimedia.org/mediawiki/1.19/mediawiki-1.19.0beta2.patch.gz

Interface text changes:
http://download.wikimedia.org/mediawiki/1.19/mediawiki-i18n-1.19.0beta2.patch.gz

GPG signatures:
http://download.wikimedia.org/mediawiki/1.19/mediawiki-1.19.0beta2.tar.gz.sig
http://download.wikimedia.org/mediawiki/1.19/mediawiki-1.19.0beta2.patch.gz.sig
http://download.wikimedia.org/mediawiki/1.19/mediawiki-i18n-1.19.0beta2.patch.gz.sig

Public keys:
https://secure.wikimedia.org/keys.html
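The mt_rand() advice above can be illustrated in Python terms: the point is to draw security tokens from the OS cryptographic RNG (the facility MWCryptRand wraps on the PHP side) rather than a seedable Mersenne Twister. This is an analogy, not MediaWiki code:

```python
import random
import secrets

def weak_token(length=16):
    # Mersenne Twister (the algorithm behind PHP's mt_rand): fast, but
    # its internal state can be recovered from observed outputs, making
    # future password-reset tokens predictable.
    return "".join(random.choice("0123456789abcdef") for _ in range(length))

def strong_token(length=16):
    # Draws from the OS cryptographic RNG; outputs cannot be predicted
    # from previously observed tokens.
    return secrets.token_hex(length // 2)
```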
Can we help Tor users make legitimate edits?
by Sumana Harihareswara 28 Sep '13

TL;DR: A few ideas follow on how we could possibly help legit editors contribute from behind Tor proxies. I am just conversant enough with the security problems to make unworkable suggestions ;-), so please correct me, critique & suggest solutions, and perhaps volunteer to help.

The current situation: https://en.wikipedia.org/wiki/Wikipedia:Advice_to_users_using_Tor_to_bypass…

We generally don't let anyone edit or upload from behind Tor; the TorBlock extension stops them. One exception: a person can create an account, accumulate lots of good edits, and then ask for an IP block exemption, and then use that account to edit from behind Tor. This is unappealing because then there's still a bunch of in-the-clear editing that has to happen first, and because then site functionaries know that the account is going to be making controversial edits (and could possibly connect it to IPs in the future, right?). And right now there's no way to truly *anonymously* contribute from behind Tor proxies; you have to log in. However, since JavaScript delivery is hard for Tor users, I'm not sure how much editing from Tor -- vandalism or legit -- is actually happening. (I hope for analytics on this and thus added it to https://www.mediawiki.org/wiki/Analytics/Dreams .) We know at least that there are legitimate editors who would prefer to use Tor and can't.

People have been talking about how to improve the situation for some time -- see http://cryptome.info/wiki-no-tor.htm and https://lists.torproject.org/pipermail/tor-dev/2012-October/004116.html . It'd be nice if it could actually move forward. I've floated this problem past Tor and privacy people, and here are a few ideas:

1) Just use the existing mechanisms more leniently. Encourage the communities (Wikimedia & Tor) to use https://en.wikipedia.org/wiki/Wikipedia:Request_an_account (to get an account from behind Tor) and to let more people get IP block exemptions even before they've made any edits (< 30 people have gotten exemptions on en.wp in 2012). Add encouraging "get an exempt account" language to the "you're blocked because you're using Tor" messaging. Then if there's an uptick in vandalism from Tor, they can just tighten up again.

2) Encourage people with closed proxies to revitalize https://en.wikipedia.org/wiki/Wikipedia:WOCP . Problem: using closed proxies is okay for people with some threat models but not others.

3) Look at Nymble -- http://freehaven.net/anonbib/#oakland11-formalizing and http://cgi.soic.indiana.edu/~kapadia/nymble/overview.php . It would allow Wikimedia to distance itself from knowing people's identities, but still allow admins to revoke permissions if people acted up. The user shows a real identity, gets a token, and exchanges that token over Tor for an account. If the user abuses the site, Wikimedia site admins can blacklist the user without ever being able to learn who they were or what other edits they did. More: https://cs.uwaterloo.ca/~iang/ Ian Goldberg's, Nick Hopper's, and Apu Kapadia's groups are all working on Nymble or its derivatives. It's not ready for production yet, I bet, but if someone wanted a Big Project....

3a) A token authorization system (perhaps a MediaWiki extension) where the server blindly signs a token, and then the user can use that token to bypass the Tor blocks. (Tyler mentioned he saw this somewhere in a Bugzilla suggestion; I haven't found it.)

4) Allow more users the IP block exemption, possibly even automatically after a certain number of unreverted edits, but with some kind of FlaggedRevs integration; Tor users can edit but their changes have to be reviewed before going live. We could combine this with (3); Nymble administrators or token-issuers could pledge to review edits coming from Tor. But that latter idea sounds like a lot of social infrastructure to set up and maintain.

Thoughts? Are any of you interested in working on this problem? #tor on the OFTC IRC server is full of people who'd be interested in talking about this.

-- Sumana Harihareswara
Engineering Community Manager
Wikimedia Foundation
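Idea (3a) can be sketched with a textbook RSA blind signature; the numbers are toy-sized and this is an illustration of the protocol shape only, not production cryptography or any real MediaWiki extension:

```python
# Blind-signature sketch: the server signs a token without ever seeing
# it, so it cannot link the token to the account that requested it,
# yet anyone can verify the signature with the public key.
# Toy RSA key (p=61, q=53); d is the server's private exponent.
n, e, d = 3233, 17, 2753

def blind(token, r):
    # User blinds the token with random factor r before sending it.
    return (token * pow(r, e, n)) % n

def sign(blinded):
    # Server signs blindly; it never learns `token`.
    return pow(blinded, d, n)

def unblind(blind_sig, r):
    # User strips the blinding factor, leaving a valid signature.
    return (blind_sig * pow(r, -1, n)) % n

def verify(token, sig):
    # Anyone (e.g. a TorBlock bypass check) can verify with (n, e).
    return pow(sig, e, n) == token

token, r = 1234, 19
sig = unblind(sign(blind(token, r)), r)
```

Because sign() only ever sees the blinded value, even a log of all signing requests cannot tie the final (token, sig) pair back to the requester.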

19 Jun '13
Hi all,

I'd like to announce a recently created tool that might help the Wikimedia technical community find stuff more easily. Sometimes relevant information is buried in IRC chat logs, messages in any of several mailing lists, pages in mediawiki.org, commit messages, etc. This tool (essentially a custom Google search engine that filters results to a few relevant URL patterns) is aimed at relieving this problem. Test it here: http://hexm.de/mw-search

The motivation for the tool came from a post by Niklas [1], specifically the section "Coping with the proliferation of tools within your community". In the comments section, Nemo announced his initiative to create a custom Google search to fit at least some of the requirements presented in that section, and I've offered to help him tweak it further. The URL list is still incomplete and can be customized by editing the page http://www.mediawiki.org/wiki/Wikimedia_technical_search (syncing with the actual engine will still have to happen by hand, but should be quick).

Besides feedback on whether the engine works as you'd expect, I would like to start some discussion about the ability of Google's bots to crawl some of the resources that are currently included in the URL filters but return no results. For example, the IRC logs at bots.wmflabs.org/~wm-bot/logs/. Some workarounds are used (e.g. using GitHub for code search, since gitweb isn't crawlable), but that isn't possible for all resources. What can we do to improve the situation?

--Waldir

1. http://laxstrom.name/blag/2013/02/11/fosdem-talk-reflections-23-docs-code-a…

04 Jun '13
Marc-Andre Pelletier discovered a vulnerability in the MediaWiki OpenID extension for the case that MediaWiki is used as a "provider" and the wiki allows renaming of users.

All previous versions of the OpenID extension used user-page URLs as identity URLs. On wikis that use the OpenID extension as "provider" and allow user renames, an attacker with rename privileges could rename a user and could then create an account with the same name as the victim. This would have allowed the attacker to steal the victim's OpenID identity.

Version 3.00 fixes the vulnerability by using Special:OpenIDIdentifier/<id> as the user's identity URL, <id> being the immutable MediaWiki-internal user id of the user. The user's old identity URL, based on the user's user-page URL, will no longer be valid. The user's user page can still be used as an OpenID identity URL, but will delegate to the special page.

This is a breaking change, as it changes all user identity URLs. Providers are urged to upgrade and notify users, or to disable user renaming.

Respectfully,
Ryan Lane

https://gerrit.wikimedia.org/r/#/c/52722
Commit: f4abe8649c6c37074b5091748d9e2d6e9ed452f2
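The before/after of the fix can be sketched as follows; the helper names and base-URL layout are hypothetical, and only the Special:OpenIDIdentifier/<id> path comes from the announcement:

```python
# Sketch of the identity-URL change described above (illustrative
# helpers, not the OpenID extension's real API).

def identity_url_old(wiki_base, username):
    # Vulnerable: tied to the user page, so a rename frees the name
    # and lets an attacker re-register it and claim the identity.
    return f"{wiki_base}/wiki/User:{username}"

def identity_url_new(wiki_base, user_id):
    # Fixed in 3.00: keyed on the immutable internal user id, which
    # survives any number of renames.
    return f"{wiki_base}/wiki/Special:OpenIDIdentifier/{user_id}"
```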
How to load up high-resolution imagery on high-density displays has been an open question for a while; we've wanted this for the mobile web site since the Nexus One and Droid brought 1.5x, and the iPhone 4 brought 2.0x density displays to the mobile world a couple years back. More recently, tablets and a few laptops are bringing 1.5x and 2.0x density displays too, such as the new Retina iPad and MacBook Pro. A properly responsive site should be able to detect when it's running on such a display and load higher-density image assets automatically.

Here's my first stab:
https://bugzilla.wikimedia.org/show_bug.cgi?id=36198#c6
https://gerrit.wikimedia.org/r/#/c/24115/

* adds $wgResponsiveImages setting, defaulting to true, to enable the feature
* adds jquery.hidpi plugin to check window.devicePixelRatio and replace images with data-src-1-5 or data-src-2-0 depending on the ratio
* adds mediawiki.hidpi RL script to trigger hidpi loads after main images load
* renders images from wiki image & thumb links at 1.5x and 2.0x and includes data-src-1-5 and data-src-2-0 attributes with the targets

Note that this is a work in progress. There will be places where this doesn't yet work because they output their imgs differently. If moving from a low- to high-DPI screen on a MacBook Pro Retina display, you won't see images load until you reload.

Confirmed basic images and thumbs in wikitext appear to work in Safari 6 on a MacBook Pro Retina display. (Should work in Chrome as well.) The same code loaded on a MobileFrontend display should also work, but I have not yet attempted that.

Note this does *not* attempt to use native SVGs, which is another potential tactic for improving display on high-density displays and zoomed windows. This loads higher-resolution raster images, including rasterized SVGs.

There may be loads of bugs; this is midnight hacking code and I make no guarantees of suitability for any purpose. ;)

-- brion
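On the server side, the markup described above might be generated roughly like this; thumb_url() and the URL layout are hypothetical stand-ins for MediaWiki's real thumbnailing code:

```python
# Sketch of rendering an <img> with 1x, 1.5x, and 2.0x targets so the
# jquery.hidpi plugin can swap sources based on window.devicePixelRatio.

def thumb_url(name, width):
    # Hypothetical thumbnail URL scheme, for illustration only.
    return f"/thumb/{name}/{width}px"

def hidpi_img(name, width):
    # Emit the 1x thumb as src, plus data attributes holding the
    # 1.5x and 2.0x renderings for the client script to pick up.
    return (
        f'<img src="{thumb_url(name, width)}" '
        f'data-src-1-5="{thumb_url(name, int(width * 1.5))}" '
        f'data-src-2-0="{thumb_url(name, width * 2)}" '
        f'width="{width}">'
    )
```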
Hello! So I tried converting https://github.com/wikimedia/qa-browsertests/pull/1 into a Gerrit changeset (https://gerrit.wikimedia.org/r/#/c/54097/), and was mostly successful. It is also a relatively painless process, at least for single commits.

This assumes you (the person doing the GitHub -> Gerrit bridge) have a Gerrit account. I wrote a small script that sort of makes this easy: https://gist.github.com/yuvipanda/5174162

This only does things one time: it moves a set of commits in a pull request to a squashed single commit on Gerrit, assuming your current directory is a cloned version of the Gerrit repo you want to commit to. It should not be too hard to write an actual, idempotent sync script that maintains a 1-to-1 correspondence between Pull Requests and Gerrit Changesets, and I'll attempt to do that tomorrow.

Note that this is a shitty bash script (to put it mildly), but that seems to be all I can write at 5:30 AM :) I'll probably rewrite it as a proper Python one soon. That should also allow me to use the GitHub API to mirror the GitHub Pull Request Title / Description to Gerrit.

I also offer to manually sync pull requests into Gerrit as they come until the automatic Gerrit integration is ready. I shall write another small script tomorrow to have me 'watch' all the wikimedia/* GitHub repositories.

Thank you :) I'll update this thread as the script gets less shitty. Do let me know if you have built a far more complete script :)

-- Yuvi Panda
http://yuvi.in/blog
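A rough Python rendition of the one-shot bash workflow described above, under the assumption that the current directory is a clone of the Gerrit repo; the remote names are assumptions, and Gerrit's Change-Id commit hook is glossed over:

```python
import subprocess

def run(*cmd):
    # Thin wrapper that raises if any git step fails.
    subprocess.run(cmd, check=True)

def pr_to_gerrit(pr_number, github_remote="github", gerrit_remote="origin"):
    # Fetch the pull request head using GitHub's pull/<N>/head ref.
    run("git", "fetch", github_remote, f"pull/{pr_number}/head")
    # Branch off the Gerrit repo's master.
    run("git", "checkout", "-b", f"pr-{pr_number}", f"{gerrit_remote}/master")
    # Squash the PR's commits into one, as the gist's script does.
    run("git", "merge", "--squash", "FETCH_HEAD")
    run("git", "commit", "-m", f"Import GitHub pull request #{pr_number}")
    # Push to Gerrit's magic ref to open a changeset for review.
    run("git", "push", gerrit_remote, "HEAD:refs/for/master")
```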