
Wikitech-l, December 2010

wikitech-l@lists.wikimedia.org
  • 95 participants
  • 60 discussions
Database dumps
by Byrial Jensen 17 Apr '25

Until some weeks ago, http://dumps.wikimedia.org/backup-index.html used to show 4 dumps in progress at the same time. That meant that new database dumps were normally available within about 3 weeks for all databases except enwiki and maybe dewiki, where the dump process, due to size, took longer.

However, the 4 simultaneous dump processes became 3 some weeks ago, and after massive failures on June 4, only one dump has been in progress at a time. At the current speed it will take several months to get through all the dumps.

Is it possible to speed up the process again by running several dump processes at the same time?

Thank you,
Byrial
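The request above boils down to running dump jobs with a bounded worker pool instead of one at a time. A minimal sketch of that idea (the wiki list and `run_dump` stand-in are illustrative assumptions, not the actual Wikimedia dump infrastructure):

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical list of databases to dump; enwiki/dewiki are the slow ones.
WIKIS = ["enwiki", "dewiki", "frwiki", "plwiki", "svwiki"]

def run_dump(wiki):
    # Placeholder for the real per-wiki dump job (e.g. invoking the
    # dump scripts for that database); here we just report completion.
    return f"{wiki}: done"

# Four dumps in progress at the same time, as before the failures.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(run_dump, WIKIS))

for line in results:
    print(line)
```

`pool.map` preserves input order, so results come back in the order the wikis were listed even though up to four jobs run concurrently.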
User-Agent:
by Domas Mituzas 17 Apr '25

Hi!

From now on, a specific per-bot/per-software/per-client User-Agent header is mandatory for contacting Wikimedia sites.

Domas
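For script authors, complying just means sending a descriptive User-Agent with every request. A minimal sketch using Python's standard library (the bot name and contact address are placeholders you would replace with your own):

```python
import urllib.request

# Identify your tool: name/version plus a contact address, rather than
# the library default or an empty User-Agent.
HEADERS = {"User-Agent": "ExampleBot/0.1 (http://example.org/bot; bot@example.org)"}

def fetch(url):
    # Attach the identifying header to every request.
    req = urllib.request.Request(url, headers=HEADERS)
    return urllib.request.urlopen(req).read()

# fetch("https://en.wikipedia.org/w/api.php?action=query&meta=siteinfo&format=json")
```

The actual fetch is left commented out; the point is simply that the header is set on every request your client makes.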
EBNF grammar project status?
by Steve Bennett 01 Apr '25

What's the status of the project to create a grammar for Wikitext in EBNF? There are two pages:

http://meta.wikimedia.org/wiki/Wikitext_Metasyntax
http://www.mediawiki.org/wiki/Markup_spec

Nothing seems to have happened since January this year. Also, the comments on the latter page seem to indicate a lack of a clear goal: is this just a fun project, is it to improve the existing parser, or is it to facilitate a new parser? It's obviously a lot of work, so it needs to be of clear benefit.

Brion requested the grammar IIRC (and there's a comment to that effect at http://bugzilla.wikimedia.org/show_bug.cgi?id=7), so I'm wondering what became of it. Is there still a goal of replacing the parser? Or is there some alternative plan?

Steve
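To make the question concrete: even a tiny, formally specified subset of wikitext would be progress. A toy sketch, not taken from either project page, with a purely illustrative grammar fragment and a matching recognizer for bold text and internal links:

```python
import re

# Toy grammar fragment, roughly:
#   inline ::= ( bold | link | text )*
#   bold   ::= "'''" text "'''"
#   link   ::= "[[" text "]]"
TOKEN = re.compile(r"'''(.*?)'''|\[\[(.*?)\]\]|([^'\[]+)")

def parse_inline(src):
    # Tokenize a line of the toy subset into (kind, value) nodes.
    nodes = []
    for bold, link, text in TOKEN.findall(src):
        if bold:
            nodes.append(("bold", bold))
        elif link:
            nodes.append(("link", link))
        elif text:
            nodes.append(("text", text))
    return nodes

print(parse_inline("See '''MediaWiki''' and [[Markup spec]]."))
```

Real wikitext, of course, is full of edge cases (nested quotes, apostrophes inside words, templates) that a toy like this ignores; that gap is exactly what the EBNF project would have to pin down.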
Missing Section Headings
by Marc Riddell 13 Sep '24

Hello,

I have been a WP editor since 2006. I hope you can help me. For some reason I no longer have Section Heading titles showing in the Articles. This is true of all Headings, including the one that carries the Article subject's name. When there is a Table of Contents, it appears fine and, when I click on a particular Section, it goes to that Section, but all that is there is a straight line separating the Sections. There is also no button to edit a Section. If I edit the page and remove the "== ==" markers from the Section Titles, the Title then shows up, but not as a Section Heading. Also, I don't have any Date separators on my Watch List. This started 2 days ago. Any thoughts?

Thanks,
Marc Riddell
[[User:Michael David]]
MediaWiki to Latex Converter
by Hugo Vincent 18 Jun '12

Hi everyone,

I recently set up a MediaWiki (http://server.bluewatersys.com/w90n740/) and I need to extract the content from it and convert it into LaTeX syntax for printed documentation. I have googled for a suitable OSS solution but nothing was apparent. I would prefer a script written in Python, but any recommendations would be very welcome. Do you know of anything suitable?

Kind Regards,
Hugo Vincent,
Bluewater Systems.
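In the absence of a ready-made tool, a small regex-based pass over a few common wikitext constructs can serve as a starting point. This is purely an illustrative sketch, nowhere near a full parser, and the rules below are my own invention:

```python
import re

# Ordered rewrite rules: headings first, then ''' before '' so bold
# markers are consumed before italics, then links.
RULES = [
    (re.compile(r"^== (.+?) ==$", re.M), r"\\section{\1}"),
    (re.compile(r"^=== (.+?) ===$", re.M), r"\\subsection{\1}"),
    (re.compile(r"'''(.+?)'''"), r"\\textbf{\1}"),
    (re.compile(r"''(.+?)''"), r"\\emph{\1}"),
    # [[target|label]] or [[target]]: keep only the visible text.
    (re.compile(r"\[\[(?:[^|\]]+\|)?([^\]]+)\]\]"), r"\1"),
]

def wikitext_to_latex(text):
    for pattern, repl in RULES:
        text = pattern.sub(repl, text)
    return text

print(wikitext_to_latex("== Intro ==\nSee '''[[ARM9|the board]]''' docs."))
```

Anything beyond this (tables, templates, nested lists) is where the real effort lies, which is presumably why no complete OSS converter was easy to find.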
Deprecating content_actions
by Daniel Friesen 31 Jan '11

Right now in the skin system (if you consider Vector part of the skin system) we have two parallel methods of adding tabs to the page:

- into content_actions, via SkinTemplateTabs, SkinTemplateBuildContentActionUrlsAfterSpecialPage, and SkinTemplateContentActions;
- into Vector's navigation_urls, via SkinTemplateNavigation (the missing two hooks should be added).

The only important difference between these (besides some Vector-specific stuff that can stay in Vector) is that content_actions is a flat array, while navigation_urls is an array of arrays organized into categories. Beyond that they are basically mirrors of each other with the same functional purpose, but you need to add tabs to both of them to avoid something not showing up in Vector. There's also the misfortune that other skins can't take advantage of the organized navigation_urls without being a Vector subskin, because the actual implementation (basically a reimplementation of buildContentActionUrls, with code duplication) lives inside Vector.

Right now we have extensions using both methods of adding tabs to the page (code duplication on their part), and a few extensions that are broken in Vector because they haven't added the hooks. Doing a quick grep, the following extensions appear to be missing Vector support: Oversight, CommentPages, Todo, WikiTrust, Tasks, CategoryTree, DeleteQueue, Wikidata, Imagetabs, purgetab, Tab0, AuthorProtect, TidyTab, Purge, SpecialTalk. Shouldn't be too hard to fix, especially if we fix the bug of missing hooks for navigation_urls.

Now onto my focal point. As I've been improving the skin system, trying to pull out the thorns that make building skins troublesome and mesh in the features and helpers which are missing, I'd like to remove the content_actions hooks, deprecate content_actions in 1.18, and start using navigation_urls-style data everywhere. Since content_actions and navigation_urls carry the same data, content_actions can be built by having SkinTemplate take the navigation_urls data and flatten it into a single array, similar to how I already have $wgFooterIcons work, folding it for skins like Monobook which don't organize it the way Vector does.

The effects will be like this:

- The three content_actions-related hooks will no longer work in 1.18, so extensions that haven't started supporting Vector tabs will also stop showing tabs in other skins.
- In their place, extensions will use three navigation_urls-related hooks (most extensions are already using the one hook available).
- Extension code for those already using both forms of hooks will stay the same; the only difference is that 1.18 will use the navigation_urls-related hooks, and the content_actions-related ones will become redundant code which extensions can keep for back-compat but drop once they stop supporting pre-1.18 installations.
- All standard skins will use navigation_urls-based data; content_actions will be available but deprecated.
- Third-party skins will still function using content_actions, but it would be preferred for them to be updated to use the new BaseTemplate and the helpers in there (a navigation_urls-related one would be added) once they don't need to support pre-1.18.
- SkinTemplatePreventOtherActiveTabs will probably still work, though I may want to find a cleaner method to transition to (i.e. one that says "this is the active tab" rather than "don't make other tabs active").

Any comments, rejections?

--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]
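The proposed fold is simple in shape. A sketch of the idea in Python (the category and tab keys below are hypothetical examples following the description above, not MediaWiki's actual PHP structures):

```python
# navigation_urls style: tabs organized into categories (array of arrays).
navigation_urls = {
    "namespaces": {"main": {"text": "Page"}, "talk": {"text": "Discussion"}},
    "views": {"view": {"text": "Read"}, "edit": {"text": "Edit"}},
}

def flatten_to_content_actions(nav):
    """Fold categorized navigation_urls into a flat content_actions array."""
    flat = {}
    for category in nav.values():
        flat.update(category)  # category boundaries are simply dropped
    return flat

content_actions = flatten_to_content_actions(navigation_urls)
print(sorted(content_actions))  # ['edit', 'main', 'talk', 'view']
```

The point of the design is that the categorized form is the richer one: flattening is lossless for skins that only need a flat list, while the reverse (re-categorizing a flat array) would not be.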
Christmas server failure report
by Platonides 12 Jan '11

Earlier today, /a filled with binlogs on db27, which was the s3 & s7 master. Nagios had warned too early / nobody noticed. Slaves lagged, lots of locks, and the wikis ground to a halt. Revisions between 6:50 and 8:20 pm UTC were lost (although they can be manually reimported from db27).

The new s3 and s7 master is db17, with only one slave: db25. After the master switch, we started having problems due to cached revision text in memcached, caused by the duplication of old_id values, so we made them read-only until UTC midnight. We decided not to disable $wgRevisionCacheExpiry but to remove the faulty entries, so I quickly prepared the script maintenance/purgeStaleMemcachedText.php to clean them.

There were problems on hewiki, since data there didn't clean. In one instance, $wgMemc->get persisted even after a $wgMemc->delete on that same key (???). Other than the hewiki issues, it seemed to run fine. There will be lots of wrong entries in the diff and parser caches needing a manual action=purge, but a purge will clean them. FlaggedRevs caches were not touched; wikis using it may show the wrong content (with the additional fun of some users viewing the right one).

There are also PPFrame_DOM->expand errors that started around the same time, even on wikis in a different cluster. They usually only happen once, and reloading succeeds.

https://bugzilla.wikimedia.org/show_bug.cgi?id=26429
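The cleanup script's job amounts to deleting the cached text entries for the affected id range. A schematic sketch with a dict standing in for the memcached client (the key format and id range are invented for illustration; purgeStaleMemcachedText.php is the real script):

```python
def purge_stale_text(cache, old_ids, key_fmt="revisiontext:{}"):
    """Delete cached revision text for the given old_id values.

    `cache` is any mapping-like store; with a real memcached client
    you would issue a delete for each key instead of pop().
    """
    purged = 0
    for old_id in old_ids:
        key = key_fmt.format(old_id)
        if cache.pop(key, None) is not None:
            purged += 1
    return purged

# Stand-in cache with two stale entries and one unrelated key.
cache = {"revisiontext:100": "old", "revisiontext:101": "old", "other": "x"}
print(purge_stale_text(cache, range(100, 105)))  # 2
```

The hewiki anomaly described above (a get succeeding after a delete) is exactly the failure mode such a script cannot defend against on its own, since it trusts the store to honor deletes.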
dataset1, xml dumps
by Ariel T. Glenn 11 Jan '11

For folks who have not been following the saga on http://wikitech.wikimedia.org/view/Dataset1: we were able to get the RAID array back in service last night on the XML data dumps server, and we are now busily copying data off of it to another host. There's about 11T of dumps to copy over; once that's done, we will start serving these dumps read-only to the public again. Because the state of the server hardware is still uncertain, we don't want to do anything that might put the data at risk until that copy has been made.

The replacement server is on order and we are watching that closely. We have also been working on deploying a server to run one round of dumps in the interim.

Thanks for your patience (which is a way of saying: I know you are all out of patience, as am I, but hang on just a little longer).

Ariel

10 Jan '11
[crossposted to foundation-l and wikitech-l]

"There has to be a vision though, of something better. Maybe something that is an actual wiki, quick and easy, rather than the template coding hell Wikipedia's turned into." - something Fred Bauder just said on wikien-l.

Our current markup is one of our biggest barriers to participation. AIUI, edit rates are about half what they were in 2005, even as our fame has gone from "popular" through "famous" to "part of the structure of the world." I submit that this is not a good or healthy thing in any way and needs fixing. People who can handle wikitext really just do not understand how offputting the computer guacamole is to people who can cope with text they can see.

We know this is a problem; WYSIWYG that works is something that's been wanted here forever. There are various hideous technical nightmares in its way, that make this a big and hairy problem, of the sort where the hair has hair. However, I submit that it's important enough that we need to attack it with actual resources anyway.

This is just one data point, where a Canadian government office got *EIGHT TIMES* the participation in their intranet wiki by putting in a (heavily locally patched) copy of FCKeditor:

http://lists.wikimedia.org/pipermail/mediawiki-l/2010-May/034062.html
"I have to disagree with you given my experience. In one government department where MediaWiki was installed we saw the active user base spike from about 1000 users to about 8000 users within a month of having enabled FCKeditor. FCKeditor definitely has its warts, but it very closely matches the experience non-technical people have gotten used to while using Word or WordPerfect. Leveraging skills people already have cuts down on training costs and allows them to be productive almost immediately."

http://lists.wikimedia.org/pipermail/mediawiki-l/2010-May/034071.html
"Since a plethora of intelligent people with no desire to learn WikiCode can now add content, the quality of posts has been in line with the adoption of wiki use by these people. Thus one would say it has gone up.

"In the beginning there were some hard-core users that learned WikiCode; for the most part they have indicated that when the WYSIWYG fails, they are able to switch to WikiCode mode to address the problem. This usually occurs with complex table nesting, which is something that few of the users do anyway. Most document layouts are kept simple. Additionally, we have a multilingual English/French wiki. As a result the browser spell-check is insufficient for the most part (not to mention it has issues with WikiCode). To address this, a second spellcheck button was added to the interface so that both English and French spellcheck could be available within the same interface (via an aspell backend)."

So, the payoffs could be ridiculously huge: eight times the number of smart and knowledgeable people even being able to *fix typos* on material they care about.

Here are some problems. (Off the top of my head; please do add more, all you can think of.)

- The problem:

* Fidelity with the existing body of wikitext. No conversion flag day. The current body exploits every possible edge case in the regular expression guacamole we call a "parser". Tim said a few years ago that any solution has to account for the existing body of text.
* Two-way fidelity. Those who know wikitext will demand to keep it and will bitterly resist any attempt to take it away from them.
* FCKeditor (now CKEditor) in MediaWiki is all but unmaintained.
* There is no specification for wikitext. Well, there almost is: compiled as C, it runs a bit slower than the existing PHP compiler. But it's a start! http://lists.wikimedia.org/pipermail/wikitext-l/2010-August/000318.html

- Attempting to solve it:

* The best brains around Wikipedia, MediaWiki and WMF have dashed their foreheads against this problem for at least the past five years and have got *nowhere*. Tim has a whole section in the SVN repository for "new parser attempts". Sheer brilliance isn't going to solve this one.
* Tim doesn't scale. Most of our other technical people don't scale. *We have no resources and still run on almost nothing*. ($14m might sound like enough money to run a popular website, but for comparison, I work as a sysadmin at a tiny, tiny publishing company with more money and staff just in our department than that, to do *almost nothing* compared to what WMF achieves. WMF is an INCREDIBLY efficient organisation.)

- Other attempts:

* Starting from a clear field makes it ridiculously easy. The government example quoted above is one. Wikia wrote a good WYSIWYG that works really nicely on new wikis (I'm speaking here as an experienced wikitext user who happily fixes random typos on Wikia). Of course, as I noted, we can't start from a clear field: we have an existing body of wikitext.

So, specification of the problem:

* We need good WYSIWYG. The government example suggests that a simple word-processor-like interface would be enough to give tremendous results.
* It needs two-way fidelity with almost all existing wikitext.
* We can't throw away existing wikitext, much as we'd love to.
* It's going to cost money in programming the WYSIWYG.
* It's going to cost money in rationalising existing wikitext so that the most unfeasible formations can be shunted off to legacy for chewing on.
* It's going to cost money in usability testing and so on.
* It's going to cost money for all sorts of things I haven't even thought of yet.

This is a problem that would pay off hugely to solve, and that will take actual money thrown at it. How would you attack this problem, given actual resources for grunt work?

- d.
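The two-way-fidelity requirement is at least easy to state as a test: whatever intermediate document model the editor uses, parsing wikitext into it and serializing back must reproduce the input byte for byte. A sketch of that harness, where the parse/serialize pair is a trivial stand-in for a real parser:

```python
def parse(wikitext):
    # Stand-in for the real wikitext -> document-model parser;
    # here the "model" is just a list of lines.
    return wikitext.split("\n")

def serialize(model):
    # Stand-in for the inverse transform back to wikitext.
    return "\n".join(model)

def check_roundtrip(samples):
    """Return the samples that fail byte-for-byte round-trip fidelity."""
    return [s for s in samples if serialize(parse(s)) != s]

corpus = ["== Heading ==", "'''bold''' and [[link]]", "{| nested {{template}}"]
print(check_roundtrip(corpus))  # [] - every sample survives
```

Run against a dump of real articles instead of this toy corpus, a harness like this is how one would measure "fidelity with almost all existing wikitext" as a number rather than a hope.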

04 Jan '11
Hi all,

I have looked through the web for the 20080726 version of the dump file "pages-articles.xml.bz2", but I can't find any result. Can anybody provide me a download link? Thanks a lot!

Following is a summary of the other versions of this file I have found so far. I hope they are useful for you.

2010-10-11: http://download.wikimedia.org/enwiki/20101011/
2010-09-16: http://download.wikimedia.org/enwiki/20100916/
2010-09-04: http://download.wikimedia.org/enwiki/20100904/
2010-08-17: enwiki-20100817-pages-articles.xml.bz2 <http://www.monova.org/details/3873361/enwiki-20100817-pages-articles.xml.bz…> (6.06 GiB) on monova.org
2010-07-30: enwiki-20100730-pages-articles.xml.bz2 <http://www.monova.org/details/3869561/enwiki-2010730-pages-articles.xml.bz2…> (6.07 GiB) on monova.org
2010-05-14: enwiki-20100514-pages-articles.xml.bz2 <http://www.monova.org/details/3780808/enwiki-20100514-pages-articles.xml.bz…> (5.87 GiB) on monova.org; also http://dumps.wikimedia.org/archive/enwiki/20100514/
2010-03-12: http://dumps.wikimedia.org/archive/enwiki/20100312/
2010-01-30: http://download.wikimedia.org/enwiki/20100130/
2009-10-09: http://jeffkubina.org/data/download.wikimedia.org/enwiki/20091009/
2009-06-18: The Pirate Bay <http://thepiratebay.org/torrent/4978482> has enwiki-20090618-pages-articles.xml <http://torrents.thepiratebay.org/4978482/enwiki-20090618-pages-articles.xml…>, 4.9 GiB (5258589574 bytes)
2008-10-08: http://jeffkubina.org/data/download.wikimedia.org/enwiki/20081008/
2008-06-21: http://www.torrentportal.com/details/4621368/Wikipedia+Wiki+Static+HTML+Dum…
2008-01-03: http://jeffkubina.org/data/download.wikimedia.org/enwiki/20080103/; English Wikipedia dump from 2008-01-03 <http://www.archive.org/details/enwiki-20080103>

Best,
Monica