
Wikitech-l June 2008

wikitech-l@lists.wikimedia.org
  • 86 participants
  • 93 discussions
Database dumps
by Byrial Jensen 17 Apr '25

Until some weeks ago, http://dumps.wikimedia.org/backup-index.html used to show 4 dumps in progress at the same time. That meant that new database dumps were normally available within about 3 weeks for all databases except enwiki and maybe dewiki, where the dump process took longer due to size.

However, the 4 simultaneous dump processes became 3 some weeks ago, and after massive failures on June 4, only one dump has been in progress at a time. At the current speed it will take several months to get through all the dumps.

Is it possible to speed up the process again by running several dump processes at the same time?

Thank you,
Byrial
EBNF grammar project status?
by Steve Bennett 01 Apr '25

What's the status of the project to create a grammar for Wikitext in EBNF? There are two pages:

http://meta.wikimedia.org/wiki/Wikitext_Metasyntax
http://www.mediawiki.org/wiki/Markup_spec

Nothing seems to have happened since January this year. Also, the comments on the latter page seem to indicate a lack of a clear goal: is this just a fun project, is it to improve the existing parser, or is it to facilitate a new parser? It's obviously a lot of work, so it needs to be of clear benefit.

Brion requested the grammar IIRC (and there's a comment to that effect at http://bugzilla.wikimedia.org/show_bug.cgi?id=7), so I'm wondering what became of it. Is there still a goal of replacing the parser? Or is there some alternative plan?

Steve
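For readers unfamiliar with the approach: such a grammar would describe wikitext constructs as EBNF production rules. A minimal illustrative sketch for internal links (invented for this message, not taken from either of the pages above, and glossing over the many edge cases that make the real task hard) might look like:

```ebnf
internal link = "[[", page name, [ "|", link text ], "]]" ;
page name     = name char, { name char } ;
link text     = name char, { name char } ;
name char     = ? any character except "[", "]", "|" and newline ? ;
```

The difficulty the project faces is that real wikitext has no such clean rules: templates, apostrophe-based markup and error recovery all interact, which is part of why the effort keeps stalling.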
MediaWiki to Latex Converter
by Hugo Vincent 18 Jun '12

Hi everyone,

I recently set up a MediaWiki (http://server.bluewatersys.com/w90n740/) and I need to extract the content from it and convert it into LaTeX syntax for printed documentation. I have googled for a suitable OSS solution but nothing was apparent. I would prefer a script written in Python, but any recommendations would be very welcome. Do you know of anything suitable?

Kind Regards,
Hugo Vincent,
Bluewater Systems.
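In the absence of a ready-made tool, the core of such a script is a wikitext-to-LaTeX text transform. A minimal sketch of the conversion step (covering only headings, bold, italics and internal links; a real converter would need tables, templates, lists and escaping of LaTeX special characters):

```python
import re

def wikitext_to_latex(text):
    """Very naive wikitext -> LaTeX conversion covering a few constructs."""
    # Headings: == Title == -> \section{Title}
    text = re.sub(r"^==\s*(.+?)\s*==\s*$", r"\\section{\1}", text, flags=re.M)
    # Bold: '''x''' -> \textbf{x} (must run before italics)
    text = re.sub(r"'''(.+?)'''", r"\\textbf{\1}", text)
    # Italics: ''x'' -> \emph{x}
    text = re.sub(r"''(.+?)''", r"\\emph{\1}", text)
    # Internal links: [[Page|label]] or [[Page]] -> keep only the visible text
    text = re.sub(r"\[\[(?:[^|\]]*\|)?([^\]]+)\]\]", r"\1", text)
    return text

if __name__ == "__main__":
    sample = "== Intro ==\nThis is '''bold''' and ''italic'' with a [[Main Page|link]]."
    print(wikitext_to_latex(sample))
```

The raw wikitext itself can be pulled from the wiki with `Special:Export` or `action=raw` and fed through a function like this, page by page.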

13 Oct '09
I've been putting placeholder images on a lot of articles on en:wp, e.g. [[Image:Replace this image male.svg]], which goes to [[Wikipedia:Fromowner]], which asks people to upload an image if they own one. I know it's inspired people to add free content images to articles in several cases. What I'm interested in is numbers. So what I'd need is a list of edits where one of the SVGs that redirects to [[Wikipedia:Fromowner]] is replaced with an image. (Checking which of those are actually free images can come next.)

Is there a tolerably easy way to get this info from a dump? Any Wikipedia statistics fans who think this'd be easy?

(If the placeholders do work, then it'd also be useful convincing some wikiprojects to encourage the things. Not that there's ownership of articles on en:wp, of *course* ...)

- d.
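One way to approach this with a pages-meta-history dump is to stream consecutive revision pairs of each article and flag the revision in which a placeholder filename disappears while an image link remains. A rough sketch (the placeholder list and the revision-pair input format are assumptions; a real run would stream the dump XML):

```python
# Placeholder filenames to look for; this list is illustrative, not complete.
PLACEHOLDERS = ("Replace this image male.svg", "Replace this image female.svg")

def placeholder_replaced(old_text, new_text):
    """True if the edit removed a placeholder image while still
    containing some image link, i.e. it was replaced, not just deleted."""
    for name in PLACEHOLDERS:
        if name in old_text and name not in new_text and "[[Image:" in new_text:
            return True
    return False

def count_replacements(revision_pairs):
    """revision_pairs: iterable of (old_wikitext, new_wikitext) tuples for
    consecutive revisions of each article, as streamed from a dump."""
    return sum(1 for old, new in revision_pairs if placeholder_replaced(old, new))
```

Collecting the matching (page, revision) identifiers instead of a bare count would give the edit list asked for above.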
<p>s all over the place
by Sergey Chernyshev 03 Jul '08

Hi,

I'm getting complaints from users of my Widgets and HeaderTabs extensions' parser functions that their output is mangled with <p> tags. Looking into the issue, I identified two separate problems, and I'll be very happy if you can confirm that I'm correct and help me resolve them.

First issue: the output of all parser functions is preceded with "\n\n" in Parser.php line 2975 (on the current stable 1.12 branch), which purposefully forces a closing </p><p> combination. This contradicts users' expectation that the output will actually be inline if they put the parser function inline. Here's the code:

    # Replace raw HTML by a placeholder
    # Add a blank line preceding, to prevent it from mucking up
    # immediately preceding headings
    if ( $isHTML ) {
        $text = "\n\n" . $this->insertStripItem( $text );
    }

This is quite distracting, since there is no way to work around it in extensions or page text.

Second issue: if the output of the function has line breaks in it, it gets populated with lots of <p> tags, which might not be desirable if the extension is supposed to preserve HTML structure (e.g. the Widgets extension). I found a piece of instruction on how to avoid it by using unique markers and the 'ParserAfterTidy' hook:

http://www.mediawiki.org/wiki/Manual:Tag_extensions#How_can_I_avoid_modific…

Please let me know if this is still going to work correctly with the new parser implementation.

I'll greatly appreciate your help with the matter.

Thank you, Sergey

--
Sergey Chernyshev
http://www.sergeychernyshev.com/
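The unique-marker technique linked above can be described generically: the extension emits an opaque marker instead of its raw HTML, the paragraph-wrapping pass runs over the markers (which it cannot mangle), and a post-tidy hook swaps the real HTML back in. A language-neutral sketch in Python (the class and function names are illustrative, not MediaWiki's API):

```python
import uuid

class MarkerStore:
    """Hide raw HTML behind unique markers so an intermediate step
    (like paragraph wrapping) cannot mangle it; restore it afterwards."""

    def __init__(self):
        self._store = {}

    def hide(self, html):
        marker = "UNIQ-" + uuid.uuid4().hex + "-QINU"
        self._store[marker] = html
        return marker

    def restore(self, text):
        for marker, html in self._store.items():
            text = text.replace(marker, html)
        return text

def add_paragraphs(text):
    # Stand-in for the tidy/paragraph-wrapping step that inserts <p> tags.
    return "<p>" + text + "</p>"

store = MarkerStore()
marker = store.hide("<div>widget\noutput</div>")
wrapped = add_paragraphs(marker)   # the marker passes through unharmed
final = store.restore(wrapped)     # raw HTML reinstated after wrapping
```

Because the marker contains no line breaks or HTML, the wrapping pass treats it as inert text, which is the whole point of the trick.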
Deprecate $wgSysopUserBans?
by Max Semenik 02 Jul '08

This option made sense when blocking registered users was an experimental feature, but currently it's set to true everywhere, and there's hardly any third-party wiki that actually uses it. So, does it make sense to remove this option completely?

--
Max Semenik ([[User:MaxSem]])
On Fri, Jun 27, 2008 at 11:06 AM, <demon(a)svn.wikimedia.org> wrote:
> Log Message:
> -----------
> Add no-ops for the (un)lock functions.
>
> Modified Paths:
> --------------
> trunk/phase3/includes/db/DatabaseMssql.php
> trunk/phase3/includes/db/DatabaseOracle.php
> trunk/phase3/includes/db/DatabaseSqlite.php

You know, maybe it would be an interesting idea to actually use real polymorphism in the Database class rather than making Database == DatabaseMySQL and having everything else override that? How about the no-op lock() and unlock() go in Database, and get overridden in DatabaseMySQL (which is the only one where they're different)?
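The suggested refactoring is the standard base-class-default pattern: the generic class carries the safe no-op, and only the backend that genuinely supports the operation overrides it. A sketch of the idea (class names mirror the MediaWiki ones, but this is an illustration in Python, not the actual PHP implementation):

```python
class Database:
    """Generic base class: locking is a no-op by default, so backends
    without advisory locks need no code at all."""

    def lock(self, name, timeout=10):
        return True  # pretend the lock was acquired

    def unlock(self, name):
        return True

class DatabaseMySQL(Database):
    """Only the backend that actually supports named locks overrides."""

    def lock(self, name, timeout=10):
        # A real implementation would issue SELECT GET_LOCK(name, timeout);
        # here we just return the SQL it would run, for illustration.
        return "GET_LOCK('%s', %d)" % (name, timeout)

class DatabaseSqlite(Database):
    pass  # inherits the no-ops; no per-backend stub needed
```

With this arrangement, adding a new backend without lock support requires no code at all, instead of each backend having to stub out MySQL-specific behavior.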
Feed Icon in Browsers Adress Bar
by Alexander Kluge 29 Jun '08

Hi List,

How is it possible to get the (RSS/Atom) feed icon in the address bar of the browser? At the moment I use the WikiFeeds extension, which creates its own feed that is shown in the address bar. But I want MediaWiki's internal feed (which displays the changes much better), the one you get when you click the Recent Changes page. So how do I show this feed in the address bar of the browser, so that the feed can be subscribed to on every page of the wiki?

PS: Is there an implementation planned with which you can configure the feed?

Thanks in advance,
Regards, Alex

--
*Game based eVideo*
An ESF-funded project of FHTW Berlin.
eLearning | eVideo | Web 2.0 | Serious Games
office: +49 (0)30 50 19 26 47
http://evideo.fhtw-berlin.de
Alexander Kluge
http://www.alexkluge.de
Student assistant
mobile: +49 (0)163 60 51 036
alexander.kluge(a)fhtw-berlin.de
FHTW - Fachhochschule für Technik und Wirtschaft
Treskowallee 8, 10318 Berlin
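For context on the mechanism being asked about: browsers show the feed icon for whatever feed is advertised via a <link rel="alternate"> element in the page's <head>. The general shape, independent of any particular extension (the href below is an example URL pattern, and the exact path depends on the wiki's configuration), is:

```html
<link rel="alternate" type="application/rss+xml"
      title="Recent Changes"
      href="/index.php?title=Special:RecentChanges&amp;feed=rss" />
```

Whichever feed URL a skin emits in this element on every page is the one the browser offers for subscription site-wide.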
GSoC Project: Category Moving
by Tim Johansson 29 Jun '08

Hello, wikitech.

I have applied to Google Summer of Code with the project to enable category moving without using bots. After some correspondence with Catrope, the following text is my project idea. Any feedback would be welcome.

Synopsis

I will provide the capability of moving categories, achieving an effect for the end user similar to that of moving other pages. Currently, contributors must apply to use a bot that recreates the category page and changes the category link on all relevant articles.

Project

The project can be divided into three parts. First, the category page is moved, along with its history, just as renaming of articles works. A redirect is optionally placed on the old category page, and the category discussion is moved as well.

Second, all articles in the relevant category must have their category links changed. There are several obstacles involved in this task:

1. Finding all alternative ways of categorizing articles. It is simple to match the simple category links and category lists, but more difficult to find e.g. categories included from a template. Roan Kattouw (Catrope) suggested category redirects for this, such that all articles categorised as [[Category:A]] would also be listed at [[Category:B]] if the former has been redirected to the latter.

2. Articles might be in the process of being edited as the move is done. This, however, can be solved in the same manner as edit collisions are currently solved.

3. The algorithm would likely have high complexity and would thus not scale well with very large categories.

This is likely to constitute a significant and challenging part of the project.

As the last step, the relevant entries in the categorylinks table would need to be changed. This is accomplished by a simple SQL query. This could be avoided if bug #13579 [1] ("Category table should use category ID rather than category name") is fixed, which it could be as part of this project.

The project would preferably be written as a patch to the core. Catrope suggested setting up a separate SVN branch for the project, such that everyone can see my progress.

Benefits for MediaWiki

Developing a means of moving categories would decrease dependency on bots, saving administrative time. Additionally, the solution would be faster than any bot-relying solution could be due to, among other things, the removed need to load pages. Category moving would also increase the consistency of layout across the different article types. The only real reason for a "move" tab not to reside on category pages is that the feature is not yet implemented.

Roadmap

Publishing this document to the MediaWiki development community (wikitech-l) and awaiting comments on the planned procedure would be the first step. After the community bonding period specified by the timeline, a week should be enough to get comfortable with the relevant MediaWiki code and implement the first section: moving the category page along with its discussion and history. Much old code should be reusable here, such as the Title::moveTo() method for moving pages.

Until the middle of July, most of the second part of the project should be finished. In a week from there, the last part would be completed, too. A month is then reserved for bug-testing, tweaking and as a buffer for unexpected obstacles. The MediaWiki community is very important in this step for testing and feedback.

Regards
--
Tim Johansson
http://timjoh.com/
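The "simple SQL query" for the last step can be sketched as a single statement (cl_to is the category-name column in MediaWiki's categorylinks table; treat this as an illustration rather than the final patch, since the real change would also need to run inside the move transaction and handle collisions with an existing target category):

```sql
-- Repoint all membership rows from the old category name to the new one.
UPDATE categorylinks
SET cl_to = 'New_category_name'
WHERE cl_to = 'Old_category_name';
```

If bug #13579 were fixed so that membership rows referenced a category ID, a rename would touch only the category row itself and this bulk update would disappear.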

28 Jun '08
Hi,

I'm still working on upgrading us to 1.12, and after upgrading the extensions, I'm noticing that the 1.12 upgrade still takes a lot longer to render a page. For ( $elapsed = $now - $wgRequestTime; ) on 1.9 I usually see on average about 0.15 seconds to serve a page, while on 1.12 I'm seeing more like 0.50 seconds, with both installations running on the same server and connected to the same DB server, etc.

Here are some of the functions I'm seeing take a while in the profiler:

    462.884 MediaWiki::initialize
    395.097 MediaWiki::performAction
    328.284 Article::view
    233.001 Parser::parse
    232.899 Parser::parse-Article::outputWikiText
    134.212 Parser::internalParse
    103.388 Parser::replaceVariables
    252.385 MediaWiki::finalCleanup
    251.857 OutputPage::output
    251.402 Output-skin
    251.332 SkinTemplate::outputPage
    209.752 SkinTemplate::outputPage-execute
    855.008 -total

The full profile is here: http://69.20.102.10/x/profile_deep.txt

Any ideas? I've tried drilling down into finding out why some of these are taking a long time. replaceInternalLinks seems to take a long time sometimes because the LocalFile::loadFromDB DB select statement sometimes takes over 50ms, but it looks like our indices in the image table are fine.

    mysql> check table image;
    +------------------+-------+----------+----------+
    | Table            | Op    | Msg_type | Msg_text |
    +------------------+-------+----------+----------+
    | wikidb_112.image | check | status   | OK       |
    +------------------+-------+----------+----------+
    1 row in set (1.73 sec)

Thanks,
Travis
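One caveat on the diagnostic above: CHECK TABLE verifies table integrity, not whether a query uses an index. To confirm that the slow select is actually hitting the key on the image table, EXPLAIN on a query of the same shape is more informative (the query below is a guess at what LocalFile::loadFromDB runs, simplified for illustration):

```sql
-- 'key' in the output should show the img_name index being used;
-- a 'type' of ALL would indicate a full table scan instead.
EXPLAIN SELECT * FROM image WHERE img_name = 'Example.jpg';
```

If the index is used and the select is still intermittently slow, the cause is more likely lock contention or disk I/O than a missing index.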