
Wikitech-l March 2007

wikitech-l@lists.wikimedia.org
  • 115 participants
  • 173 discussions
MediaWiki to Latex Converter
by Hugo Vincent 18 Jun '12

Hi everyone,

I recently set up a MediaWiki (http://server.bluewatersys.com/w90n740/) and I need to extract the content from it and convert it into LaTeX syntax for printed documentation. I have googled for a suitable OSS solution but nothing was apparent. I would prefer a script written in Python, but any recommendations would be very welcome.

Do you know of anything suitable?

Kind Regards,
Hugo Vincent,
Bluewater Systems.
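No ready-made converter is named in the thread; below is a minimal Python sketch of one starting point, assuming the wiki serves raw wikitext via index.php?action=raw (standard in MediaWiki of that era) and Python 2 stdlib. The wiki URL comes from the message; the page name and the handful of rewrite rules are illustrative only, and real wikitext (templates, tables, nesting) needs a proper parser.

import re
import urllib

# Fetch raw wikitext; action=raw is MediaWiki's plain-text view of a page.
WIKI = "http://server.bluewatersys.com/w90n740/index.php"
PAGE = "Main_Page"  # illustrative page name

wikitext = urllib.urlopen("%s?title=%s&action=raw" % (WIKI, PAGE)).read()

# Map a few common wikitext constructs to LaTeX. Order matters:
# match the longer delimiters (===, ''') before the shorter ones.
rules = [
    (re.compile(r"^=== *(.+?) *===$", re.M), r"\\subsection{\1}"),
    (re.compile(r"^== *(.+?) *==$", re.M),   r"\\section{\1}"),
    (re.compile(r"'''(.+?)'''"),             r"\\textbf{\1}"),
    (re.compile(r"''(.+?)''"),               r"\\emph{\1}"),
    (re.compile(r"\[\[(?:[^\]|]*\|)?([^\]]+)\]\]"), r"\1"),  # keep link text, drop target
]

latex = wikitext
for pattern, replacement in rules:
    latex = pattern.sub(replacement, latex)

print(latex)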
10k
by Domas Mituzas 01 Dec '07

Hi,

today we came over 10k HTTP requests per second (even with inter-squid traffic eliminated). Especially thanks to Mark and Tim, who've been improving our caching, as well as doing lots of other work, and achieved incredible results (while I was slacking). Really, thanks!

Domas
Rating Articles Extension
by Travis Derouin 19 Jun '07

Hey,

I've put together an extension for rating articles if anyone is interested. It's just a first version and hasn't been tested much, but the details can be found here: http://www.wikihow.com/WikiHow:RateArticle-Extension

You can see an example here on our development server: http://wiki16.wikidiy.com/Get-a-Better-Deal-on-a-Home-Loan (username / password: wikihow / wikihow2006) - scroll down to the bottom of the page for the checkmarks.

I'd appreciate feedback if anyone has any. If someone wants to add this to extensions in svn, that'd be great.

Thanks,
Travis
table text is full
by Travis Derouin 22 Apr '07

We have reached the maximum table size limit of 4GB for our text table - what's the best way around or to fix this?

Travis
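The replies aren't shown here, but the 4GB ceiling matches MyISAM's default data-file limit, which comes from the row-pointer size MySQL derives from MAX_ROWS/AVG_ROW_LENGTH at table-creation time. A sketch of the usual fix, raising those hints so the table is rebuilt with wider pointers; the credentials are placeholders, the rebuild locks the table for a long time on a large wiki, and converting the table to InnoDB is another way out.

import MySQLdb

# Placeholder connection details for the wiki's database.
db = MySQLdb.connect(host="localhost", user="wikiuser", passwd="secret", db="wikidb")
cur = db.cursor()

# MyISAM sizes its internal row pointers from MAX_ROWS/AVG_ROW_LENGTH;
# the defaults give a 4GB data file. Raising the hints lifts the cap.
cur.execute("ALTER TABLE `text` MAX_ROWS=1000000000 AVG_ROW_LENGTH=2048")

# Confirm the new Max_data_length.
cur.execute("SHOW TABLE STATUS LIKE 'text'")
print(cur.fetchone())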

by Reid Priedhorsky 06 Apr '07
Dear Wikitechnicians,

My name is Reid Priedhorsky, and I'm a Ph.D. student at GroupLens Research, which is the human-computer interaction group at the University of Minnesota.

We are currently working on some research which is investigating Wikipedia contribution and vandalism. To this end, statistics on the view rate of different articles would be extremely helpful to us -- something along the lines of Leon Weber's WikiCharts tool, but with a larger limit (ideally all 1.7 million articles).

It seems to me that the easiest way to accomplish this would be to get copies of your sampled Squid logs (as described on <http://lists.wikimedia.org/pipermail/wikitech-l/2007-January/029000.html> and its links). We do not need the client IP or any other similarly sensitive data, though if you gave it to us we would protect it as carefully as we protect the other sensitive research data we handle.

Would it be possible for us to have access to these log files? If not, I would love to begin a discussion on what it would be possible for us to access.

Your help would be greatly appreciated. Please let me know if you have any questions.

Thanks,
Reid
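The processing side of what Reid describes is simple once the logs are in hand. A minimal sketch of tallying per-article views from a sampled log, assuming (the thread doesn't give the format) that the requested URL appears somewhere on each line:

import re
import sys

# Tally per-article hits from a sampled Squid access log on stdin.
article = re.compile(r"http://en\.wikipedia\.org/wiki/([^ ?#]+)")

counts = {}
for line in sys.stdin:
    m = article.search(line)
    if m:
        title = m.group(1)
        counts[title] = counts.get(title, 0) + 1

# With 1-in-N sampling, multiply each count by N to estimate true views.
# Print the twenty most-requested titles.
top = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)[:20]
for title, n in top:
    print("%s\t%d" % (title, n))

Run it as e.g. zcat sampled.log.gz | python viewcounts.py (the filenames are illustrative).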
Saving pages from WP with stylesheet
by Erik Moeller 05 Apr '07

Whenever I save a page from Wikipedia, or any other MW site, using Firefox, the resulting HTML loses its screen stylesheet. This seems to be due to the way we are embedding the stylesheet, i.e.:

<style type="text/css" media="screen,projection">/*<![CDATA[*/ @import "/skins-1.5/monobook/main.css?61"; /*]]>*/</style>

This appears to cause Firefox to not treat the stylesheet as part of the page it needs to retrieve. Is there a way to avoid this behavior, either on the client or on the server side?

--
Peace & Love,
Erik

DISCLAIMER: This message does not represent an official position of the Wikimedia Foundation or its Board of Trustees.

"An old, rigid civilization is reluctantly dying. Something new, open, free and exciting is waking up." -- Ming the Mechanic
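One client-side workaround consistent with what Erik describes (Firefox's save feature downloads stylesheets referenced by <link>, but not ones pulled in with @import from a <style> block) is to post-process the saved file, fetching each @import target and splicing the CSS in directly. A sketch; the base URL and filenames are assumptions:

import re
import urllib

BASE = "http://en.wikipedia.org"  # assumed origin of the saved page

def inline_css(match):
    # Fetch the @import'ed stylesheet and inline its text, so the
    # saved copy no longer depends on the live server.
    return urllib.urlopen(BASE + match.group(1)).read()

html = open("saved_page.html").read()
fixed = re.sub(r'@import\s*"([^"]+)"\s*;', inline_css, html)
open("saved_page_fixed.html", "w").write(fixed)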
Feature request: Random article in specified categories
by Manne Tallmarken

It would be really nice if Wikipedia's Special:Random function could take arguments, making it possible to get a random article from user-specified categories.

A suggestion to do this is as follows. When a user clicks on the "random article" link, the random article is loaded with a comment at the top of the page saying something like "you requested a random article. click here to view options", and so the user can mark which categories the article should come from. The settings can either be stored in a cookie or they can be sent in the URI.

Everyone would love this feature!

regards,
Manne Tallmarken

--
View this message in context: http://www.nabble.com/Feature-request%3A-Random-article-in-specified-catego…
Sent from the Wikipedia Developers mailing list archive at Nabble.com.
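Special:Random works by storing a precomputed random number on each page row (page_random) and picking the first page above a random threshold; narrowing the same trick to one category is one way the request could be implemented. A sketch against the stock schema, with placeholder credentials and category name:

import random
import MySQLdb

db = MySQLdb.connect(host="localhost", user="wikiuser", passwd="secret", db="wikidb")
cur = db.cursor()

def random_in_category(category):
    # Same trick Special:Random uses, restricted to category members.
    # A real implementation retries with threshold 0 if nothing matches.
    r = random.random()
    cur.execute(
        "SELECT page_title FROM page"
        " JOIN categorylinks ON cl_from = page_id"
        " WHERE cl_to = %s AND page_namespace = 0 AND page_random >= %s"
        " ORDER BY page_random LIMIT 1",
        (category, r))
    row = cur.fetchone()
    return row and row[0]

print(random_in_category("Physics"))  # placeholder category name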
Possible changes regarding *all* Wikimedia IRC bots
by Sean Whitton (Xyrael) 01 Apr '07

Hey all. Please excuse the fact that this is posted to multiple mailing lists, but I need to ensure that it reaches all concerned.

I'm writing as an IRC Group Contact to call all operators of Wikimedia IRC bots to state their activity and use of their bot to me, using a private e-mail, as I'm doing a bit of reorganisation. I have a few issues to address and would like to try and smooth out the rather convoluted situation we have right now.

As many will know, I'm a freenode staffer and as such am in a position to notice the lack of continuity present. In particular many of our bots use flood-protection exemption due to the nature of their much-data roles. Unfortunately, with the death of freenode's founder Rob Levin, there is no longer a very organised record of which bots are doing what and where, and I would really like to establish a better one. By doing this I aim to make things a lot easier for bot operators to get the information/permissions they need. Please note that this has nothing to do with MediaWiki bot flags or community permission, which is still very important for bots that edit as well as speak on IRC.

I would like to make two major changes to what we do at the moment.

Firstly, I would like to cloak all active bots with wikimedia/bot/nick (wikimedia can be replaced by wikipedia, wikisource etc.) and add them to a list I'll keep on meta (not set up yet, will see how this goes first!). At the moment, there are a good few usercloak/bot/botnick (e.g. I have wikimedia/xyrael/bot/winesteward) and a few wikimedia/bot/botnicks around. I would like to make the distinction that usercloak/bot/botnick is a non-Wikimedia bot and that project/bot/botnick is for those that are run specifically for one of our projects.

Secondly, I would like to use one bot o:line for flood protection. Currently, there are a good few floating about, and I'm not sure who is actually using them actively - a cull of bots no longer in use would be good from freenode's perspective. There are obvious trust issues with this in that one password leak would be a lot of trouble compared with bot operators guarding their own personal passwords, but I think it's worth it because of the two-stage process of changing the o:lines: me as group contact and then as staff, then actually getting hold of someone to make the change.

I realise that I've rambled a bit here, and so I'll summarise my requests:

* That all operators of IRC bots contact me via e-mail telling me their bot nickname, what it does and what it is cloaked with, as well as a note if it uses an o:line. I can then recloak them with your assistance. This also has the purpose of weeding out inactive bots (no action being taken yet, though).
* That anyone who knows an operator who doesn't read any of the lists forwards this to them and asks them to complete my request, perhaps translating if necessary.
* That any ideas/complaints/ways-that-are-significantly-better-than-this are expressed in this mailing list conversation!

Thank you,

[[m:User:Xyrael]]
Friendly IRC group contact

--
Sean Whitton (Xyrael/xyr)
sean(a)silentflame.com
http://xyrael.net/
MediaWiki automated test run failure 2007-03-31
by brion@pobox.com 31 Mar '07

An automated run of parserTests.php showed the following failures:

This is MediaWiki version 1.10alpha (r20860).

Reading tests from "maintenance/parserTests.txt"...
Reading tests from "extensions/Cite/citeParserTests.txt"...
Reading tests from "extensions/Poem/poemParserTests.txt"...

17 still FAILING test(s) :(
* URL-encoding in URL functions (single parameter) [Has never passed]
* URL-encoding in URL functions (multiple parameters) [Has never passed]
* Table security: embedded pipes (http://mail.wikipedia.org/pipermail/wikitech-l/2006-April/034637.html) [Has never passed]
* Link containing double-single-quotes '' (bug 4598) [Has never passed]
* message transform: <noinclude> in transcluded template (bug 4926) [Has never passed]
* message transform: <onlyinclude> in transcluded template (bug 4926) [Has never passed]
* BUG 1887, part 2: A <math> with a thumbnail- math enabled [Has never passed]
* HTML bullet list, unclosed tags (bug 5497) [Has never passed]
* HTML ordered list, unclosed tags (bug 5497) [Has never passed]
* HTML nested bullet list, open tags (bug 5497) [Has never passed]
* HTML nested ordered list, open tags (bug 5497) [Has never passed]
* Inline HTML vs wiki block nesting [Has never passed]
* Mixing markup for italics and bold [Has never passed]
* dt/dd/dl test [Has never passed]
* Images with the "|" character in the comment [Has never passed]
* Parents of subpages, two levels up, without trailing slash or name. [Has never passed]
* Parents of subpages, two levels up, with lots of extra trailing slashes. [Has never passed]

Passed 494 of 511 tests (96.67%)... 17 tests failed!
After running importDump.php to completion, then running initStats.php to update the statistics for the newly imported database, I get the following output:

[root@gadugi /]# cd /wikidump/en
[root@gadugi en]# php maintenance/initStats.php
Refresh Site Statistics
Counting total edits...4798436
Counting number of articles...1980988
Counting total pages...4797798
Counting number of users...1
Counting number of admins...1
Counting number of images...1092505
Counting total page views...67977
Updating site statistics...done.

If I subsequently invoke rebuildall.php against this database, it reports double the number of pages, runs up to about 277000 articles, then the php process goes to sleep and never wakes up. rebuildall.php reports the following wrong article count:

[root@gadugi en]# php maintenance/rebuildall.php
** Rebuilding fulltext search index (if you abort this will break searching; run this script again to fix):
Rebuilding index fields for 9648325 pages...
1500

Jeff
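A guess, not something stated in the thread: 9648325 looks like the highest page_id rather than a row count. Maintenance scripts that walk the page table often iterate id ranges up to MAX(page_id), so an import that left large id gaps would both inflate the reported total and make the run crawl through empty ranges. A quick check, with placeholder credentials:

import MySQLdb

# Placeholder credentials; point these at the imported database.
db = MySQLdb.connect(host="localhost", user="wikiuser", passwd="secret", db="wikidb")
cur = db.cursor()

# If MAX(page_id) far exceeds COUNT(*), a script that iterates id ranges
# will report the max id as its "page" total and spend time on gaps.
cur.execute("SELECT COUNT(*), MAX(page_id) FROM page")
rows, max_id = cur.fetchone()
print("pages: %d, highest page_id: %d" % (rows, max_id))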