Village Pump (proposals)
Discuss new proposals regarding the Wiki that are not policy related.
Other Village Pump sections:
- News: news about BI and this Wiki.
- Policies: discuss existing and proposed new policies.
- Proposals: new proposals that are not policy related.
- Things To Do: areas of the Biki that need attention.
- Content: request assistance for the creation and update of content.
- Technical: technical issues related to the MediaWiki software.
- General: topics that do not fit into any other category.
Mirror of the VBS2 wiki
I maintain a number of VBS2 machines that are, by policy, forbidden to ever touch the internet. I would like to keep a local copy of the wiki for reference. Is it possible to obtain an offline snapshot of the bistudio wiki?
Wikipedia provides this at http://download.wikimedia.org/
- Try to track down the wiki export script of CrashDome. --.kju 23:41, 12 May 2009 (CEST)
What is a CrashDome? Is this wiki export script something 'out on the web' or buried someplace in the BI wiki/forum/blog pages? What I'm looking for is an SQL dump of the pages so I can build a functional, searchable mirror of the wiki off-line.
- First google hit for 'crashdome wiki export': http://www.armaholic.com/page.php?id=1202 :) --.kju 23:12, 14 May 2009 (CEST)
I am not allowed to install any programs on my internet-connected computer. I can work with the bureaucracy to have the script installed, but I imagine that anything named "Crashdome" will be frowned upon. I am looking for something I can fetch, burn to DVD, and carry to the off-the-net computers. If there is an SQL dump like from wikimedia/wikipedia, I can work with that. If there is a .zip file with the results of the crashdome script, I can work with that. (I could go home and install crashdome, but exchange of data from home computers to work computers is also frowned upon by the bureaucracy.) To make things more fun, the "VBS2" computers are not even in the same room as an internet connected computer.
I propose that a downloadable snapshot of this wiki be available from this wiki...
- Wiki administrators can create an XML dump of this wiki, compress it, and post it for download. The process could be automated with a cron job that runs the script as root: php maintenance/dumpBackup.php --current | gzip > wikibackup.gz --Grenadier f 17:42, 8 July 2010 (CEST)
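The dump-and-compress step above could be packaged as a small script plus a crontab entry. This is only a sketch: the install path, output path, and schedule below are assumptions, since the actual server layout of this wiki is not public.

```shell
#!/bin/sh
# dump-wiki.sh: export the current revision of every page as XML
# and compress it into the web root so visitors can download it.
WIKI_DIR=/var/www/mediawiki            # MediaWiki install dir (assumed)
OUT=/var/www/html/wikibackup.xml.gz    # published download location (assumed)

php "$WIKI_DIR/maintenance/dumpBackup.php" --current | gzip > "$OUT"

# Example crontab entry to refresh the dump every Sunday at 03:00:
# 0 3 * * 0 /usr/local/bin/dump-wiki.sh
```

This is a deployment/config sketch rather than runnable code; dumpBackup.php is the standard MediaWiki maintenance script, but everything else here would need adapting to the real server.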
- This would be nice if it could be automated. I noticed in the link you mentioned, their last update was over a year ago... so Wikimedia's system isn't exactly up to date. You can download your own copy of this wiki, as html files, using HTTrack (http://www.httrack.com/). The search function of the wiki won't work, but the links will. --General Barron 20:41, 18 August 2009 (CEST)
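A minimal HTTrack invocation for mirroring a MediaWiki site might look like the following. The URL, output directory, scan-rule filter, and depth are all illustrative assumptions and would need tuning for this wiki:

```shell
# Mirror the wiki into ./biki-mirror, staying on the wiki's own host.
# URL and filter are assumptions; adjust depth to control how much is fetched.
httrack "https://community.bistudio.com/wiki/Main_Page" \
    -O ./biki-mirror \
    "+*community.bistudio.com/*" \
    --depth=5
```

As noted above, the result is a static copy: internal links work, but the wiki's search box does not.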
- If your problem is that you can't copy files from a physical drive, yet you are allowed to download files from the internet, then I would suggest: (1) download the files at home (using one of the above programs); (2) add them to a zip archive; (3) upload that archive to a free file hosting service like www.rapidshare.de, www.yousendit.com, www.mediafire.com, and so on (google "free file hosting"); (4) download the file on your internet-connected work computer; (5) internally transfer that file to your non-internet VBS2 computer. --General Barron 20:51, 18 August 2009 (CEST)
If I fetch using httrack or wget, I end up with a very nice but unsearchable version of the wiki. I have added ht://dig so I can search my internal site, but the search box on the left side of the browser is too tempting for most people to resist. I am still looking for a snapshot in time of the mediawiki. They provide instructions: http://meta.wikimedia.org/wiki/MediaWiki#Database_dump. If you need automatic, you can have cron run the example php script once an (hour|day|week|month|year) - whatever is appropriate.
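For completeness: once an XML dump exists, loading it into a MediaWiki install on the offline machine would make the wiki's own search box work, which a static mirror cannot do. A rough sketch, assuming MediaWiki is already installed offline and the dump was carried over on DVD (all paths here are assumptions):

```shell
# On the offline machine: import the XML dump into the local MediaWiki.
cd /var/www/mediawiki                  # local MediaWiki install (assumed)
gunzip -c /media/dvd/wikibackup.xml.gz | php maintenance/importDump.php

# Rebuild derived tables (links, recent changes, search index) after import:
php maintenance/rebuildrecentchanges.php
php maintenance/rebuildall.php
```

importDump.php, rebuildrecentchanges.php, and rebuildall.php are standard MediaWiki maintenance scripts; the mount point and install path are placeholders.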