Posts: 58 | Thanked: 42 times | Joined on Jan 2010
#41
Originally Posted by sollos
Hi, crei,

I'm working with the Vietnamese dump of Wikipedia. However, after getting the 'commons' part of Wikipedia, the program always gives me a message like this:


then it breaks.

Could you please give me some advice on how to solve this? I'm just a noob in this field. Thanks!

P.S.: I've tried to download the file manually and commented out the "getSourceDumps $language" call in the script, then ran it. After that, I received another error:


As I understand it, the guide on the maemo wiki tells me to create an empty database, so there should be no tables in it. So why am I getting this message?
Sorry, I don't know the cause of your problem with wget, but just downloading the relevant files manually (you need more than one!), as you did, should work, too. Just take care to put the files in the right places and rename them as in the function getSourceDumps.

For the second problem: Have you already completely imported commons? The same problem should occur there, too.

The database needs to be empty before starting the script. The function importLanguage calls createTables, which creates the needed tables in the database. Could you check whether everything worked in that call?
 
Posts: 58 | Thanked: 42 times | Joined on Jan 2010
#42
Originally Posted by hcm
That sounds interesting! I'm missing the formulas in the German dump!
How do I create a dump with rendered formulas? And what is stored in the database while creating a dump? (I would like to know because I make a daily backup of my database, and with the whole Wikipedia this could become quite large.)
The math formulas feature was completed on Tuesday, so the German dump has to be recreated to include math formulas. As the need for dumps increases, we should perhaps think about some infrastructure for distributed creation of dumps, at least for the languages with large Wikipedias. Also, the server hosting the dumps is slowly running out of hard disk space...

So if someone wants to contribute to some kind of dump farm, please contact me.
 
Posts: 4 | Thanked: 0 times | Joined on Feb 2010
#43
Hmm,

I think I figured out where the problem is. I don't know whether it's because of the server or my internet connection, but there are some files I can't download, both in commonswiki and viwiki (the Vietnamese wiki). Specifically, I can't download 'commonswiki-latest-categorylinks.sql.gz' from commonswiki, and 'viwiki-latest-image.sql.gz' from viwiki.

Is there anybody facing the same problem as me?
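If the downloads start but never finish cleanly, one thing worth trying is resuming with "wget -c" and then checking the file against the md5sums list that Wikimedia publishes alongside each dump. The sketch below is hedged: the "hash  filename" format of that list and the availability of md5sum on your system are assumptions.

```shell
#!/bin/sh
# Hedged sketch: verify a (re)downloaded dump file against a Wikimedia
# md5sums list, whose lines are assumed to look like "md5hash  filename".
verify_dump() {
    file="$1"
    sums="$2"
    expected=$(grep "$file\$" "$sums" | awk '{print $1}')
    actual=$(md5sum "$file" | awk '{print $1}')
    if [ "$expected" = "$actual" ]; then
        echo "OK: $file"
    else
        echo "MISMATCH: $file (expected $expected, got $actual)"
    fi
}

# Example usage (filenames from this thread, assumed to be in the
# current directory; re-download first with: wget -c <url>):
# verify_dump commonswiki-latest-categorylinks.sql.gz md5sums.txt
```

A mismatch usually means a truncated or corrupted transfer; "wget -c" on the same URL will try to resume rather than start over.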
 
Posts: 119 | Thanked: 14 times | Joined on Nov 2009
#44
Just road-tested the offline reader. I preloaded the maps in meap and it works fine. Sometimes it gets into a 'no connection, do you want to connect' loop and keeps asking every minute or so (maybe something to do with the GPS wanting a connection for a fix). Furthermore, there is no function to search inside the articles; all you can do is search for the topic name. Searching inside the articles is probably too much for the CPU.
 
Posts: 344 | Thanked: 73 times | Joined on Jan 2010
#45
The link for the English dump at the front of this thread is dead.

Will the regular Wikipedia dumps work for Evopedia?

They're here:
http://download.wikimedia.org/enwiki/

Edit: Well, those links aren't working either.

So, how about that English dump for Evopedia, then? Any news on it?
__________________
N900.... thick like computer

Last edited by oldpmaguy; 2010-02-15 at 01:28.
 
Posts: 119 | Thanked: 14 times | Joined on Nov 2009
#46
So. Any news on the dumps? Regards, Ruud
 
Posts: 58 | Thanked: 42 times | Joined on Jan 2010
#47
Originally Posted by RDJEHV View Post
So. Any news on the dumps? Regards, Ruud
Unfortunately, no. I hope I can finish a first test version of "dump at home" this weekend. I'll put a link on the wiki page once it's ready.

Last edited by crei; 2010-02-27 at 16:11.
 
Posts: 58 | Thanked: 42 times | Joined on Jan 2010
#48
For people waiting for an English dump, the old converted dump has been uploaded again. Images and math formulas do not work, but everything else should be fine. Please download it from

http://wiki.maemo.org/Evopedia

"dump at home" is almost ready for release, but unfortunately not quite finished.
 
Posts: 58 | Thanked: 42 times | Joined on Jan 2010
#50
"dump at home" has produced its first Wikipedia dump (in English). Please download it from http://wiki.maemo.org/Evopedia.

We plan to start creating the next dump (Dutch was proposed), which should be generated much faster than the English dump, in the next few days. Please consider joining, and tell me if you want dumps of specific languages. Also note that this system should work (though it has never been tried) with any wiki of this kind, so evopedia should also work with wikiversity, wikibooks, wiktionary, ..., i.e. everything listed here.
So please tell me which languages/wikipedias you would like; small wikipedias should be done in less than a day (assuming enough participants).
 