2009-01-23, 17:27 | #33

Well, it seems that Wikipedia Dump Reader is very slow on the full enwiki dump. I wonder why that is. Is the amount of data simply too large to expect anything better, or is there room for optimization, perhaps by switching from a linear scan to a binary (logarithmic) search?
gzip -cdf indexfile | grep searchword
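
That grep pipeline (and presumably the reader's own lookup) decompresses the whole index and scans every line, so each search is linear in the size of the index. If the index can be kept decompressed and sorted on the card, a logarithmic lookup is possible with the standard look(1) utility, which does a binary search over a sorted file. A minimal sketch, assuming there is enough free space for the sorted copy and that look is available on the tablet:

# One-time preparation: decompress and sort the index
# (LC_ALL=C so sort's byte ordering matches look's comparison)
gzip -cdf indexfile | LC_ALL=C sort > indexfile.sorted
# Each lookup is then a binary search instead of a full scan;
# look matches lines that begin with the given string
look searchword indexfile.sorted

That trades disk space for lookup time; on a multi-gigabyte enwiki index the difference should be dramatic.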

2009-01-23, 18:03 | #34 | mikkov

2009-01-23, 21:43 | #36

I plugged in a 2GB microSD card through the same adapter. I then installed the GParted maemo hack from another thread and formatted the card as ext3.
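
For reference, the same formatting can be done from the command line instead of the GParted hack. A rough sketch, assuming the external card shows up as /dev/mmcblk1 (check dmesg first, since running mkfs against the wrong device destroys its contents, and mkfs.ext3 may need to be installed from e2fsprogs):

# Get a root shell, unmount the card, then create the ext3 filesystem
sudo gainroot
umount /media/mmc1
mkfs.ext3 /dev/mmcblk1p1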

2009-01-25, 21:54 | #38 | mikkov

2009-01-26, 18:14 | #39

It does hold the record for the longest word published in an English-language publication in a serious context, that is, for some reason other than to publish a very long word...

2009-02-05, 06:55 | #40 | Entonian

cd /media
# Become root first (maemo's gainroot); the commands below run in the root shell
sudo gainroot
# ext3 enforces Unix ownership, unlike FAT, so hand the freshly formatted
# card's mount point back to the default 'user' account
chown -R user mmc1
chgrp -R users mmc1
With that, the error goes away. Now the 2GB card is working brilliantly with ext3. I'm going to do the same to the 8GB card, and then I can copy over the Wikipedia dump.
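
Before copying the dump over, it may be worth confirming that the ownership change actually took effect. A quick check, assuming the card is still mounted at /media/mmc1:

# Should now show 'user' and 'users' as owner and group
ls -ld /media/mmc1
# And a plain (non-root) shell should be able to write to it
touch /media/mmc1/writetest && rm /media/mmc1/writetest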