suitti
Posts: 96 | Thanked: 7 times | Joined on Sep 2007
#1
I perform backups by copying everything to my desktop over wifi. I've been having problems with my 16 GB SD card, and have recreated the file system and restored the data repeatedly. I currently use about 7 GB of space. So, I created a single gzipped tar archive on my Unix desktop, copied it to the N800 via ftp over wifi, and attempted to extract it.

Though ftp will happily create the large file, gzip, tar, and cat all refuse to read it; they say the file is too large. Further, busybox 'ls' refuses to list it (even when asked for just the file name!), and busybox 'rm' refuses to delete it.

So these commands fail:
tar xzf misc.tgz
gzip -d -c misc.tgz | tar xf -
cat misc.tgz | tar xzf -
cat misc.tgz | gzip -d -c | tar xf -
rm misc.tgz
ls

I wrote a short 'cat' equivalent, compiled it with gcc on the N800, and it too failed to open the file. I used the open(2) system call. Is there a version of open(2) that handles large files? Or is it some ulimit noise? 'ulimit' reports 'unlimited'.
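My guess is that the tools were built without large file support, so their 32-bit off_t tops out at 2 GB and open(2) fails on anything bigger. Here is a minimal sketch of that 'cat' equivalent with the 64-bit offsets enabled; this is only a guess at the cause, and the code is illustrative rather than what I originally ran:

/* Minimal large-file-aware 'cat' sketch. The assumption is that the earlier
 * failure came from a 32-bit off_t: without large file support, open(2) on a
 * file over 2 GB fails (typically with EOVERFLOW). Defining
 * _FILE_OFFSET_BITS=64 before any include switches open/read to their
 * 64-bit variants. */
#define _FILE_OFFSET_BITS 64

#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>

int main(int argc, char **argv)
{
    char buf[64 * 1024];
    ssize_t n;
    int fd;

    if (argc != 2) {
        fprintf(stderr, "usage: %s file\n", argv[0]);
        return 1;
    }

    fd = open(argv[1], O_RDONLY);  /* now uses the 64-bit offset interface */
    if (fd < 0) {
        perror("open");
        return 1;
    }

    while ((n = read(fd, buf, sizeof buf)) > 0) {
        if (write(STDOUT_FILENO, buf, (size_t)n) != n) {
            perror("write");
            return 1;
        }
    }

    close(fd);
    return 0;
}

If that guess is right, something like 'gcc -o lfcat lfcat.c' followed by './lfcat misc.tgz | tar xzf -' should get past the 2 GB limit (the lfcat name is just for this example), since the tar on the other end of the pipe only reads from stdin and never has to open the big file itself.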

I wrote a quick perl script to delete the file:

cat > u.pl
unlink 'misc.tgz';
^d
perl u.pl

...which worked. And a good thing, too; I thought I was going to have to recreate the file system just to delete the file.

Then I created archives of less than 2 GB each, and extracted those.

What's going on?

I could be wrong, but it seems my older OS2008 didn't have these issues.

Linux in general has handled multi-gigabyte files routinely on 32-bit systems for some time.
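It does, but only when each program is built against the large file interfaces; a busybox or toolchain built without them still stops at 2 GB. As a quick illustration (my assumption, not something I have verified on the device), the difference shows up in the width of off_t:

/* Prints the size of off_t for this particular build. On a 32-bit system it
 * is 4 bytes by default and 8 bytes when compiled with
 * -D_FILE_OFFSET_BITS=64, which is what allows offsets past 2 GB. */
#include <stdio.h>
#include <sys/types.h>

int main(void)
{
    printf("sizeof(off_t) = %d\n", (int)sizeof(off_t));
    return 0;
}

Compiled plain, this should print 4 on the tablet; with -D_FILE_OFFSET_BITS=64 it should print 8.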
 
ace
Posts: 296 | Thanked: 80 times | Joined on Dec 2007
#2
I assume you're not using a FAT filesystem.
 
suitti
Posts: 96 | Thanked: 7 times | Joined on Sep 2007
#3
I'm using ext3.
 