Posts: 6 | Thanked: 0 times | Joined on Nov 2007
#1
Hi,

Would anyone know if there is an offline web browser available for the N800, something like WebCopier?

Thanks,
FG
 
Posts: 18 | Thanked: 2 times | Joined on Dec 2007
#2
For OS2008 I see an application called httrack, described as one that "Copies websites to your computer (offline browser)".
It is in the maemo Extras repository.

I have never tried it though.
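For anyone curious, an httrack invocation would look roughly like this. This is only a sketch, not verified on the tablet: the URL and output directory are placeholders, and the `HTTRACK` override is just there so you can preview the command without downloading anything.

```shell
#!/bin/sh
# Sketch of an httrack call (assumes httrack from the maemo Extras repo).
# Mirrors a site into a local directory for offline browsing.
# HTTRACK can be overridden (e.g. HTTRACK=echo) to preview the command
# without actually downloading anything.
HTTRACK="${HTTRACK:-httrack}"

mirror_site() {
    url="$1"
    dest="$2"
    # -O sets the output path; -r2 limits the mirror depth to 2 links
    "$HTTRACK" "$url" -O "$dest" -r2
}
```

Then something like `mirror_site "http://news.bbc.co.uk/" ~/mirrors/bbc` should do the copy.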
 
Posts: 133 | Thanked: 8 times | Joined on Aug 2007 @ SF, CA
#3
Look at the RSS threads; they do this, though maybe it's not exactly what you want. A thread very recently mentioned an app that saves web pages for offline reading. Also, what about Google Reader - can it do this? Sorry not to give you a direct answer, just suggestions.
 
Posts: 66 | Thanked: 9 times | Joined on Nov 2007
#4
There have been two groups working on Google Gears ports that I have seen.
The first is at garage:
https://garage.maemo.org/svn/browser...e-gears/trunk/

And the second apparently built on that:
http://groups.google.com/group/googl...e78e0183695d36

My incomplete understanding is that if you can tinker with Linux, you could probably get a semi-working version built and installed from those projects. But those of us who shouldn't tinker under the hood will have to wait.

I too am really looking forward to getting these sorts of services working. Think of Remember the Milk or Zimbra being available all the time.
 
Posts: 566 | Thanked: 150 times | Joined on Dec 2007
#5
There is Feedelity for RSS feeds. I think RSS is a particularly useful way to read news sites and blogs on the tablet. For Firefox there is the ScrapBook extension for creating an offline copy of a site. Maybe that could be ported to MicroB?
 
Posts: 91 | Thanked: 16 times | Joined on Dec 2007
#6
Originally Posted by iamthewalrus
There is Feedelity for RSS feeds. I think RSS is a particularly useful way to read news sites and blogs on the tablet. For Firefox there is the ScrapBook extension for creating an offline copy of a site. Maybe that could be ported to MicroB?
I agree, ScrapBook should really be ported to Maemo. At least for me, and for others who don't always have a wireless internet connection available, that add-on would be very useful.
 
Posts: 90 | Thanked: 2 times | Joined on Mar 2006
#7
I agree, some method to save a web page would be great. httrack is too much work. A print-to-PDF function, or something like that, would be handy.
 
Posts: 35 | Thanked: 17 times | Joined on Jan 2008
#8
What about wget?
 
Posts: 35 | Thanked: 17 times | Joined on Jan 2008
#9
More specifically, this:

wget -m -l 2 -k -E news.bbc.co.uk

-m: mirror the site
-l 2: follow links to depth 2. Note that the higher this is, the longer the download will take. A link depth of 1 is recommended, otherwise you will be downloading a lot.
-k and -E: convert links so the files are browsable locally, and rename downloaded files to use the .html extension

Then create an alias for wget -m -l 2 -k -E, such as mirror, in your bash profile and you are ready to go. You can even write simple shell scripts to download several web sites every day for offline reading.
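A minimal sketch of such a script, assuming a POSIX shell on the tablet. The site list is just an example, and the `WGET` override is only there so you can preview the commands without downloading anything:

```shell
#!/bin/sh
# Sketch: mirror a list of sites for offline reading.
# SITES is an example list - adjust to taste. WGET can be overridden
# (e.g. WGET=echo) to preview the commands without downloading.
WGET="${WGET:-wget}"
SITES="news.bbc.co.uk slashdot.org"

mirror_all() {
    for site in $SITES; do
        # -m mirror, -l 1 depth 1, -k fix links for local browsing,
        # -E rename files to .html
        "$WGET" -m -l 1 -k -E "$site"
    done
}
```

Run `mirror_all` from cron (or by hand before leaving a hotspot) and the sites are readable offline.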

Last edited by ustunozgur; 2008-02-17 at 14:51.
 
Posts: 566 | Thanked: 150 times | Joined on Dec 2007
#10
Originally Posted by ustunozgur
More specifically, this:

wget -m -l 2 -k -E news.bbc.co.uk

-m: mirror the site
-l 2: follow links to depth 2. Note that the higher this is, the longer the download will take. A link depth of 1 is recommended, otherwise you will be downloading a lot.
-k and -E: convert links so the files are browsable locally, and rename downloaded files to use the .html extension

Then create an alias for wget -m -l 2 -k -E, such as mirror, in your bash profile and you are ready to go. You can even write simple shell scripts to download several web sites every day for offline reading.
How about a simple frontend for wget as a MicroB plugin? Seems doable.
 