    Addison | # 1 | 2012-04-29, 06:02 | Report

    Hey all.

    I found a really terrific music site.
    http://mp3juices.com/

    It runs really slow and clunky on my tablet, except when it's actually downloading.

    I was wondering if perhaps there's a way to have just a terminal front end for this site.

    Kind of like the bash script that was written for GoEar:

    Code:
    #!/bin/bash
    # example: goear corey+sunglasses+at+night
    # author: at least 3 bash coders
    # If the user didn't give a search term on the command line, prompt for it.
    ##################
    if [ -z "$1" ]; then
        echo "Syntax:"
        echo "  $0 <artist+word1+word2>"
        echo "  Note: argument must be plus(+) separated or quoted with \"\""
        exit 1
    fi
    ##################
    tmpdir=/tmp
    prefix=$RANDOM
    SEARCH=$tmpdir/$prefix-search.txt
    CANCIONES=$tmpdir/$prefix-canciones.txt
    ENLACES=$tmpdir/$prefix-enlaces.txt
    TITULOS=$tmpdir/$prefix-titulos.txt
    TODOWNLOAD=$tmpdir/$prefix-download.txt
    ##################

    TITULO="$*"

    # Download the search results page for the title.
    wget "http://goear.com/search.php?q=$TITULO" -O "$SEARCH"

    # The line number containing the links keeps changing, so look for a pattern.
    head -$(grep -i -n -e 'ventana independiente' "$SEARCH" | cut -d ":" -f 1) "$SEARCH" | tail -1 > "$CANCIONES"

    # Using regular expressions, extract a list of links and a list of songs.
    egrep -o 'listen/[^"]*' "$CANCIONES" > "$ENLACES"
    egrep -o '"Escuchar[^"]*' "$CANCIONES" | grep -v 'en una ventana independiente' > "$TITULOS"

    # Show the user what was found on the first page.
    Linea=1
    while read line; do
        echo "$Linea: ${line:9}"
        let 'Linea += 1'
    done < "$TITULOS"

    # If nothing was found, exit.
    CONDICION=`wc -l < "$TITULOS"`
    if [ "$CONDICION" -eq 0 ]; then
        echo "No results. Try searching for something else."
        rm -f "$SEARCH" "$CANCIONES" "$ENLACES" "$TITULOS"
        exit 1
    fi

    # Ask which song the user wants to download.
    echo "Which one do you want to download? Enter the number (0 to cancel):"
    read NUMERO

    # Zero cancels.
    if [ "$NUMERO" = 0 ]; then
        rm -f "$SEARCH" "$CANCIONES" "$ENLACES" "$TITULOS"
        echo "See you soon."
        exit 0
    fi

    # Prepend http://www.goear.com to the link the user picked.
    GOEAR=http://www.goear.com/
    head -$NUMERO "$ENLACES" | tail -1 > "$TODOWNLOAD"
    ENLACE=${GOEAR}`cat "$TODOWNLOAD"`
    echo "$ENLACE"

    # From here on the script isn't mine, but it's very easy to read.
    fileid=`echo "$ENLACE" | cut -d '/' -f 5`
    xmlurl="http://www.goear.com/tracker758.php?f=$fileid"
    infoline=`wget -qO- "$xmlurl" | grep ".mp3"`
    mp3url=`echo "$infoline" | cut -d '"' -f6`
    artist=`echo "$infoline" | cut -d '"' -f10`
    title=`echo "$infoline" | cut -d '"' -f12`
    rm -f "$SEARCH" "$CANCIONES" "$ENLACES" "$TITULOS" "$TODOWNLOAD"
    wget "$mp3url" -O "${artist}_-_${title}.mp3"
    #mplayer -cache 1024 -softvol -softvol-max 1000 "$mp3url"
    Any thoughts on whether this could be possible?

    Thanks!

    auouymous | # 2 | 2012-04-30, 07:30 | Report

    Originally Posted by Addison
    It runs really slow and clunky on my tablet, except when it's actually downloading.

    Any thoughts on whether this could be possible?
    Anything is possible when you're a coder!

    You just have to split the downloaded HTML into patterns and extract the data you need. You could pick up some books on Bash, sed and awk and do it yourself, or hope someone else will be interested in using the site and will write the script for you.

    There are a couple things you can do to make websites faster on the tablet.

    1) Turn off JavaScript before opening the page. Some sites might not work properly with it disabled; just turn it back on when using them.
    2) Turn off images before opening the page. Sadly, that site does not support blind users (no alt attributes on the download and listen buttons), which means you need images enabled in order to click the download button. If you know any JavaScript you could try writing a bookmarklet that finds all of the download images and adds an alt attribute. This is untested but might work; don't add any spaces.
    Code:
    javascript:allimgs=document.getElementsByTagName('img');for(i=0;i<allimgs.length;i++)if(/download_button/.test(allimgs[i].src))allimgs[i].alt='download';
    3) Blacklist social and tracking hosts in your /etc/hosts file. Load the site in Chrome, click the cookie icon in the URL bar and show all cookies. Add each host to /etc/hosts after "127.0.0.1 ", one per line. Don't add the site itself or any social networks you want to use, or you won't be able to load anything from them. Some images might be loaded from dedicated image servers, and those can be blacklisted as well. Getting rid of the third-party cookies reduces the number of remote requests and in many cases removes ad images at the same time.
    Code:
    127.0.0.1 pixel.quantserve.com
    127.0.0.1 l.sharethis.com
    ...
    I would have posted my blacklist, but my router currently has a corrupted config and wifi isn't working, and I'm not going to copy it over by hand.

    Addison | # 3 | 2012-04-30, 07:39 | Report

    Okay.

    I'm going to attempt all of your suggestions right now.

    Thanks!

    And yeah, I think this is one of the best sites I have ever found online.

    The only problem, well, not really a problem, is that it appends the site's name to the song title of each download.

    What's the command line to mass delete "- [MP3JUICES.COM]" from every file name in a folder?

    Addison | # 4 | 2012-04-30, 07:42 | Report

    Oh, I just thought of something....

    I think I'll try Qwerty's simple bash Web Browser.

    I forget what it's called but it's actually pretty nifty.

    auouymous | # 5 | 2012-04-30, 07:45 | Report

    Originally Posted by Addison
    What's the command line to mass delete "- [MP3JUICES.COM]" from every file in a folder?
    https://www.google.com/search?q=unix...from+filenames
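    For what it's worth, a bash rename loop can also do it in one shot. A sketch, assuming the appended suffix is exactly " - [MP3JUICES.COM]" right before the .mp3 extension; adjust the pattern to match the actual file names:

```shell
# Strip the site's " - [MP3JUICES.COM]" suffix (assumed format) from
# every .mp3 file name in the current directory.
for f in *" - [MP3JUICES.COM]"*.mp3; do
    [ -e "$f" ] || continue                  # glob matched nothing
    mv -- "$f" "${f/ - \[MP3JUICES.COM\]/}"  # bash pattern substitution
done
```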

    Addison | # 6 | 2012-07-19, 11:01 | Report

    A friend of mine wrote this script and here's what he had to say.

    Originally Posted by
    This is juicer. Copy it to your tablet's /home/user directory, then in bash as root:
    chmod +x juicer
    mv juicer /usr/bin/

    And now you can just go to the directory where you want the file downloaded (as a normal user preferably, something like 'cd /home/user/MyDocs' or 'cd /media/mmc1/music/') and just run:
    juicer what+you+want+to+download

    Your download should start soon and should download all the results from the what+you.... search results page (use + instead of spaces like with goear).

    I've tried it but I keep running into the following error:
    -bash: /usr/bin/juicer: /bin/osso-xterm^M: bad interpreter: No such file or directory

    Any suggestions?

    I currently have bash installed.

    Attached Files
    File Type: zip juicer.zip (467 Bytes, 102 views)

     
    lma | # 7 | 2012-07-19, 11:51 | Report

    Originally Posted by Addison
    I've tried it but I keep running into the following error:
    -bash: /usr/bin/juicer: /bin/osso-xterm^M: bad interpreter: No such file or directory
    The attachment doesn't have any mention of osso-xterm, is that really the same as the script you are trying to run?

    Other than that, it seems to have DOS/Windows line endings, you should convert it to Unix somehow.

    Originally Posted by
    I currently have bash installed.
    Incidentally, I don't see anything in that script that's bash-specific, you should be able to use it with /bin/sh as well.
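    (For converting the line endings: if dos2unix isn't installed on the tablet, tr can strip the carriage returns. A sketch, assuming the path from the error message:)

```shell
# Remove DOS carriage returns (the ^M in the error message) so the
# kernel can find the real interpreter on the #! line.
f=/usr/bin/juicer
if [ -f "$f" ]; then
    tr -d '\r' < "$f" > "$f.tmp" && mv "$f.tmp" "$f" && chmod +x "$f"
fi
```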

    Addison | # 8 | 2012-07-19, 11:57 | Report

    Yeah, this is the script I'm trying to run.

    I only mentioned goear with you earlier because I once had that working.

    I did download the file to my Windows XP computer so I could upload it here though.

    But the one I have on my tablet, I downloaded directly to that, no Windows.

    lma | # 9 | 2012-07-19, 12:13 | Report

    How does osso-xterm enter the picture then? Can you try "bash -x /usr/bin/juicer" and also "sh -x /usr/bin/juicer" and post the output here?

    Addison | # 10 | 2012-07-19, 12:21 | Report

    Whoops. I changed bash to osso-xterm hoping that would fix it.

    Okay, I changed it back.

    juicer guster
    -bash: /usr/bin/juicer: /bin/bash^M: bad interpreter: No such file or directory


    bash -x /usr/bin/juicer guster
    + $'\r'
    : command not foundne 2:
    + $'\r'
    : command not foundne 3:
    /usr/bin/juicer: line 27: syntax error: unexpected end of file



    sh -x /usr/bin/juicer
    +
    : not foundicer: line 2:
    +
    : not foundicer: line 3:
    /usr/bin/juicer: line 27: syntax error: end of file unexpected (expecting "then")

vBulletin® Version 3.8.8