
bashscript wget save urls in array


All times are UTC - 6 hours


 PostPosted: Fri May 23, 2008 6:05 am   

Joined: Fri May 23, 2008 5:32 am
Posts: 1
Hi,

I have to write a script that implements a simple web browser (in text mode), using wget and links -dump to display HTML pages to the user. The user has three choices: enter a URL, enter b to go back, or q to quit. The last 10 URLs entered by the user are stored in an array, from which the user can restore a URL by using the back (b) functionality.
I have no idea how to do this.

Thank you


 PostPosted: Mon Oct 20, 2008 2:28 pm   

Joined: Thu Oct 16, 2008 3:05 pm
Posts: 13
Which variable is supposed to hold the URL the user types?

for instance:

read -p "Please enter URL: " URL

The variable URL is what you're looking for. (Note that URL=$(echo -n "Please enter URL: ") would only capture the prompt text itself; read is what actually stores the user's input.)
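Putting the pieces together, here is a minimal sketch of the assignment. It assumes wget and links are installed; the function names (push_history, pop_history, show_page) and the reading of b as "re-visit the most recently stored URL" are just one illustrative interpretation, not the only correct one.

```shell
#!/usr/bin/env bash
# Text-mode browser sketch: wget fetches the page, links -dump renders it,
# and a bash array keeps the last 10 URLs for the back (b) feature.

HISTORY=()        # newest URL at the end
MAX_HISTORY=10

# Append a URL, dropping the oldest once the array holds 10 entries.
push_history() {
    HISTORY+=("$1")
    if (( ${#HISTORY[@]} > MAX_HISTORY )); then
        HISTORY=("${HISTORY[@]:1}")
    fi
}

# Pop the most recent URL into LAST_URL; return 1 if history is empty.
# (A global is used instead of echo because $(pop_history) would run in a
# subshell, and the shortened HISTORY array would be lost.)
pop_history() {
    local n=${#HISTORY[@]}
    (( n == 0 )) && return 1
    LAST_URL=${HISTORY[n-1]}
    HISTORY=("${HISTORY[@]:0:n-1}")
}

# Fetch one page with wget and render it with links -dump.
show_page() {
    local tmp
    tmp=$(mktemp) || return 1
    wget -q -O "$tmp" "$1" && links -dump "$tmp"
    rm -f "$tmp"
}

# Main loop: a URL browses, b goes back through the stored URLs, q quits.
main() {
    local input
    while read -r -p "URL (b=back, q=quit): " input; do
        case $input in
            q) break ;;
            b) if pop_history; then
                   show_page "$LAST_URL"
               else
                   echo "History is empty."
               fi ;;
            *) push_history "$input"
               show_page "$input" ;;
        esac
    done
}
```

Calling main starts the prompt loop; each b removes one entry, so pressing it repeatedly walks backward through up to 10 stored URLs.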





Powered by phpBB © 2011 phpBB Group
© 2003 - 2011 USA LINUX USERS GROUP