
grabs the nasdaq top performers


All times are UTC - 6 hours


 PostPosted: Wed Aug 14, 2013 7:47 pm   

Joined: Fri Jul 19, 2013 5:17 pm
Posts: 8
It's just a simple script that pulls the NASDAQ top 25 gainers from Yahoo Finance and parses the initial data dump down, step by step, to a text file with a single column of stock symbols.
This is the pull script. It gets a dump file from Yahoo using the w3m text browser (lynx works too).
#begin mover script
#!/bin/bash

#movers
#EXTRA OPTIONS
uagent="firefox/22.0"   #user agent (fake a browser)
sleeptime=1             #pause between requests, in seconds

#dump the rendered gainers page to a text file (lynx -dump works as well)
w3m -dump "http://finance.yahoo.com/gainers?e=o" -T text/html >> /home/user/Desktop/file1.txt

#INITIAL PAGE
echo "[+] Fetching" && sleep "$sleeptime"
initpage=$(curl -s -b "cookie.txt" -c "cookie.txt" -L -A "$uagent" "http://finance.yahoo.com/gainers?e=o")
token=$(echo "$initpage" | grep "authenticity_token" | sed -e 's/.*value="//' -e 's/" \/>.*//')

#hand the dump off to the parsing script
/bin/bash /home/user/Desktop/csvscript

rm -f "cookie.txt"
#end mover script

The following is the csv script. It is the chain saw that hacks the dump down through files 2, 3, and 4 to the file 5 iteration, using sed and awk commands to remove the unnecessary data.
The awk 'BEGIN{found=0} /All headlines for/{found=1} {if (found) print}' part is where it really gets interesting. I tripped across this on Stack Overflow and it works: it discards everything before the line matching the marker text, which takes you to within a line or two of the wanted data. Then sed -e '1,2d' (change the numbers to whatever lines you want removed) shaves off the rest of the unwanted header. At that point the symbols are still in comma-separated format, so I experimented with sed and found that substituting a line feed for each comma-space returns a text file with a single column of stock symbols, ready for further parsing.
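To see the marker technique in isolation, here is a quick throwaway demonstration on invented sample text (the real input is the w3m dump; note that `\n` in the sed replacement is a GNU sed feature):

```shell
# Keep everything from the marker line on, drop the marker itself,
# then break the comma-separated list into one item per line.
printf 'junk line\nAll headlines for gainers\nAAA, BBB, CCC\n' \
  | awk 'BEGIN{found=0} /All headlines for/{found=1} {if (found) print}' \
  | sed -e '1d' \
  | sed 's/, /\n/g'
# prints:
# AAA
# BBB
# CCC
```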

#begin csvscript
#!/bin/bash

#give the pull script time to finish writing the dump
sleep 5

#keep everything from the "All headlines for" marker line onward
awk 'BEGIN{found=0} /All headlines for/{found=1} {if (found) print}' /home/user/Desktop/file1.txt > /home/user/Desktop/file2.txt

#shave the leading junk lines (adjust the ranges to match the page layout)
sed -e '1,2d' /home/user/Desktop/file2.txt > /home/user/Desktop/file3.txt
sed -e '3,33d' /home/user/Desktop/file3.txt > /home/user/Desktop/file4.txt

#replace each comma-space with a newline: one stock symbol per line
sed 's/, /\n/g' /home/user/Desktop/file4.txt > /home/user/Desktop/file5.txt

#remove the intermediate files
rm -f /home/user/Desktop/file1.txt /home/user/Desktop/file2.txt \
      /home/user/Desktop/file3.txt /home/user/Desktop/file4.txt
#end csvscript
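For what it's worth, the whole file1-to-file5 chain can be collapsed into a single pipeline with no intermediate files. This is a sketch using the same marker string and line ranges as above, shown here against a made-up stand-in for the real dump (the `/tmp` paths are just for the demonstration):

```shell
# Hypothetical miniature dump standing in for the real w3m output.
printf 'junk\nAll headlines for gainers\nheader\nSYM1, SYM2\nX, Y\n' > /tmp/file1.txt

# Same four steps as the csv script, chained: marker, line trims, split.
awk 'BEGIN{found=0} /All headlines for/{found=1} {if (found) print}' /tmp/file1.txt \
  | sed -e '1,2d' \
  | sed -e '3,33d' \
  | sed 's/, /\n/g' > /tmp/file5.txt
```

On the sample input above, /tmp/file5.txt ends up holding SYM1, SYM2, X, and Y on four separate lines.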

Now, does anyone have a script that will pull data from a text file and feed it as input to a website? :-B
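One common pattern for that kind of thing is a while-read loop feeding curl, one request per line of the input file. This is only a rough sketch, not tested against any real site: the URL and the "symbol" form field name are placeholders you would replace with whatever the target page actually expects.

```shell
#!/bin/bash
# Sketch: read one stock symbol per line from a text file and submit each
# to a web form. The URL and the "symbol" field are hypothetical.
submit_symbols() {
    local infile="$1" url="$2"
    while IFS= read -r symbol; do
        [ -z "$symbol" ] && continue              # skip blank lines
        curl -s -A "firefox/22.0" \
             --data-urlencode "symbol=$symbol" \
             "$url" > "result_${symbol}.html"
        sleep 1                                   # be polite between requests
    done < "$infile"
}

# Example call (uncomment once the URL points somewhere real):
# submit_symbols /home/user/Desktop/file5.txt "http://example.com/lookup"
```

Each response lands in its own result_SYMBOL.html file, which keeps the per-symbol output easy to parse afterward.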




Powered by phpBB © 2011 phpBB Group
© 2003 - 2011 USA LINUX USERS GROUP