
Search firefox log files on a K12LTSP server ...


All times are UTC - 6 hours


 PostPosted: Wed Jun 15, 2005 1:28 pm   
Site Admin

Joined: Sun May 15, 2005 9:36 pm
Posts: 667
Location: Des Moines, Iowa
Original thread was here:
http://www.usalug.org/phpBB2/viewtopic.php?t=6830

Basically, someone wanted to be able to tell which student was looking at, shall we say, inappropriate websites. I suggested a grep of the log files. I'm sure the script could be refined quite a bit; mine was just a 45-second hack :D :D ...... so refine away if you like :D


------------------- below is reposted from usalug.org ---------------
perhaps a grep of the firefox history files for all the students is in order? ;)

/home/USERNAME/.mozilla/firefox/wb5xfjqk.default/history.dat

grep google.com /home/USERNAME/.mozilla/firefox/*.default/history.dat

-----------------------------------------

With a bit of tweaking you could have a bash script that would let you search the logs of all the students ..... and enter a specific SEARCH term.....

something like:

Code:
#!/bin/bash
# A bash script to search firefox history logs for all users....
read -p "Enter a website or url to search for : " searchurl

# quote the variable so search terms with odd characters don't break the glob
grep -H "$searchurl" /home/*/.mozilla/firefox/*.default/history.dat

# Should print out a nice list and get you the info you need, as long as you submit the url you want to search for ;)


ends up looking something like this when you run it......


crouse@linux:~/Scripts> sh firefoxhistorysearch.sh
Enter a website or url to search for : midiowacamaroclub.com
/home/crouse/.mozilla/firefox/wb5xfjqk.default/history.dat:<(F2F3=http://midiowacamaroclub.com/)(F2F4=1118786369098893)(F2F5
/home/crouse/.mozilla/firefox/wb5xfjqk.default/history.dat: =midiowacamaroclub.com)(F2F6=M$00i$00d$00 $00I$00o$00w$00a$00 $00C$00a$00m\
/home/crouse/.mozilla/firefox/wb5xfjqk.default/history.dat:<(F301=1118786457418506)(F300=http://www.midiowacamaroclub.com/)>
crouse@linux:~/Scripts>

obviously the offending party will have his/her username showing up right beside the offending url ;)


 PostPosted: Wed Jun 15, 2005 9:40 pm   
Site Admin

Joined: Tue May 17, 2005 7:31 pm
Posts: 251
Location: Georgia
maybe have it run through a for loop with the usernames as the different "steps", so you could parse the results a little faster
like say
Code:
for userdir in /home/*
do
            userid=$(basename "$userdir")
            grep -H "$searchurl" "$userdir"/.mozilla/firefox/*.default/history.dat >> "${userid}-results.txt"
done

or something along those lines... of course that isn't tested or anything...


 PostPosted: Wed Jun 15, 2005 9:44 pm   
Site Admin

Joined: Sun May 15, 2005 9:36 pm
Posts: 667
Location: Des Moines, Iowa
for userdir in /home/*

i thought of that.... but my brain wasn't functioning very well today lol..... couldn't decide how that for loop should look..... :D I didn't have time to play with it much, so i just posted it as i originally wrote it.... it seemed to work.

REALLY all that was needed was userid/name and just PART of the resulting line..... but i like your way better..... with the for loop. I may test it later :)
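Trimming the output down to just that is doable too. Here's a rough sketch (the function names are my own invention, not from the original script, and it assumes home directories sit directly under /home/<user>):

```shell
#!/bin/bash
# Hypothetical refinement: reduce grep's "path:match" output
# to just "username: match".

user_from_path() {
    local p=${1#/home/}      # strip the leading /home/
    printf '%s\n' "${p%%/*}" # keep only the first path component (the user)
}

report_matches() {
    # grep -H prints "path:match"; split on the first colon,
    # the rest of the line (colons and all) lands in $match
    while IFS=: read -r path match; do
        printf '%s: %s\n' "$(user_from_path "$path")" "$match"
    done
}

# Usage:
#   grep -H "$searchurl" /home/*/.mozilla/firefox/*.default/history.dat | report_matches
```

That keeps the username and the matching fragment and throws away the long profile path in between.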


 PostPosted: Wed Jun 15, 2005 11:03 pm   
Site Admin

Joined: Tue May 17, 2005 7:31 pm
Posts: 251
Location: Georgia
of course, if you really wanted to get technical... you could parse the passwd file, get the home directories for all the users, and go from there... something like
Code:
# use > rather than >> so rerunning starts with fresh lists
cut -d: -f6 /etc/passwd > homedirs.lst
cut -d: -f1 /etc/passwd > users.lst

# uncomment the next line to make a master list of both sets of info
# paste -d: homedirs.lst users.lst > master.lst


that should create a file called homedirs.lst that will have all the home directories listed in it and a file called users.lst that will have all the users listed in it

then you could go through those lists filling a couple arrays with the info

Code:
# first declare the arrays
declare -a user
declare -a homedir

# fill the first array
index=0
while read -r current
do
       user[$index]=$current
       ((index++))
done < users.lst

# reset the index and fill the second array
index=0
while read -r current
do
       homedir[$index]=$current
       ((index++))
done < homedirs.lst

# now we do the loop for the script using the number of
# elements in the arrays
for ((i=0; i < ${#user[*]}; i++))
do
       # of course you want to make sure you still
       # have a way to populate the variable $searchurl
       grep -H "$searchurl" "${homedir[$i]}"/.mozilla/firefox/*.default/history.dat >> "${user[$i]}-results.txt"
done
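Since bash can split on colons itself, the two .lst files and the arrays could even be skipped; a single-pass sketch along the same lines (untested on a real K12LTSP box, and note it naively walks every account in passwd, system users included):

```shell
#!/bin/bash
# One-pass variant: pull user (field 1) and home dir (field 6)
# straight out of /etc/passwd instead of staging .lst files.
search_all_homes() {
    local searchurl=$1
    local user home
    while IFS=: read -r user _ _ _ _ home _; do
        # skip accounts that have no firefox profile at all
        [ -d "$home/.mozilla/firefox" ] || continue
        grep -H "$searchurl" "$home"/.mozilla/firefox/*.default/history.dat \
            2>/dev/null >> "${user}-results.txt"
    done < /etc/passwd
}

# Usage:
#   search_all_homes "midiowacamaroclub.com"
```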



but that's just a little much for a once-simple script... don't you think :lol:


 PostPosted: Thu Jun 16, 2005 10:19 am   
Site Admin

Joined: Sun May 15, 2005 9:36 pm
Posts: 667
Location: Des Moines, Iowa
:o :o :o

:lol: :lol: :lol:

Very cool though.






Powered by phpBB © 2011 phpBB Group
© 2003 - 2011 USA LINUX USERS GROUP