Tuesday, May 21, 2019

Automatically delete all files inside a directory older than a set number of days on linux

Use the following command:

find /home/user/Backup -type f -ctime +10 -exec rm -rf {} +

where +10 matches files whose status was last changed more than 10 days ago (-ctime counts days since the last status change). Replace it with the appropriate number of days for your application.

To schedule this action, place the command on your crontab. Something like this:

0 5 * * 1 find /home/user/Backup -type f -ctime +10 -exec rm -rf {} +

This will delete all files in the /home/user/Backup folder that are older than 10 days, every Monday at 5 am (the 1 in the day-of-week field selects Monday).
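Before wiring the command into cron, it is worth previewing exactly which files match. A minimal sketch against a throwaway directory (paths are examples; note that touch can only backdate the modification time, so the preview uses -mtime instead of the post's -ctime):

```shell
# Build a throwaway directory with one fresh and one backdated file
mkdir -p /tmp/find_demo
touch /tmp/find_demo/fresh.txt
touch -d "15 days ago" /tmp/find_demo/stale.txt

# -mtime +10 matches files modified more than 10 days ago; -print lists
# the matches instead of deleting them -- swap in "-exec rm -f {} +"
# only once the list looks right
find /tmp/find_demo -type f -mtime +10 -print
```

Only the backdated file should appear in the output.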

I used this on one of my applications; the inspiration came from this link:

Tuesday, May 7, 2019

How to download all files on an apache index directory with linux

Use the following command:

wget -r -np -nH --cut-dirs=3 -R index.html http://hostname/aaa/bbb/ccc/ddd/

Here -r downloads recursively, -np stops wget from ascending to the parent directory, -nH skips creating a hostname folder, --cut-dirs=3 drops the first three path components (aaa/bbb/ccc) so everything lands under ddd/, and -R index.html discards the generated index pages.

Friday, December 30, 2016

Create your laptop battery report on windows 10

Open a command prompt with admin rights (right-click the Windows logo in the bottom left corner and choose the admin command prompt) and run:

powercfg /batteryreport /output "C:\battery_report.html"

Go to C:\ and open battery_report.html to see everything that has to do with your battery usage since the beginning of time...

Wednesday, March 23, 2016

Battery icon disappears from windows 10 system tray - Get it back!

After the last Windows 10 update, the battery icon disappeared from my system tray. This was not a simple matter of reactivating the icon in my system tray options: the power management option (or similar) was there, but it was greyed out and I could not switch it on. The following procedure made the icon return to my system tray:

1. Press Ctrl + Shift + Esc to open the Task Manager
2. Click the "Details" tab
(read to the end now before you proceed, as the information in your browser may disappear momentarily after the next step and you will not know what to do)
3. Find explorer.exe in the list, right-click it and click "End process tree"
4. All windows and the taskbar will no longer be visible, but you will still see the Task Manager window. Click "File" and select "Run new task".
5. Type "explorer.exe" and press Enter
6. All windows and the system tray should now be visible, and the battery icon should have returned.

This worked for me on an Asus V300 laptop, Windows 10 Home edition, 64 bit. Good luck!

Saturday, July 11, 2015

Raspberry Pi as backup server of remote files and remote MySQL databases

Project Goals

To use a Raspberry Pi with an external USB hard disk as backup media for my data (files and MySQL databases) stored on a remote server - not on my LAN.


Why this project?

Because I had a Raspberry Pi around and an available, unused external USB drive that was begging for some action!

Before you start

0. You have an OS on your Raspberry Pi.

1. You have SSH access to it.

2. The Pi can access the internet.

3. You have SSH access from your Pi to the remote server.

4. You would like to schedule the backup to run daily.

5. You would like to always perform full data backups.

OK, we have our list of requirements, so let's jump into the implementation of the project.



SSH connections


1. From a computer on your network, SSH to your Pi (the default password is raspberry):

ssh pi@ip_address_of_pi

2. From your Pi, SSH to your remote server (mine was not on the default port 22, so I had to use -p to specify port 2222):

sudo ssh -p 2222 username@remote_server

You have now confirmed your SSH connection to your Pi, and from your Pi to the remote server that contains the files you would like to back up.


External HDD


1. Find out your drive's device name:

sudo blkid

You will get a list that looks like this:

/dev/mmcblk0p6: LABEL="root" UUID="2b177520-9ff1-4ad5-95f5-6ec7cf61ccaf" TYPE="ext4"
/dev/sda1: UUID="cb980263-9238-469d-87ea-5b1ce2f5f421" TYPE="ext2"

You can see my drive is at /dev/sda1 and is formatted as ext2.

2. Mount the device:

sudo mount /dev/sda1 /mnt

3. Create your backup folder on the drive:

sudo mkdir /mnt/backup
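The mount above only lasts until the next reboot. To remount the drive automatically at boot, an /etc/fstab entry can be added using the UUID reported by blkid (the UUID and ext2 type below are the ones from my listing above; use your own):

```
# Append to /etc/fstab (sudo nano /etc/fstab); nofail lets the Pi boot
# even when the USB drive is unplugged
UUID=cb980263-9238-469d-87ea-5b1ce2f5f421  /mnt  ext2  defaults,nofail  0  2
```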


Prepare for database backups


1. Run the following commands to install some tools you will need to reach and dump (save) your remote databases:

sudo apt-get update
sudo apt-get install mysql-server

2. Do a manual test to see if you can back up MySQL databases from your remote host:

sudo mkdir -p /mnt/backup/test/databases

sudo mysqldump --host ip_address_of_host -u username -pyourpassword database_name > /mnt/backup/test/databases/database_name.sql

On my setup, I had to specifically allow remote MySQL connections from my ISP in cPanel, otherwise I could not reach the database from outside. Note there is no space between the -p option and the password.


Prepare ssh connection from pi to remote host


We will automate the backup procedure and we want it to be fully automatic, so we do not want to enter the SSH password every time a connection is made. Proceed as follows to set up a passwordless SSH connection:

---> The following is from this page: http://www.linuxproblem.org/art_9.html. I add the text here with full credit, as this seems like a very, very old page and I would not like to lose this information for future applications.

SSH login without password

Your aim

You want to use Linux and OpenSSH to automate your tasks. Therefore you need an automatic login from host A / user a to Host B / user b. You don't want to enter any passwords, because you want to call ssh from within a shell script.

How to do it

First log in on A as user a and generate a pair of authentication keys. Do not enter a passphrase:
a@A:~> ssh-keygen -t rsa
Generating public/private rsa key pair.
Enter file in which to save the key (/home/a/.ssh/id_rsa): 
Created directory '/home/a/.ssh'.
Enter passphrase (empty for no passphrase): 
Enter same passphrase again: 
Your identification has been saved in /home/a/.ssh/id_rsa.
Your public key has been saved in /home/a/.ssh/id_rsa.pub.
The key fingerprint is:
3e:4f:05:79:3a:9f:96:7c:3b:ad:e9:58:37:bc:37:e4 a@A
Now use ssh to create a directory ~/.ssh as user b on B. (The directory may already exist, which is fine):
a@A:~> ssh b@B mkdir -p .ssh
b@B's password: 
Finally append a's new public key to b@B:.ssh/authorized_keys and enter b's password one last time:
a@A:~> cat .ssh/id_rsa.pub | ssh b@B 'cat >> .ssh/authorized_keys'
b@B's password: 
From now on you can log into B as b from A as a without password:
a@A:~> ssh b@B


The bash file


Use any variation of the following (the paths, hosts, credentials and database names below are placeholders; replace them with your own):


#!/bin/bash
# Remote connection details (placeholders -- replace with your own)
REMOTE_USER=username
REMOTE_HOST=remote_server
SSH_PORT=2222

# MySQL connection details (placeholders -- replace with your own)
DBHOST=ip_address_of_host
DBUSER=username
DBPASS=yourpassword

#Create folder under /mnt/backup with a name of the format 10.20.2015_23.59.00
DATE=`date +%m.%d.%Y_%H.%M.%S`

#Backup files
sudo mkdir -p /mnt/backup/$DATE
sudo chown pi /mnt/backup/$DATE
mkdir -p /mnt/backup/$DATE/log
mkdir -p /mnt/backup/$DATE/databases
touch /mnt/backup/$DATE/log/backup.log

date >> /mnt/backup/$DATE/log/backup.log
echo "******************************Starting..." >> /mnt/backup/$DATE/log/backup.log

#############################################START BACKUP PROCEDURES####################################
#Backups on Saturday (list the remote folders to back up weekly):
if [[ $(date +%u) -eq 6 ]]; then
        declare -a arr=("weekly_folder_one" "weekly_folder_two")
        for i in "${arr[@]}"; do
                echo "BACKUP OF $i" >> /mnt/backup/$DATE/log/backup.log
                #reuse the top-level folder name as the local destination
                IFS=/ read -a path <<< "$i"
                rsync -arv -e "ssh -p $SSH_PORT" $REMOTE_USER@$REMOTE_HOST:/$i/ /mnt/backup/$DATE/${path[0]}/ >> /mnt/backup/$DATE/log/backup.log
        done
else
        echo "Not Saturday. Weekly backup not performed." >> /mnt/backup/$DATE/log/backup.log
fi

#Backups daily (list the remote folders to back up every day):
declare -a arr=("daily_folder_one" "daily_folder_two")
for i in "${arr[@]}"; do
        echo "BACKUP OF $i" >> /mnt/backup/$DATE/log/backup.log
        IFS=/ read -a path <<< "$i"
        rsync -arv -e "ssh -p $SSH_PORT" $REMOTE_USER@$REMOTE_HOST:/$i/ /mnt/backup/$DATE/${path[0]} >> /mnt/backup/$DATE/log/backup.log
done

#Weekly database backup on Saturday
if [[ $(date +%u) -eq 6 ]]; then
        sudo mysqldump --host $DBHOST -u $DBUSER -p$DBPASS weekly_database > /mnt/backup/$DATE/databases/weekly_database.sql
else
        echo "Not Saturday. Weekly database backup not performed." >> /mnt/backup/$DATE/log/backup.log
fi

#Daily database backup
sudo mysqldump --host $DBHOST -u $DBUSER -p$DBPASS daily_database > /mnt/backup/$DATE/databases/daily_database.sql

#Delete folder of name backup_XXXXX older than XXXX months
#coming later

#Send report email
#coming later

echo "******************************Finished..." >> /mnt/backup/$DATE/log/backup.log
date >> /mnt/backup/$DATE/log/backup.log
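The IFS=/ read trick in the script splits a remote path on slashes so the top-level folder name can be reused as the local destination. A quick illustration with a made-up path (requires bash for read -a and <<<):

```shell
# Split "www/site/uploads" on "/" into an array; element 0 is the
# top-level folder name the script uses locally
IFS=/ read -a path <<< "www/site/uploads"
echo "${path[0]}"   # -> www
```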

This file will do a full backup of certain files on Saturday and certain files daily, and the same for the databases. It will create a folder named with the date and time at which it runs, and will create a log file to store debug information.
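The folder-pruning step the script marks as "coming later" can be sketched with find. A minimal sketch, assuming the dated backup folders sit directly under the backup root (demonstrated here against a throwaway directory, since touch can backdate a folder's modification time):

```shell
# Throwaway backup root with one old and one recent folder
BACKUP_ROOT=/tmp/prune_demo
mkdir -p "$BACKUP_ROOT/old_backup" "$BACKUP_ROOT/new_backup"
touch -d "100 days ago" "$BACKUP_ROOT/old_backup"

# Stay at the top level and match folders older than ~3 months (90 days);
# -print previews the candidates -- replace it with "-exec rm -rf {} +"
# once the list looks right
find "$BACKUP_ROOT" -mindepth 1 -maxdepth 1 -type d -mtime +90 -print
```

Only the backdated folder should be listed.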

Ensure that the file runs manually (for example, if it's called backup.sh):

chmod +x /home/pi/backup.sh
/home/pi/backup.sh



Make this an automatically scheduled cron job


Run the following command to edit your crontab:

crontab -e

And enter the following line (this will run the file every day at 2:00 am; of course you can set it to match your requirements):

0 2 * * * /home/pi/backup.sh
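For reference, the five fields before the command are minute, hour, day of month, month and day of week (this is the standard crontab layout):

```
# minute  hour  day-of-month  month  day-of-week  command
0 2 * * * /home/pi/backup.sh
```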


Set up FTP access to your Pi


I need to be able to access the files from computers on my LAN. One way that works for me is to set up an FTP server on the Pi with the connection folder being /mnt/backup. Every time I FTP to the Pi, I am sent to the backup folder, where I can see all the daily folders and get what I am interested in.

To install the FTP server:

sudo apt-get install vsftpd

Configure the FTP server to not allow anonymous logins and to redirect connecting users to the /mnt/backup directory:

sudo nano /etc/vsftpd.conf

Edit the configuration to match your case; the relevant settings (with their stock comments) look like this:

# Allow anonymous FTP? (Beware - allowed by default if you comment this out).
anonymous_enable=NO
# Uncomment this to allow local users to log in.
local_enable=YES
# Uncomment this to enable any form of FTP write command.
write_enable=YES
# Default umask for local users is 077. You may wish to change this to 022,
# if your users expect that (022 is used by most other ftpd's)
local_umask=022
# Activate directory messages - messages given to remote users when they
# go into a certain directory.
dirmessage_enable=YES
# Activate logging of uploads/downloads.
xferlog_enable=YES
# Make sure PORT transfer connections originate from port 20 (ftp-data).
connect_from_port_20=YES
# If you want, you can have your log file in standard ftpd xferlog format
xferlog_std_format=YES
# Lock local users into their local_root and send them to the backup folder
chroot_local_user=YES
local_root=/mnt/backup/
# enable for standalone mode
listen=YES

I ran into a few issues with the FTP server. Namely, I had to create the vsftpd.user_list file:

sudo touch /etc/vsftpd.user_list

Then I had to remove write permission from the backup folder itself (vsftpd refuses chrooted logins into a writable root directory):

sudo chmod a-w /mnt/backup


Things to do


I'm happily retrieving my data everyday but there's a couple of things I need to improve. These may be the subject of another post if there's any interest:

1. Receive report email

I have not spent much time reading it, but this page seems to show the way to have email sent from the Pi: http://www.sbprojects.com/projects/raspberrypi/exim4.php

2. Spin-down my external drive

My drive seems to be always spinning, even when it is not doing anything (which is most of the time, actually), so I would like to save power and operating hours by having it go into sleep mode or something like that. I am not sure that can be achieved with an external USB drive, but the following pages are good places to start:

Good Luck! 

Saturday, September 6, 2014

Weather station github page

I'm actively updating all the code and libraries for the new weather station project. I'm collecting all the old information for historic purposes, as follows:

1. github project pages are here: https://github.com/RuiAlves/weather

2. Original SHT1x arduino library: https://www.dropbox.com/s/12cpvenrntuworb/SHT1x-master_orig.zip?dl=0

***NOTE: extract the library above, rename the extracted folder from "SHT1x-master" to "SHT1" and copy it into the libraries folder of your Arduino IDE installation.

3. Original Timer 1 arduino library: https://www.dropbox.com/s/plrpx35agdubf74/TimerOne-r11.zip?dl=0

***NOTE: extract the library above, create a folder called "Timer1" and copy it to the libraries folder of your Arduino IDE installation.

You will need to shut down and restart the IDE for the libraries to be ready for use by the system.

Once you do this with the original IDE 0017 code, code verification fails on the creation of an instance of Webserver. No doubt this is now done another way; I have to investigate how much the HTTP library has changed in the last few years. To work!

More information will be posted as it becomes available.

Sunday, August 31, 2014

Personal Weather Station V2 - Revamping an old project

After more than 4 years of reliable operation, I decided to retire my weather station... No, I did not: I decided it was time for an upgrade!!!

Inside the weather station controller enclosure - 4 years after. You have served me well!

The weather station project was one of the most interesting projects I ever did and I miss all the excitement that went along with it at the time. Even now it's one of the most visited projects on my site (right after the wifi speakers project).

I have recently come across a good use for the data the weather station collects: I now live near the sea, and being the outdoorsman that I am, I like to be out as much as I can, so the current conditions are very important to know before going out. Yeah, I could just look out the window, but what's the fun in that? There is actually another thing the weather data is important for: all my friends who do not live nearby but still come over frequently to enjoy the beauties of the sea. It would be great to provide an up-to-date weather view along with a camera feed of the current sea or beach conditions.

As such, the main goals of this project are:

1. Use as much of the existing weather station hardware as possible (might be a challenge, as a lot of the equipment is already 4 years old and it shows!)
The weather station sensors as they stand after 4 years of reliable data gathering (note the wind speed sensor with a broken segment...). A little more rust than when I got it, but everything still works well.
2. Eliminate the need for a dedicated network (I currently have a separate network for the weather station; I would like to install a wireless bridge to connect the weather station to a single network).
A refurbished and quite old Buffalo wifi router will be our wireless bridge.

3. Move data storage to my own cloud server instead of my computer or ThingSpeak.
4. Provide a connection page for all users with the current weather data, historical weather data and a camera feed.
5. Collect, store and display the following data:
  • Temperature
  • Relative humidity
  • Wind direction (will not be perfect because of installation restrictions)
  • Wind speed (will not be perfect because of installation restrictions)
  • Luminosity
  • Dust/particle levels
  • Provide camera feed of current conditions
6. Technically, I will use the following equipment, software and tools:
The Sharp GP2Y10 particle sensor.
  • [hardware] An old Buffalo router (model WZR2-G300N) as network bridge, whose stock firmware I intend to replace with
  • [software] DD-WRT, to allow for more advanced features. Apparently this router is not officially supported by the open-source firmware, but some files were made available and users have reported successful firmware changes with them. I will make the files available soon, as I had to do some digging around to get them. The instructions for what needs to be done are here - look for the post dated Fri Jul 30, 2010 11:14 by REPASSAC.
I will use this blog as a data collection tool for me and for other users who may need the information I collect as I go (like the software I use, all the code I develop, and equipment datasheets).

I will usually work on this on Sundays, so don't expect posts more frequently than that.