Check for Hosting Server Load with Linux (SSH) Command ‘uptime’

Some hosting providers are infamous for overselling, stuffing as many users (websites) as possible onto a single web hosting server. Server load is an indicator of how hard your server is working and whether it is laboring so much that the performance of your websites is at risk. You can check your server's load averages for the last 1, 5 and 15 minutes with the simple Linux command below (via SSH):

uptime

This will typically return a line of output similar to this:

21:39:33 up 10:45, 3 users, load average: 4.46, 3.92, 3.64

This says there are currently 3 users logged in, and that the load averages of this server over the last minute, the last 5 minutes and the last 15 minutes are 4.46, 3.92 and 3.64 respectively. These figures represent the average number of runnable processes competing for the CPUs at any one time. Divide them by the number of processors in the server and you know how much work each single CPU is being asked to do.

Since a CPU core can only run one process at a time, a per-core load above 1 means processes are waiting in the queue – in other words, the server is overloaded. So if your hosting server has 4 processors, over the last minute it was overloaded by ( 4.46 / 4 ) × 100% − 100% = 11.5%.
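If you would rather not do the arithmetic by hand, a small one-liner can compute the per-core load for you. A minimal sketch, assuming a modern Linux system where /proc/loadavg and the nproc utility are available:

# Print the 1-minute load average divided by the number of CPU cores
awk -v cores="$(nproc)" '{ printf "load per core: %.2f\n", $1 / cores }' /proc/loadavg

Any result above 1.00 means processes are queuing for CPU time.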

Extracting and creating archive files

Sometimes you will need to extract or create an archive file (e.g. to install a script, you usually download an archive and extract it to continue the installation). The very first step in the process is to identify the exact archive type by looking at the file extension. The most common archive types are Zip (ending in .zip), Tar (.tar), Tar+Gzip (.tar.gz), Bzip2 (.bz2) and Rar (.rar).

Each archive type has its own command for compressing/extracting as listed below.

To extract a ZIP file, please use:

unzip archive.zip

To extract a Tar file, please use:

tar -xvf archive.tar

To extract a Tar.Gz file, please use:

tar -zxvf archive.tar.gz

To extract a Rar file, please use:

rar x archive.rar
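Before extracting an archive from an unfamiliar source, it is often worth listing its contents first so you know where the files will land. Assuming the same tools as above, the -t flag for tar and the -l flag for unzip do this without writing anything to disk:

tar -ztf archive.tar.gz   # list the contents of a tar.gz archive
unzip -l archive.zip      # list the contents of a zip archive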

Each archive type also has its own command for creating a new archive file. The most commonly used format, however, is tar.gz. An example of creating such a file can be seen below:

tar -zcf archive-name.tar.gz foldername/

Source: http://www.siteground.com/tutorials/ssh/ssh_extracting.htm

SSH Commands

Log in to SSH using a terminal

ssh username@example.com -p 22456
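If you connect to the same server often, you can save yourself from retyping the user name and the non-standard port by adding an entry to ~/.ssh/config (a sketch with placeholder names; substitute your own host, user and port):

# ~/.ssh/config
Host myserver
    HostName example.com
    User username
    Port 22456

After that, typing ssh myserver is enough to log in.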

Extract a tar.bz2 file

tar jxf file.tar.bz2

Backing up Database: The command for backing up your database is the following:

mysqldump -u username -p -hServerIP database_name > /path/backup.sql

Type the above command with your own username, database name and the path where the dump should be saved, press Enter, and you will be prompted for your database user's password.

Restoring your Database: The command for restoring your database is the following:

mysql -u username -pPassword -hServerIP database_name < path/db.sql

Note the lack of a space between -p and Password. Also, the password here is the one for the MySQL database user.

Compressed dumps/restores: In order to restore a compressed database, you will have to uncompress the file first. The following command combines both the steps of uncompressing your file and transferring its contents to a database:

gunzip < backup.sql.gz | mysql -u blarg -pPassword db_name

If you want to compress the data from a MySQL database directly, without having to gzip it separately, enter the following command:

mysqldump -u username -p -hServerIP db_name | gzip > backup.sql.gz

After you type the above and press enter, you should be prompted for your password. Enter your password, and you should be all set.
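Before relying on a compressed dump, it is worth verifying that the gzip file is intact and peeking at its first lines; neither command unpacks the file to disk:

gzip -t backup.sql.gz                  # test the archive's integrity (no output means it is OK)
gunzip -c backup.sql.gz | head -n 20   # preview the first lines of the dump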

Compress/Uncompress: If you want to compress your existing .sql file, you will need to use the following command:

gzip -X path/to/backup.sql

In the above command, X is a number between 1 and 9 that specifies the level of compression used. The higher your number, the more compressed (smaller) your file will be.

For example, this is the command I use to compress my data:

gzip -9 forum_backup.sql
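If you are curious what the level actually buys you, compress a copy of the same dump at both extremes and compare the sizes. The -c flag writes to standard output, so the original file is left untouched:

gzip -1 -c forum_backup.sql > fast.sql.gz    # fastest, least compression
gzip -9 -c forum_backup.sql > small.sql.gz   # slowest, best compression
ls -lh fast.sql.gz small.sql.gz              # compare the resulting file sizes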

Likewise, to uncompress a compressed file, you will use the following command:

gunzip backup.sql.gz
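Note that gunzip deletes the .gz file once it has been extracted. To keep the compressed copy, write the output to a new file instead:

gunzip -c backup.sql.gz > backup.sql   # extract while keeping backup.sql.gz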

Cron Job: Delete Files & Directories Older Than x Days or Minutes on Linux

The find utility on Linux accepts a number of useful arguments, including one that executes another command on each file found. We'll use this to identify files older than a certain number of days, and then use the rm command to delete them.

find /path/to/files* -mtime +5 -exec rm {} \;

Example:

find /home/ACCOUNT-USER/public_html/test-delete/* -mtime +1 -exec rm {} \;

Note that there are spaces between rm, {}, and \;

Explanation

* The first argument is the path to the files. This can be a path, a directory, or a wildcard as in the example above. Use the full path, and run the command without the -exec rm part first to make sure it matches the right files (see the dry run below).
* The second argument, -mtime, specifies the age of the file in days. +5 finds files modified more than 5 days ago.
* The third argument, -exec, lets you pass in a command such as rm. The {} is replaced with each file name, and the \; is required to terminate the command.
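As recommended above, do a dry run first: swap -exec rm {} \; for -print (or simply omit the -exec part, since printing is find's default action) and check that only the files you expect show up:

# Dry run: list what would be deleted without removing anything
find /home/ACCOUNT-USER/public_html/test-delete/* -mtime +1 -print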

This should work on Ubuntu, SUSE, Red Hat, or pretty much any Linux distribution.

To force-delete directories as well, use ‘rm -fr’ instead of plain ‘rm’: -f forces deletion and -r recurses into all subdirectories.

find also accepts a -type argument: ‘find -type f’ matches only files, ‘find -type d’ matches directories, and ‘find -type l’ matches symbolic links.
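Combining -type with the deletion makes for a safer pattern: remove old regular files first, then clean up any directories left empty. A sketch assuming GNU find, whose -empty and -mindepth options may not exist on other implementations:

# Delete regular files older than 5 days, then remove now-empty subdirectories
find /path/to/files -type f -mtime +5 -exec rm {} \;
find /path/to/files -mindepth 1 -type d -empty -exec rmdir {} \;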

To work in minutes instead of days, use -mmin instead of -mtime; the following deletes everything modified more than a minute ago:

find /home/ACCOUNT-USER/public_html/test-delete/* -mmin +1 -exec rm {} \;

Source:
http://www.howtogeek.com/howto/ubuntu/delete-files-older-than-x-days-on-linux/

Cron Job: to back up a database

DATE=`date +\%d-\%m-\%y-\%H:\%M:\%S`; /usr/bin/mysqldump -h localhost -u USERNAME -pPASSWORD DATABASENAME | gzip > /home/ACCOUNT-USER/mysql_backups/DATABASENAME-${DATE}.sql.gz

Where
USERNAME -> Database username
PASSWORD -> Database password
DATABASENAME -> Database name

Note:
Don't remove the “-p”, and don't put a space after it. Say your database name is “mydatabase”, the password is “1234abcd” and the username is “myname”; the command would look like this:

DATE=`date +\%d-\%m-\%y-\%H:\%M:\%S`; /usr/bin/mysqldump -h localhost -u myname -p1234abcd mydatabase | gzip > /home/ACCOUNT-USER/mysql_backups/mydatabase-${DATE}.sql.gz
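To actually schedule this, put the whole line into your crontab with a schedule in front of it. A sketch assuming you want the dump taken every night at 3:00 a.m. (the \% escapes are required because cron treats a bare % as a newline):

# Open the crontab for editing with: crontab -e
0 3 * * * DATE=`date +\%d-\%m-\%y-\%H:\%M:\%S`; /usr/bin/mysqldump -h localhost -u myname -p1234abcd mydatabase | gzip > /home/ACCOUNT-USER/mysql_backups/mydatabase-${DATE}.sql.gz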

Change default directory page

You may have wondered how the web server decides which page from your site to use as the main/default page. A directive named DirectoryIndex takes care of this.

On most web servers there is a pre-defined set of file names that serve as the start page.
The most commonly used are: index.html, default.html, index.php, index.asp, etc.

The good news is that you can set a custom file as the start page of your site using .htaccess.

For example, the following line sets home-page.html as the main page of your site:

DirectoryIndex home-page.html

The DirectoryIndex directive can accept more than one name:

DirectoryIndex home-page.html Home.html index.html index.php index.cgi

So when a visitor goes to http://www.example.com, the first page the server tries to load is home-page.html; if it cannot be found, the server looks for Home.html, then index.html, and so on until it finds a match.

Prevent Script Execution

You can disable scripts being run in the directory of your choice by adding the following code to your .htaccess file in that directory:

Options -ExecCGI
AddHandler cgi-script .php .php3 .php4 .php5 .phtml .pl .py .jsp .asp .htm .html .shtml .sh .cgi .xml

You can replace the file types in the example with the file types you wish to disallow.

This would be particularly useful if you allow visitors to upload files to your server, but want to be sure that any potentially harmful files they upload are not allowed to execute.
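An alternative, if your host runs Apache 2.4, is to refuse to serve those file types at all rather than just disabling execution. A sketch using a FilesMatch block in the same .htaccess file (adjust the extension list to suit):

<FilesMatch "\.(php|pl|py|cgi|sh)$">
    Require all denied
</FilesMatch>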

