Posts Tagged ‘CLI’

Plesk AWStats: view statistics for all domains

June 8th, 2010

By default, Plesk lets you view AWStats statistics for each domain separately at http://<yourdomain>/webstat , and you must enter that domain's FTP password there. This is inconvenient when your server hosts more than one domain.

So, our task is to create superadmin access to AWStats: viewing statistics for all domains with a single password.

First of all, find your cgi-bin directory. It is usually located next to your vhosts directory; in my case it's /srv/www/cgi-bin

Put two files there. The first is an .htaccess that password-protects the directory:


AuthUserFile /srv/www/cgi-bin/.htpasswd
AuthName "AWStats"
AuthType Basic
require valid-user

The second file is the Perl CGI script itself, which prints links to every domain's statistics:

#!/usr/bin/perl

use strict;
use warnings;

use CGI qw(:standard);

print header;
print start_html(-title => 'AWStats');

# path depends on your system
opendir (DIRH, '/usr/local/psa/etc/awstats') || die "Cannot open awstats config dir: $!";

print '<div align="center">';

my $file; my @files;
# awstats config files are named like awstats.<yourdomain>-http.conf
# if you also need ftp stats etc., modify my code
while ($file = readdir DIRH) {
 $_ = $file; # my perl doesn't work without assigning the result of readdir
 next unless m/\-http\.conf$/;
 push @files, $file;
}
closedir DIRH;

for $file (sort @files) {
 # derive the domain name from the config file name
 my ($domain) = $file =~ m/^awstats\.(.*)\-http\.conf$/;
 # the href assumes awstats.pl is reachable in the same cgi-bin dir;
 # adjust it to match how awstats.pl is invoked on your server
 print "<a target='_blank' href='awstats.pl?config=$domain-http'>$domain</a><br>";
}

print '</div>';
print end_html;

Don’t forget to chmod +x the script.

To finish with this dir, create the .htpasswd file:

htpasswd -c .htpasswd <username>

Lastly, you need to configure the Apache web server.

Put the following into /etc/apache2/conf.d/awstats.conf (or create a new file):

ScriptAlias /awstats-full /srv/www/cgi-bin
Alias  /awstats-icon /usr/share/apache2/icons/awstats

<Directory /srv/www/cgi-bin>
 AllowOverride All
</Directory>

Please check all paths; they depend on your system. For example, the Apache config dir may be /etc/httpd/

Finally, reload Apache gracefully (recommended):

apachectl -k graceful

or do a full restart:

apachectl -k restart

Now you can see statistics for all domains at http://<your ip>/awstats-full/

Categories: Linux, Plesk

MySQL: back up databases into separate files by weekday with compression

May 15th, 2010

I found a script on the Net that backs up each database into its own file, and modified it: I added gzip compression and per-weekday backups. You can run this job daily from cron and get a separate backup for each weekday. It's a very flexible backup scheme.

The script is below:


#!/bin/bash
# This script backs up every MySQL database to its own file

# Some variables you can set how you like
USER='<possible root>'
PASSWORD='<password>'                    # set your MySQL password here
DAYOFWEEK=`/bin/date +"%w"`
OUTPUTDIR="/usr/backup/mysql/$DAYOFWEEK" # backup dir
MYSQLDUMP='/usr/local/bin/mysqldump'     # path to mysqldump, may be /usr/bin/mysqldump
MYSQL='/usr/local/bin/mysql'             # path to mysql, may be /usr/bin/mysql

# Create the backup dir if needed, then clean up any old backups
mkdir -p $OUTPUTDIR
rm -f $OUTPUTDIR/*

# Get a list of database names except the system one
databases=`$MYSQL --user=$USER --password=$PASSWORD -e 'SHOW DATABASES;' | grep -Ev '(Database|information_schema)'`

# Dump each database in turn and compress the output with gzip
for db in $databases; do
  $MYSQLDUMP --opt --hex-blob --force --user=$USER --password=$PASSWORD $db | gzip > $OUTPUTDIR/$db.gz
done
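To run the job daily, a crontab entry like the following would do; the script path here is just an assumption, use wherever you saved it:

```
# run the MySQL backup script every night at 03:30
30 3 * * * /root/scripts/mysql-backup.sh >/dev/null 2>&1
```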

Download script (800 kB)

Categories: Linux, MySQL

Manage Services/Daemons in Fedora Core, CentOS, FreeBSD, Debian, Ubuntu

January 25th, 2010

Here is my small manual for Linux administrators on how to configure services/daemons with convenient tools in the most popular Linux-family OSes.

Manage Services/Daemons in Fedora Core / CentOS / Red Hat

After you have created your own service and placed it in the /etc/init.d/ directory, you need to set the runlevels for it.

To control services either use

chkconfig
ntsysv

if you are using the command line, or use

system-config-services

in the GUI. Gnome users: System > Administration > Server Settings > Services.
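As a quick sketch, registering and enabling a hypothetical init script named myservice from the command line looks like this:

```
chkconfig --add myservice          # register the /etc/init.d/myservice script
chkconfig --level 35 myservice on  # enable it in runlevels 3 and 5
chkconfig --list myservice         # verify the result
```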

Manage Services/Daemons in Debian / Ubuntu

For Debian I recommend using the sysvconfig package.

To install do the following:

apt-get install sysvconfig
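After installation, simply run sysvconfig to get a menu-driven service editor. Under the hood, runlevel links are managed by update-rc.d; a sketch (apache2 is just an example service name):

```
sysvconfig                    # interactive curses menu for enabling/disabling services
update-rc.d apache2 defaults  # create start/stop links in the default runlevels
```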


Manage Services/Daemons in FreeBSD

Unfortunately, FreeBSD has no such comfortable utility to manage services. You need to edit
/etc/rc.conf manually.
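A minimal sketch of what that looks like, using sshd as an example:

```
# /etc/rc.conf
sshd_enable="YES"    # start sshd at boot
```

After editing rc.conf you can start the service immediately with /etc/rc.d/sshd start.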

Categories: Linux

Monitor changed files on Linux using find command and XML+XSLT

September 25th, 2009

Once I decided to write my own monitor for updated files on my Linux server. I chose XML files as storage, with Bash scripts and cron as the monitor.

The Bash script that generates an XML file listing the files changed during the last day looks like this:

#!/bin/bash
echo '<?xml version="1.0" encoding="utf-8"?>'
echo "<?xml-stylesheet type='text/xsl' href='template.xsl'?>"
echo '<files>'
# filter.pl is the Perl filter script below (use whatever name you saved it under)
find /var/www/vhosts/ -mtime -1 -print | /var/www/newfiles/filter.pl
echo '</files>'

Additionally, I used a filter to exclude log files, directories, etc.:

#!/usr/local/bin/perl -w

use strict;
use warnings;

use POSIX qw(locale_h strftime);
use File::stat;

while (my $filename = <>) {
 chomp $filename;

 if (length($filename) &&
     $filename !~ m/webstat(\-ssl)?/ &&
     $filename !~ m#/statistics/webstat(\-ssl)?/# &&
     $filename !~ m#/statistics/ftpstat/# &&
     $filename !~ m#/statistics/logs# &&
     $filename !~ m#/templates_c$# &&
     $filename !~ m#/statistics/(anon_)?ftpstat# &&
     ! (-d $filename)
    ) {
    my $sb = stat($filename) or next; # the file may have vanished since find ran
    print "\t<file>\n";
    print "\t\t<name>", $filename, "</name>\n";
    print "\t\t<mtime>", strftime("%a %b %e %H:%M:%S %Y", localtime $sb->mtime), "</mtime>\n";
    print "\t\t<owner>", (getpwuid($sb->uid))[0], "</owner>\n";
    print "\t\t<group>", (getgrgid($sb->gid))[0], "</group>\n";
    print "\t\t<size>", $sb->size, "</size>\n";
    print "\t\t<mode>", sprintf("%04o", $sb->mode & 07777), "</mode>\n";
    print "\t</file>\n";
 }
}


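Together, the two scripts produce an XML document shaped roughly like this (the element names are the ones the XSL template expects; the values here are made up):

```xml
<?xml version="1.0" encoding="utf-8"?>
<?xml-stylesheet type='text/xsl' href='template.xsl'?>
<files>
	<file>
		<name>/var/www/vhosts/example.com/httpdocs/index.php</name>
		<mtime>Fri Sep 25 10:15:42 2009</mtime>
		<owner>apache</owner>
		<group>apache</group>
		<size>1234</size>
		<mode>0644</mode>
	</file>
</files>
```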
Bash-script for cron:

#!/bin/bash
cd /var/www/newfiles/
dd=`date "+%Y-%m-%d"`
# generate.sh is the XML generator script above (use your actual name)
./generate.sh | gzip > "$dd.xml.gz"
echo "$dd"

Note: I used gzip compression to save disk space.

For formatting the output I used an XSL template. For better usability I added the jQuery plugin Tablesorter, which lets you sort the table data by clicking a column header. XSL template source:

<html xsl:version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform" xmlns="http://www.w3.org/1999/xhtml">
<head>
<link rel="stylesheet" type="text/css" href="themes/style.css" media="screen"/>
<link rel="stylesheet" type="text/css" href="styles.css" media="screen"/>
</head>
<body>
<div align="center"><a href="index.php">back</a></div>
<table id="myTable">
<thead>
<tr><th>Name</th><th>Modify time</th><th>Owner</th><th>Size</th><th>Mode</th><th>Group</th></tr>
</thead>
<tbody>
<xsl:for-each select="files/file">
<tr>
<td><xsl:value-of select="name"/></td>
<td nowrap="nowrap"><xsl:value-of select="mtime"/></td>
<td nowrap="nowrap"><xsl:value-of select="owner"/></td>
<td nowrap="nowrap"><xsl:value-of select="size"/></td>
<td nowrap="nowrap"><xsl:value-of select="mode"/></td>
<td nowrap="nowrap"><xsl:value-of select="group"/></td>
</tr>
</xsl:for-each>
</tbody>
</table>
<div align="center"><a href="index.php">back</a></div>
<script type="text/javascript" language="javascript" src="js/jquery.js"></script>
<script type="text/javascript" language="javascript" src="js/jquery.tablesorter.js"></script>
<script type="text/javascript">
$(document).ready(function() {
  $("#myTable").tablesorter();
});
</script>
</body>
</html>

And finally, the index.php script code:

<?php
ob_start();
header('Content-type: text/html; charset=utf-8');

if (!isset($_GET['date']) && !isset($argv[1])) {
 echo '<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0 Transitional//EN">
<meta http-equiv="Content-Type" content="text/html; charset=utf-8">
<title>Modified files</title>
<link rel="stylesheet" type="text/css" href="styles.css">
<body><div align="center">';
 if ($handle = opendir(getcwd()))  {
  $files = array();

  while (false !== ($file = readdir($handle))) {
   if (preg_match('/^\d{4}\-\d{2}\-\d{2}\.xml\.gz$/',$file)) {
    $files[] = $file;
   }
  }
  closedir($handle);
  sort($files);                   // chronological order...
  $files = array_reverse($files); // ...newest first

  foreach($files as $file) {
   $d = preg_replace('/\.xml\.gz$/','', $file);
   printf('<a href="?date=%s">%s</a><br>', $d, dateToSovok($d));
  }
 }
 echo '</div></body></html>';
} else {
 $date = isset($_GET['date']) ? $_GET['date'] : $argv[1];

 if (!is_file($date.'.xml.gz')) {
  header('Location: index.php');
  exit;
 }

 // Load the XML source (the daily archives are gzipped; see gzdecode() below)
 $xml = new DOMDocument;
 $xml->loadXML(gzdecode(file_get_contents($date.'.xml.gz')));

 $xsl = new DOMDocument;
 $xsl->load('template.xsl');

 // Configure the transformer
 $proc = new XSLTProcessor;
 $proc->importStyleSheet($xsl); // attach the xsl rules

 $doc = $proc->transformToDoc($xml);
 echo $doc->saveHTML();
}

function dateToSovok($dt) {
 $pos1 = strpos($dt,'-');
 $pos2 = strrpos($dt,'-');

 $year  = substr($dt, 0, $pos1);
 $month = substr($dt, $pos1 + 1, $pos2 - $pos1 - 1);
 $day   = substr($dt, $pos2 + 1);
 return ($day.".".$month.".".$year);
}

$content = ob_get_clean();

if (function_exists('gzencode') && ($encoding = checkCanGzip())) {
 header("Content-Encoding: ".$encoding);
 echo gzencode($content . '<!-- gzencoded -->', 6);
} else {
 echo $content . '<!-- without compression -->';
}

/*  ------------------------------------------------------------ */

function checkCanGzip() {
 if (!isset($_SERVER['HTTP_ACCEPT_ENCODING'])) return 0;
 if (strpos($_SERVER['HTTP_ACCEPT_ENCODING'], 'x-gzip') !== false) return "x-gzip";
 if (strpos($_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip') !== false) return "gzip";
 return 0;
}

// NOTE: PHP 5.4+ ships a built-in gzdecode(); there, wrap this
// definition in a function_exists('gzdecode') check.
function gzdecode($data) {
 $len = strlen($data);
 if ($len < 18 || strcmp(substr($data,0,2),"\x1f\x8b")) {
  return null;  // Not GZIP format (See RFC 1952)
 }
 $method = ord(substr($data,2,1));  // Compression method
 $flags  = ord(substr($data,3,1));  // Flags
 if (($flags & 31) != $flags) {
  // Reserved bits are set -- NOT ALLOWED by RFC 1952
  return null;
 }
 // NOTE: $mtime may be negative (PHP integer limitations)
 $mtime = unpack("V", substr($data,4,4));
 $mtime = $mtime[1];
 $xfl   = substr($data,8,1);
 $os    = substr($data,9,1);
 $headerlen = 10;
 $extralen  = 0;
 $extra     = "";
 if ($flags & 4) {
  // 2-byte length prefixed EXTRA data in header
  if ($len - $headerlen - 2 < 8) {
   return false;    // Invalid format
  }
  $extralen = unpack("v",substr($data,10,2));
  $extralen = $extralen[1];
  if ($len - $headerlen - 2 - $extralen < 8) {
   return false;    // Invalid format
  }
  $extra = substr($data,12,$extralen);
  $headerlen += 2 + $extralen;
 }

 $filenamelen = 0;
 $filename = "";
 if ($flags & 8) {
  // C-style string file NAME data in header
  if ($len - $headerlen - 1 < 8) {
   return false;    // Invalid format
  }
  $filenamelen = strpos(substr($data,$headerlen),chr(0));
  if ($filenamelen === false || $len - $headerlen - $filenamelen - 1 < 8) {
   return false;    // Invalid format
  }
  $filename = substr($data,$headerlen,$filenamelen);
  $headerlen += $filenamelen + 1;
 }

 $commentlen = 0;
 $comment = "";
 if ($flags & 16) {
  // C-style string COMMENT data in header
  if ($len - $headerlen - 1 < 8) {
   return false;    // Invalid format
  }
  $commentlen = strpos(substr($data,$headerlen),chr(0));
  if ($commentlen === false || $len - $headerlen - $commentlen - 1 < 8) {
   return false;    // Invalid header format
  }
  $comment = substr($data,$headerlen,$commentlen);
  $headerlen += $commentlen + 1;
 }

 $headercrc = "";
 if ($flags & 1) {
  // 2-bytes (lowest order) of CRC32 on header present
  if ($len - $headerlen - 2 < 8) {
   return false;    // Invalid format
  }
  $calccrc = crc32(substr($data,0,$headerlen)) & 0xffff;
  $headercrc = unpack("v", substr($data,$headerlen,2));
  $headercrc = $headercrc[1];
  if ($headercrc != $calccrc) {
   return false;    // Bad header CRC
  }
  $headerlen += 2;
 }

 // GZIP FOOTER - These may be negative due to PHP's limitations
 $datacrc = unpack("V",substr($data,-8,4));
 $datacrc = $datacrc[1];
 $isize = unpack("V",substr($data,-4));
 $isize = $isize[1];

 // Perform the decompression:
 $bodylen = $len-$headerlen-8;
 if ($bodylen < 1) {
  // This should never happen - IMPLEMENTATION BUG!
  return null;
 }
 $body = substr($data,$headerlen,$bodylen);
 $data = "";
 if ($bodylen > 0) {
  switch ($method) {
  case 8:
   // Currently the only supported compression method:
   $data = gzinflate($body);
   break;
  default:
   // Unknown compression method
   return false;
  }
 } else {
  // I'm not sure if zero-byte body content is allowed.
  // Allow it for now...  Do nothing...
 }

 // Verify decompressed size and CRC32:
 // NOTE: This may fail with large data sizes depending on how
 //       PHP's integer limitations affect strlen() since $isize
 //       may be negative for large sizes.
 if ($isize != strlen($data) || crc32($data) != $datacrc) {
  // Bad format!  Length or CRC doesn't match!
  return false;
 }
 return $data;
}

Download all source codes Monitor changed files on Linux using find command and XML+XSLT (28 kb)

Categories: Linux

Find all large files in Unix / Linux using shell prompt

August 31st, 2009

In Linux it is not always easy to see which files are eating disk space across all folders and subfolders.

You can find all large files (in this example, bigger than 600,000 KB, roughly 600 MB, starting from the root folder) from the shell prompt with this command:

find / -type f -size +600000k -exec ls -l {} \; | awk '{ print $9 ": " $5 }'
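If it's the directories rather than single files you want to measure, a du-based pipeline works too. A sketch, assuming GNU du and sort:

```shell
# sum disk usage per directory (in KB), largest first, top 20
du -xk / 2>/dev/null | sort -rn | head -20
```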

Categories: Linux

Command to list directories only

August 19th, 2009

If you want to list only the directories in the current directory, use this command:

ls -l ./ | grep ^d | awk '{print $9}'

It can be useful as input for another application or command.
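An alternative that doesn't parse ls output (which breaks on filenames containing spaces) is GNU find; a sketch:

```shell
# list only the directories directly under the current directory
find . -mindepth 1 -maxdepth 1 -type d
```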

Categories: Linux

Plesk backup domains from command line (multibackups)

January 18th, 2009

Once I needed to back up all my 200 domains from Plesk, excluding 5-6 of them. There is no way to do this via the Plesk web interface (you can only schedule each(!) domain separately).

So, after a long search, I came up with my own solution.

I wrote a simple bash script:


#!/bin/bash
file=/root/pleskbackup/counter # counter file (adjust the path to where you created it)
maxbackups=2 # 3-1=2, numbering from zero

counter=`head -n 1 $file`
let counter=counter+1
if [ "$counter" -gt "$maxbackups" ]; then
    let counter=0
fi
echo $counter > $file

for domainname in `ls -l /var/www/vhosts/ | grep ^d | awk '{print $9}'`
do
    /usr/local/bin/pleskbackup --domains-name $domainname --output-file=ftp://user:password@server/$domainname-$counter.xml.tar --exclude-domain-file=/root/pleskbackup/excludelist
done

New feature: you can now store more than one backup! Just create a file named "counter" with contents "0". Enjoy!
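Creating the counter file is a one-liner; the path here assumes it sits next to the exclude list, it just has to match the $file variable in the script:

```
echo 0 > /root/pleskbackup/counter
```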

Categories: Linux