Manage Services/Daemons in Fedora Core, CentOS, FreeBSD, Debian, Ubuntu

January 25th, 2010 Comments off

Here is my small guide for Linux administrators on how to configure services/daemons with convenient tools in the most popular Linux-family operating systems.

Manage Services/Daemons in Fedora Core / CentOS / Red Hat

After you have created your own service and placed it in the /etc/init.d/ directory, you need to set the runlevels for it.

To control services, either use

chkconfig

or

ntsysv

if you are using the command line, or use

system-config-services

in the GUI. Gnome users will find it under System > Administration > Server Settings > Services.
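
For example, a typical chkconfig session for a custom init script could look like this (myservice is just a placeholder name, not from the original post):

chkconfig --add myservice          # register the init script /etc/init.d/myservice
chkconfig --level 345 myservice on # enable it for runlevels 3, 4 and 5
chkconfig --list myservice         # verify the current runlevel settings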

Manage Services/Daemons in Debian / Ubuntu

For Debian I recommend the sysvconfig package.

To install it, run:

apt-get install sysvconfig
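
After installation, run sysvconfig to get its menu-driven interface. As an aside (not from the original post), the stock update-rc.d tool can do the same job non-interactively; myservice is again a placeholder name:

sysvconfig                      # interactive, menu-driven service editor
update-rc.d myservice defaults  # enable /etc/init.d/myservice at the default runlevels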

Enjoy!

Manage Services/Daemons in FreeBSD

Unfortunately, FreeBSD has no such convenient utility to manage services: you need to edit /etc/rc.conf manually.
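
As a rough sketch (sshd is used here only as an example service), enabling a daemon means adding a line to /etc/rc.conf and then starting it via its rc.d script:

# in /etc/rc.conf
sshd_enable="YES"

# start the service without rebooting
/etc/rc.d/sshd start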

Categories: Linux Tags: ,

Plesk: RewriteRule to redirect from subdomains mail, ftp, ns1, ns2 to main domain

November 11th, 2009 Comments off

By default Plesk ships with a rather curious configuration: when you try to access a virtual subdomain such as mail.domain.com or ftp.domain.com over HTTP, the browser shows the domain that was added to Plesk last!

To fix it, you’ll need to write an additional Apache .conf file:

<VirtualHost <ip1>:80 <ip2>:80 <ip3>:80>
 ServerName mail
 ServerAlias mail.*
 ServerAlias ns1.*
 ServerAlias ns2.*
 ServerAlias ftp.*

 RewriteEngine On
 RewriteCond %{HTTP_HOST} ^(?:www\.)?(?:ftp|mail|ns1|ns2)\.(.*)$ [NC]
 RewriteRule ^(.*) http://%1$1 [L,R]

</VirtualHost>

<IfModule mod_ssl.c>
<VirtualHost  <ip1>:443 <ip2>:443 <ip3>:443>
 ServerName mail
 ServerAlias mail.*
 ServerAlias ns1.*
 ServerAlias ns2.*
 ServerAlias ftp.*

 RewriteEngine On
 RewriteCond %{HTTP_HOST} ^(?:www\.)?(?:ftp|mail|ns1|ns2)\.(.*)$ [NC]
 RewriteRule ^(.*) https://%1$1 [L,R]

</VirtualHost>
</IfModule>

Name this file subdomains_redirect.conf and place it in /etc/httpd/conf.d or /etc/apache2/conf.d.
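
Then check the syntax and reload Apache so the new vhost takes effect; the exact init script name depends on your distribution (httpd on CentOS/Red Hat, apache2 on Debian/Ubuntu):

apachectl configtest        # check the syntax of the new .conf file
/etc/init.d/httpd reload    # or: /etc/init.d/apache2 reload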

Download subdomains_redirect.conf (590 bytes)

Categories: Linux Tags: ,

Smarty: parse a string as a template

October 16th, 2009 Comments off

This is the best way to parse a string as a Smarty template. You need to follow only 3 steps!

First of all, we need to register a new template resource named "string".

$smarty->register_resource("string", array("string_get_template",
 "string_get_timestamp",
 "string_get_secure",
 "string_get_trusted"));

Then add the function implementations:

function string_get_template ($tpl_name, &$tpl_source, &$smarty_obj) {
    global $smartyStringTemplates;
    $tpl_source = $smartyStringTemplates[$tpl_name];
    return true;
}

function string_get_timestamp($tpl_name, &$tpl_timestamp, &$smarty_obj) {
    // do database call here to populate $tpl_timestamp.
    $tpl_timestamp = time();
    return true;
}

function string_get_secure($tpl_name, &$smarty_obj) {
    // assume all templates are secure
    return true;
}

function string_get_trusted($tpl_name, &$smarty_obj) {
    // not used for templates
}

And the last step: create an array of string templates and parse a string by passing its index in the array.

$smartyStringTemplates = array('{php}date(){/php}', '{$helloWorld}');

print $smarty->fetch("string:1"); // will fetch the string template '{$helloWorld}'

Categories: PHP Tags: ,

Perl: HTTP request from a specified IP

October 2nd, 2009 2 comments

For libwww-perl 5.834 and newer:

Use the local_address method of the LWP::UserAgent object:

my $ua = LWP::UserAgent->new( );
$ua->local_address( '192.168.0.1' );

For older versions.

Before creating an LWP::UserAgent object, add the following code:

push(@LWP::Protocol::http::EXTRA_SOCK_OPTS,
LocalHost => '192.168.0.1'
);

where 192.168.0.1 is the IP address of the desired outgoing interface.

Categories: Perl Tags: ,

Monitor changed files on Linux using find command and XML+XSLT

September 25th, 2009 3 comments

Once I decided to write my own monitor for updated files on my Linux server. I chose XML files as storage, and Bash scripts plus Cron as the monitor itself.

The Bash script findnewfiles.sh, which generates an XML file with the list of files changed during the last day, looks like this:

#!/bin/bash
echo '<?xml version="1.0" encoding="utf-8"?>'
echo "<?xml-stylesheet type='text/xsl' href='template.xsl'?>"
echo '<files>'
find /var/www/vhosts/ -mtime -1 -print | /var/www/newfiles/findfilter.pl
echo '</files>'

Additionally, I used a filter findfilter.pl to exclude log files, directories, etc.

#!/usr/local/bin/perl -w

use strict;
use warnings;

use POSIX qw(locale_h strftime);

while (my $filename = <>) {
chomp($filename);

if (length($filename) &&
    $filename !~ m/webstat(\-ssl)?/ &&
    $filename !~ m#/statistics/webstat(\-ssl)?/# &&
    $filename !~ m#/statistics/ftpstat/# &&
    $filename !~ m#/statistics/logs# &&
    $filename !~ m#/templates_c$# &&
    $filename !~ m#/statistics/(anon_)?ftpstat# &&
    ! (-d $filename)
   ) {
   use File::stat;
   my $sb = stat($filename);
   print "\t",'',"\n";
   print "\t\t", "",$filename,"\n";
   print "\t\t", "",strftime ("%a %b %e %H:%M:%S %Y", localtime $sb->mtime),"\n";
   print "\t\t", "",(getpwuid($sb->uid))[0],"\n";
   print "\t\t", "",(getgrgid($sb->gid))[0],"\n";
   print "\t\t", "",$sb->size,"\n";
   print "\t\t", "",sprintf("%04o",$sb->mode & 07777),"\n";

   print "\t",'',"\n";
}
}

1;

The Bash script startfindnewfiles.sh to be run from cron:

#!/bin/bash
cd /var/www/newfiles/
dd=`date "+%Y-%m-%d"`
./findnewfiles.sh | gzip > "$dd.xml.gz"
echo "http://yourserver.com/newfiles/?date=$dd"

Note: I used gzip compression to save disk space.
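
To generate the report every night, a crontab entry along these lines will do (the time is just an example; the path matches the scripts above):

# run the file monitor every night at 04:10; cron mails the echoed URL to the crontab owner
10 4 * * * /var/www/newfiles/startfindnewfiles.sh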

To format the output I used an XSL template. For better usability I added the jQuery plugin Tablesorter, which lets you sort the table data by clicking on a column header. XSL template source:

<html xsl:version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform" xmlns="http://www.w3.org/1999/xhtml">
<head>
<link rel="stylesheet" type="text/css" href="themes/style.css" media="screen"/>
<link rel="stylesheet" type="text/css" href="styles.css" media="screen"/>
</head>
<body>
<div align="center"><a href="index.php">back</a></div>
<table id="myTable">
<thead>
<tr>
<th>Filename</th>
<th>Modify time</th>
<th>Owner</th>
<th>Size</th>
<th>Rights</th>
<th>Group</th>
</tr>
</thead>
<tbody>
<xsl:for-each select="files/file">
<tr>
<td><xsl:value-of select="name"/></td>
<td nowrap="nowrap"><xsl:value-of select="mtime"/></td>
<td nowrap="nowrap"><xsl:value-of select="owner"/></td>
<td nowrap="nowrap"><xsl:value-of select="size"/></td>
<td nowrap="nowrap"><xsl:value-of select="mode"/></td>
<td nowrap="nowrap"><xsl:value-of select="group"/></td>
</tr>
</xsl:for-each>
</tbody>
</table>
<div align="center"><a href="index.php">back</a></div>
<script type="text/javascript" language="javascript" src="js/jquery.js" />
<script type="text/javascript" language="javascript" src="js/jquery.tablesorter.js" />
<script type="text/javascript">
<xsl:comment>
$(document).ready(function() {
$("#myTable").tablesorter({
sortList:[[1,1],[0,0]]
});
}
);
</xsl:comment>
</script>
</body>
</html>

And finally, the index.php script code:

<?php
header('Content-type: text/html; charset=utf-8');

ob_start();

if (!isset($_GET['date']) && !isset($argv[1])) {
 echo '<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0 Transitional//EN">
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=utf-8">
<title>Modified files</title>
<link rel="stylesheet" type="text/css" href="styles.css">
</head>
<body><div align="center">';
 if ($handle = opendir(getcwd()))  {
 $files = array();

 while (false !== ($file = readdir($handle))) {
 if (preg_match('/^\d{4}\-\d{2}\-\d{2}\.xml\.gz$/',$file)) {
 $files[] = $file;
 }
 }
 natcasesort($files);
 $files = array_reverse($files);

 foreach($files as $file) {
 $d = preg_replace('/\.xml\.gz$/','', $file);
 printf('<a href="?date=%s">%s</a><br>', $d, dateToSovok($d));
 }
 }
echo '</div></body></html>';
} else {
 $date = isset($_GET['date']) ? $_GET['date'] : $argv[1];

 if (!is_file($date.'.xml.gz')) {
 header('Location: index.php');
 die;
 }

 // Load the XML source
 $xml = new DOMDocument;
 $xml->loadXML(gzdecode(file_get_contents($date.'.xml.gz')));

 $xsl = new DOMDocument;
 $xsl->load('template.xsl');

 // Configure the transformer
 $proc = new XSLTProcessor;
 $proc->importStyleSheet($xsl); // attach the xsl rules

 $doc = $proc->transformToDoc($xml);
 echo $doc->saveHTML();
}

function dateToSovok($dt) {
 $pos1 = strpos($dt,'-');
 $pos2 = strrpos($dt,'-');

 $year = substr ($dt, 0, $pos1);
 $month = substr ($dt, $pos1 + 1, $pos2-$pos1-1);
 $day = substr ($dt, $pos2 + 1, strlen($dt));
 return ($day.".".$month.".".$year);
}
$content = ob_get_clean();

if(function_exists('gzencode') && ($encoding = checkCanGzip()) ) {
 header("Content-Encoding: ".$encoding);
 echo gzencode( $content . '<!-- gzencoded -->', 6 );
} else
 echo $content . '<!-- without compression -->';

/*  ------------------------------------------------------------ */

function checkCanGzip() {
 global $_SERVER;

 if (!isset($_SERVER['HTTP_ACCEPT_ENCODING'])) return 0;
 if (strpos($_SERVER['HTTP_ACCEPT_ENCODING'], 'x-gzip') !== false) return "x-gzip";
 if (strpos($_SERVER['HTTP_ACCEPT_ENCODING'],'gzip') !== false) return "gzip";
 return 0;
}

function gzdecode($data) {
 $len = strlen($data);
 if ($len < 18 || strcmp(substr($data,0,2),"\x1f\x8b")) {
 return null;  // Not GZIP format (See RFC 1952)
 }
 $method = ord(substr($data,2,1));  // Compression method
 $flags  = ord(substr($data,3,1));  // Flags
 if (($flags & 31) != $flags) {
 // Reserved bits are set -- NOT ALLOWED by RFC 1952
 return null;
 }
 // NOTE: $mtime may be negative (PHP integer limitations)
 $mtime = unpack("V", substr($data,4,4));
 $mtime = $mtime[1];
 $xfl   = substr($data,8,1);
 $os    = substr($data,9,1);  // OS byte is at offset 9 (XFL is at 8)
 $headerlen = 10;
 $extralen  = 0;
 $extra     = "";
 if ($flags & 4) {
 // 2-byte length prefixed EXTRA data in header
 if ($len - $headerlen - 2 < 8) {
 return false;    // Invalid format
 }
 $extralen = unpack("v",substr($data,10,2));  // XLEN starts right after the 10-byte fixed header
 $extralen = $extralen[1];
 if ($len - $headerlen - 2 - $extralen < 8) {
 return false;    // Invalid format
 }
 $extra = substr($data,12,$extralen);
 $headerlen += 2 + $extralen;
 }

 $filenamelen = 0;
 $filename = "";
 if ($flags & 8) {
 // C-style string file NAME data in header
 if ($len - $headerlen - 1 < 8) {
 return false;    // Invalid format
 }
 $filenamelen = strpos(substr($data,$headerlen),chr(0));
 if ($filenamelen === false || $len - $headerlen - $filenamelen - 1 < 8) {
 return false;    // Invalid format
 }
 $filename = substr($data,$headerlen,$filenamelen);
 $headerlen += $filenamelen + 1;
 }

 $commentlen = 0;
 $comment = "";
 if ($flags & 16) {
 // C-style string COMMENT data in header
 if ($len - $headerlen - 1 < 8) {
 return false;    // Invalid format
 }
 $commentlen = strpos(substr($data,$headerlen),chr(0));
 if ($commentlen === false || $len - $headerlen - $commentlen - 1 < 8) {
 return false;    // Invalid header format
 }
 $comment = substr($data,$headerlen,$commentlen);
 $headerlen += $commentlen + 1;
 }

 $headercrc = "";
 if ($flags & 1) {
 // 2-bytes (lowest order) of CRC32 on header present
 if ($len - $headerlen - 2 < 8) {
 return false;    // Invalid format
 }
 $calccrc = crc32(substr($data,0,$headerlen)) & 0xffff;
 $headercrc = unpack("v", substr($data,$headerlen,2));
 $headercrc = $headercrc[1];
 if ($headercrc != $calccrc) {
 return false;    // Bad header CRC
 }
 $headerlen += 2;
 }

 // GZIP FOOTER - These may be negative due to PHP's integer limitations
 $datacrc = unpack("V",substr($data,-8,4));
 $datacrc = $datacrc[1];
 $isize = unpack("V",substr($data,-4));
 $isize = $isize[1];

 // Perform the decompression:
 $bodylen = $len-$headerlen-8;
 if ($bodylen < 1) {
 // This should never happen - IMPLEMENTATION BUG!
 return null;
 }
 $body = substr($data,$headerlen,$bodylen);
 $data = "";
 if ($bodylen > 0) {
 switch ($method) {
 case 8:
 // Currently the only supported compression method:
 $data = gzinflate($body);
 break;
 default:
 // Unknown compression method
 return false;
 }
 } else {
 // I'm not sure if zero-byte body content is allowed.
 // Allow it for now...  Do nothing...
 }

 // Verify decompressed size and CRC32:
 // NOTE: This may fail with large data sizes depending on how
 //       PHP's integer limitations affect strlen() since $isize
 //       may be negative for large sizes.
 if ($isize != strlen($data) || crc32($data) != $datacrc) {
 // Bad format!  Length or CRC doesn't match!
 return false;
 }
 return $data;
}
?>

Download all the source code: Monitor changed files on Linux using find command and XML+XSLT (28 kb)

Categories: Linux Tags: ,

Don’t want to log in to Horde IMP twice?

September 15th, 2009 Comments off

I installed Horde with IMP, DIMP and other services on my server. Everything was fine until I tried to log in to IMP as a regular user: it asked for my login and password twice!

To log in directly to IMP (entering your credentials only once), do the following:

  1. Log in to Horde as Administrator
  2. Go to Administration -> Setup -> Horde -> Authentication
  3. Change the option $conf[auth][driver] to Let a Horde application handle authentication
  4. Change the option $conf[auth][params][app] to imp

Now you can log in to IMP directly, without logging in to Horde first, using the URL http[s]://<myip>/horde/imp/

Categories: Mail Tags:

Relevance search in MySQL

September 11th, 2009 2 comments

Suppose we have a query that is a long phrase, for example 4 words, and we want to find the most relevant rows in a table.

MySQL has a built-in operator for this, MATCH ... AGAINST, which performs a relevance search over the specified FULLTEXT indexes.

Our table:

CREATE TABLE `clients` (
`id` int(10) unsigned NOT NULL auto_increment,
`firm` varchar(150) NOT NULL default '',
`name_` varchar(30) NOT NULL default '',
`address` varchar(150) default NULL,
PRIMARY KEY  (`id`),
FULLTEXT KEY `firm` (`firm`,`name_`,`address`)
) ;

To find all rows whose `firm`, `name_` and `address` fields contain ‘lawyer Braun park avenue’, use this query (for MySQL 5.0):

SELECT * FROM `clients` WHERE MATCH ( `firm`,`name_`,`address` ) AGAINST ( 'lawyer Braun park avenue' )

Important: all fields you are searching through must belong to the same FULLTEXT index. You can create a separate FULLTEXT index for each type of search.

Important: if you want to search by a part of a word, you may be disappointed. Since MySQL indexes words from their beginning, partial matching is only available with a trailing wildcard * (and wildcards are enabled only in boolean mode). When you use boolean mode, you have to sort the results by relevance yourself with an ORDER BY clause:

SELECT *, MATCH ( `firm`,`name_`,`address` ) AGAINST ( 'lawy* Braun park avenu*' IN BOOLEAN MODE ) AS `score` FROM `clients` WHERE MATCH ( `firm`,`name_`,`address` ) AGAINST ( 'lawy* Braun park avenu*' IN BOOLEAN MODE ) ORDER BY `score` DESC

Categories: MySQL Tags:

Find all large files in Unix / Linux from the shell prompt

August 31st, 2009 Comments off

On Linux it is not always easy to measure all folders and subfolders and see what is eating your disk space.

You can find all large files (in this example, larger than 600,000 KB, starting from the root folder) from the shell prompt with this command:

find / -type f -size +600000k -exec ls -l {} \; | awk '{ print $9 ": " $5 }'
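
If you are more interested in which directories are large, a du-based variant (not from the original post) gives a quick per-directory overview:

du -sk /var/* | sort -rn | head -20   # 20 biggest entries under /var, sizes in KB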
Categories: Linux Tags: , ,

Command to list directories only

August 19th, 2009 Comments off

If you want to list only the directories in the current directory, use this command:

ls -l ./ | grep ^d | awk '{print $9}'

It can be useful as input for another application or command.
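
A shorter alternative (not from the original post) that relies only on shell globbing:

ls -d */   # the trailing slash makes the glob match directories only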

Categories: Linux Tags: ,

Plesk backup domains from command line (multibackups)

January 18th, 2009 Comments off

Once I needed to back up all 200 of my domains in Plesk, excluding 5-6 of them. There is no way to do this via the Plesk web interface (you can only schedule a backup for each(!) domain separately).

So, after searching for a long time, I came up with my own solution.

I wrote a simple bash script:

#!/bin/bash

maxbackups=2 # 3-1=2, numbering from zero
file=./counter

counter=`head -n 1 $file`
let counter=counter+1
if [ "$counter" -gt "$maxbackups" ]; then
    let counter=0
fi
echo $counter > $file

for domainname in `ls -l /var/www/vhosts/ | grep ^d | awk '{print $9}'`
do
/usr/local/bin/pleskbackup --domains-name $domainname --output-file=ftp://user:password@server/$domainname-$counter.xml.tar --exclude-domain-file=/root/pleskbackup/excludelist
done

New feature: you can now store more than one backup! Just create a file named "counter" with the contents "0". Enjoy!
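
To run the backup automatically, a crontab entry like this one can be used (backup.sh is a hypothetical name for the script above, saved in /root/pleskbackup/ next to the excludelist):

# nightly Plesk backup at 02:30; cd first because the script reads ./counter from the current directory
30 2 * * * cd /root/pleskbackup && ./backup.sh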

Categories: Linux Tags: , ,