How to restrict IP access to an Apache directory using an .htaccess file?

If a path contains sensitive data that you don't want to be public, you can restrict it so that it is only accessible from your own IP address, using an .htaccess file.

Create an .htaccess file in the path you would like to protect.
Add this code, replacing 1.2.3.4 with your own IP address:

<Limit GET POST>
order deny,allow
deny from all
allow from 1.2.3.4
</Limit>
<Limit PUT DELETE>
order deny,allow
deny from all
</Limit>

Other variations on this are possible; Google should have many guides on it.
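
As one example of a variation: on Apache 2.4 and later, the Order/Deny/Allow directives are deprecated in favor of the Require syntax from mod_authz_core. A rough equivalent, again using 1.2.3.4 as the placeholder IP and assuming the server's AllowOverride settings permit it, would be:

<Limit GET POST>
Require ip 1.2.3.4
</Limit>
<Limit PUT DELETE>
Require all denied
</Limit>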

Redirect domain.com to www.domain.com

If you want to force clients to use www.domain.com, then you can redirect them from domain.com to the www version with an .htaccess file.

In your public_html folder, create a file called

.htaccess

and add the code:

RewriteEngine On
RewriteBase /
RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]

Other versions of this rule use a negation check to see if the host is not www.domain.com, but that does not work if you have subdomains, hence the explicit check for the value we don't want.
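
For comparison, the negation variant looks roughly like this; note that it would also redirect sub.domain.com to www.sub.domain.com, which is why the explicit check above is used instead:

RewriteEngine On
RewriteBase /
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]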

Search engine crawlers are increasing my system load

A search engine like Google needs to crawl your website to determine what to index, so if your website has a lot of data, this can cause a high load on your system when the crawl is done in a short amount of time.

By creating a robots.txt file in your public_html folder, you can instruct these crawlers to slow down.

A sample robots.txt might look like this:

User-agent: *
Crawl-delay: 300

This tells any crawler that honors the Crawl-delay directive to wait 300 seconds between requests.

Without it, a crawler might make multiple requests per second, thus increasing your system load.
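
If the load is coming from one particular crawler, you can also target it by name instead of (or alongside) the wildcard entry. A sketch, using Bingbot as an example user-agent; adjust it to whichever bot appears in your access logs:

User-agent: Bingbot
Crawl-delay: 300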

I need awstats to rebuild the static html pages for previous months

Assuming the data for the required months exists here:

/home/username/domains/domain.com/awstats/.data

you should be able to regenerate your static HTML pages for those months. The script below can do it for you:

#!/bin/sh
# Rebuild static awstats HTML pages for a given month/year, for every user and domain.
if [ "$#" -eq 0 ]; then
    echo "Usage:";
    echo "    $0 <MM> <YY>";
    exit 1;
fi

month=$1
short_year=$2
full_year=20${short_year}

for u in `ls /usr/local/directadmin/data/users`; do
{
    for d in `cat /usr/local/directadmin/data/users/$u/domains.list`; do
    {
        echo "";
        echo "$u: $d: $month $full_year";

        # Skip the domain if no awstats data file exists for the requested month.
        DATA=/home/$u/domains/$d/awstats/.data/awstats${month}${full_year}.${d}.txt
        if [ ! -s "$DATA" ]; then
            echo "Cannot find $DATA for $month $full_year. Skipping.";
            continue;
        fi

        # Rebuild the static pages from the existing data file.
        /usr/bin/perl /usr/local/awstats/tools/awstats_buildstaticpages.pl \
            -config=$d -configdir=/home/$u/domains/$d/awstats/.data -update \
            -diricons=icon -awstatsprog=/usr/local/awstats/cgi-bin/awstats.pl \
            -dir=/home/$u/domains/$d/awstats -builddate=${short_year}${month} \
            -year=$full_year -month=$month

        echo "";
    }
    done;
}
done;
exit 0;

Save this to a script, say old_awstats.sh, and set its permissions to 755.
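
For example, assuming you saved it as old_awstats.sh in the current directory:

chmod 755 old_awstats.sh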

Run it for each month, for example, April (month 04) 2014:

./old_awstats.sh 04 14
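
If you need to rebuild several months, you can wrap the script in a simple loop; a sketch for January through April 2014, assuming the same script name:

for m in 01 02 03 04; do
    ./old_awstats.sh $m 14
done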
