Reducing bandwidth usage and slow queries in blogs hosted on Bluehost.com


One of the biggest challenges I face daily, as far as my blog is concerned, is reducing the amount of bandwidth it uses and the number of slow queries it produces. The issue is all the more pressing because my blog is hosted on Bluehost servers. Their strict policy doesn't allow you to exceed a certain number of slow queries per day. As a consequence, once they find out that your blog is causing too much trouble for their server and the other blogs hosted on it (don't forget that, at the time I am writing, Bluehost hosts websites on shared servers), they will deactivate "your toy" right away! This happened to me some months ago and, believe me, I have been on a constant "quest for perfection" ever since, so as not to cause any issue for Bluehost or my co-tenants (the other blogs hosted on the same server as mine).

One of the first things I did was to reduce the amount of data visitors (and spiders) download while visiting my blog. I achieved an excellent result by manually installing XCache. Have a look at the picture below to see the results I managed to achieve in terms of kilobytes downloaded per day and time spent downloading a page.
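If you want to try a manual XCache install yourself, the fragment below is only a minimal sketch of what goes into php.ini once the module is compiled; the extension path and the cache sizes are assumptions that depend on your PHP build, not values taken from my server.

```ini
; php.ini sketch: loading XCache as an opcode cache.
; The path below is an assumption; use the directory reported
; by `php-config --extension-dir` on your own host.
zend_extension = /usr/local/lib/php/extensions/xcache.so

[xcache]
xcache.size  = 32M   ; shared memory reserved for cached opcodes
xcache.count = 1     ; number of cache slots (one per CPU is a common start)
xcache.ttl   = 3600  ; seconds before a cached script is checked again
```

After restarting the web server, `phpinfo()` should list an XCache section if the module loaded correctly.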

Right after that I decided to add to my .htaccess file a nice piece of code which tells browsers to cache certain items on the pages readers visit. I also used a couple of tricks to reduce the total weight of my blog, giving people without a fast Internet connection a chance to read it with ease. It is important to notice that in this "quest for perfection" I haven't used any plugins, which in many cases do nothing but increase the time the database needs to display your blog correctly and run its applications flawlessly, with the risk of generating a lot of slow queries. I understand that manually installing modules on your server and modifying or adding files can be risky but, how much are you willing to pay, in terms of risk and "hard labour", to make your blog run smoothly and without any issue? As far as I am concerned, I am willing to risk a lot, and I am also willing to experiment a lot.

The risks I take are always, always performed in a "controlled environment", so to speak. In fact, I have installed on my server a PHP script which automatically backs up my database and sends it to my e-mail account twice a day. I also have a copy of my blog installed locally on my computer, just in case I have to try out things which are a little too... daring. In this way I can play with my blog without being afraid of breaking things beyond remedy. By the way, the PHP application able to back up any blog's database is called Backup2Mail. It is quite easy to install and works flawlessly.
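The .htaccess caching trick I mention relies on Apache's mod_expires module. The snippet below is a sketch of that idea rather than my exact file, and the cache lifetimes are assumptions you should tune for your own site:

```apache
# .htaccess sketch: ask browsers to cache static items (requires mod_expires)
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png  "access plus 1 month"
  ExpiresByType image/gif  "access plus 1 month"
  ExpiresByType text/css   "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```

With headers like these, returning visitors re-download images and stylesheets only after they expire, which is where most of the bandwidth saving comes from.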

Anyway, back on track! Despite all these attempts to improve the way my blog used its server resources, I still continued to get a lot of slow queries. That's why I immediately started suspecting some poorly optimized plugin. I therefore uninstalled a stats plugin which had been giving me some problems, but without amazing results. At that time I was still getting around 10 slow queries a day, far from my goal of none, or at most a couple, per day. So, what else was left? I had cached my blog, cached its items, tweaked the .htaccess file, optimized my theme, and uninstalled a couple of plugins. What else was there to do? I was at a loss. I even hired a couple of guys to take a look at my slow queries, but their reply was that they weren't caused by my blog. Inspiration came from a little detail (don't forget the old saying... God is in the details). Analyzing my traffic, I saw some IP addresses accessing my blog on a regular basis and no, they weren't Googlebot. After Googling a little bit I understood what they were: the so-called bad bots, or spy bots, trying to find security holes in my blog to inject their venom!

The issue with them was that they tried to exploit my blog from many different IP addresses; blocking a thousand IPs by hand was out of the question! Again I Googled, this time for three days, and in the end I got the answer to my issues. The solution came from installing an excellent PHP application called Bot-Trap which, upon spotting any odd behaviour from a web spider, blocks it by automatically adding its IP address to my .htaccess file. If you want to get a good idea of the evil bots which have tried to access my blog, visit my personal blacklist page.
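The entries Bot-Trap writes boil down to ordinary Apache access-control rules. The block below is only an illustration of what such a blacklist looks like, using the Apache 2.2 syntax common at the time; the addresses are placeholders from documentation ranges, not real offenders from my list:

```apache
# .htaccess sketch: blocking misbehaving spiders by IP
# (the addresses below are placeholders, not actual bad bots)
order allow,deny
deny from 203.0.113.42
deny from 198.51.100.0/24
allow from all
```

Each blocked visitor gets a 403 Forbidden response, so the bad bots never reach WordPress or the database at all, which is exactly why the slow queries disappeared.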

I have to say that since then I get just a couple of slow queries a day, on average. Furthermore, and take this with a grain of salt, it looks like Google is sending me more visitors. Maybe having a lot of bad bots around my blog was not a sign of trust in Google's eyes. Whatever the reason, I have managed to achieve many results: I hardened my blog's security, got rid of evil bots, optimized my blog, and reduced the amount of server resources it uses per day. By the way, I also learned a lot of things, the same things which I have written here, for you. Enjoy!

