Archive for the ‘Performance Optimization’ Category

New Website for Activo

November 23rd, 2010

I have good news and bad news for you. Well, it is not really bad news, but I just wanted to use that saying…

The bad news is that this website has just become obsolete, meaning we will no longer publish anything new or introduce any new content here. The good news is that we already have a new website, live at our new domain. Notice that we got the domain name we always wanted for Activo – Yeah!


For those of you who are interested, the new website was launched in late August of 2010, and the domain was purchased back in November of 2009. The new site already gets twice the traffic that this website does. It also reflects a few changes at Activo:

  1. We moved our offices from Santa Clara to Los Angeles – and we love it down here. In fact, LA is a great hub for techies and entrepreneurs, believe it or not.
  2. We now have a virtual team of developers, designers, and project managers. Yes, it was a decision we had to make back before we moved, and it has turned out to work great. Our customers love it.
  3. We have decided to focus on Magento Development and Magento Extensions. I have to say, what a great decision that was!

So, check out the new website and our new blog. I (Ron Peled) will continue to post regularly about our daily grind and share with you as much as possible from what I am doing at any given time. Drop me a line if you have any suggestions or recommendations.

.NET Framework, AJAX, Content Management Systems, eCommerce, Ektron, Joomla, LAMP: Linux Apache MySQL PHP, Magento, Performance Optimization, PHP/MySQL, Project Management, Search Engine Optimization (SEO), Web Application Hosting, Web Design, Web Development, Web-based User Interfaces, ZenCart

Speed Optimized Websites Rank Higher with Search Engines

July 16th, 2009

Website performance should not be taken lightly. When I say website performance optimization, in general I mean the time it takes a webpage to fully render in the browser. Many different factors influence that, including the number of files that make up your page, the size of those files, whether the page renders in standards mode or quirks mode, etc. But for search engines, all that matters is the raw HTML output of your site. One of the ways search engines measure a site's worth is by how fast it serves that HTML. Yes, raw web server power. Why?

Search engines try to guess which websites deserve more respect than others, and one characteristic they look at is speed. If you think about it, the speed at which a page is served reflects how much the owner invested in it, and so it indirectly suggests the ranking the site should get. In other words, a site served from a dedicated server with serious horsepower should rank higher than a site served from the cheapest shared hosting plan. In addition, the major search engines have researched user return rates and found that returns are higher for faster sites – even milliseconds count. That is why the best search engines focus on speedier results and favor faster websites. Really?


Look at the graph above and you will see a direct correlation between a website's speed and the number of indexed pages. There may be a delay, and it is not 100% accurate because speed is not the only factor here, but over time it clearly has an effect. These graphs come from Google Webmaster Tools, under the crawler stats. OK, so how do you increase the performance of your site?

Here are a few things to consider:

  • Invest in a good hosting package. If you are serious, get at least a VPS with your own IP address (a dedicated IP is also a factor). A VPS or a dedicated server will always beat shared hosting over time. Note that some shared hosting environments pack 500+ websites onto the same piece of hardware.
  • If you use PHP, make sure to use APC (the Alternative PHP Cache).
  • Always turn on caching at all levels: Apache, PHP, and your application. Each level usually has some sort of caching mechanism – use it!
  • Research your biggest bottleneck and tackle it, then repeat over time. Just like SEO, it is always a work in progress.
  • Look in the logs: every time your server hits an error or a warning it has to trigger the error-handling mechanism, which in most environments requires additional resources – especially unhandled exceptions in ASP.NET/IIS7 environments.
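To make the log-scanning habit concrete, here is a tiny sketch that counts errors and warnings in an Apache-style error log. The log path and sample entries are made up for illustration; point it at your real error log instead:

```shell
# Write a small sample log so the sketch is self-contained;
# in practice you would point at e.g. /var/log/httpd/error_log.
log=./sample_error_log
cat > "$log" <<'EOF'
[Mon Jul 13 10:00:01 2009] [error] [client 1.2.3.4] File does not exist: /var/www/html/favicon.ico
[Mon Jul 13 10:00:05 2009] [warn] child process still did not exit
[Mon Jul 13 10:00:09 2009] [notice] caught SIGTERM, shutting down
EOF

# Count the noisy entries worth investigating first.
errors=$(grep -c '\[error\]' "$log")
warnings=$(grep -c '\[warn\]' "$log")
echo "errors=$errors warnings=$warnings"
```

Anything above zero here is server time spent on error handling instead of serving pages.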

The list is really long and can get very technical, but in general you always want to keep website performance optimization in the back of your mind. It is well worth it!

What is your experience with speed-optimized websites? How did it affect your SEO results?

Performance Optimization, Web Application Hosting, Web Development

Manage Application Pool Recycling in IIS7

May 21st, 2009

If you manage a website hosted on the latest Windows Server 2008 with IIS7, you should be aware of the Application Pool settings in general, and the Application Pool recycle settings in particular. As it turns out, by default Windows Server 2008 sets the Application Pool to recycle every 1740 minutes – exactly 29 hours, or one full day and 5 hours, or the number of lattes I had this winter. All kidding aside, this number is a bit arbitrary, especially because it determines when the website's application pool will recycle and the site will need to recompile, re-cache, etc. Here is a screenshot:


Instead, I recommend that you uncheck the 'Regular time intervals' checkbox and use the 'Specific times' option. I chose 2:00 AM here because that is when the site sees the lowest number of hits, making it the best time to handle a recycle. You should set up your web server to recycle when your site experiences its lowest traffic, so you will probably need to dig into your analytics a bit. Here is a screenshot of how I set up my server:


- Recycles during off-peak hours
- You control exactly when it recycles
- Typically yields a performance boost

The application pool will now recycle every 24 hours instead of every 29. In fact, if you are certain that your website has no major problems and no memory leaks, you can potentially set the application pool to never recycle automatically at all. That state needs monitoring, but it may result in a longer, smoother ride. Enjoy!
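For those who prefer the command line to the IIS Manager dialog, the same two changes can be sketched with appcmd. This assumes a pool named "DefaultAppPool" – substitute your own pool name:

```bat
:: Disable the default 1740-minute regular interval (00:00:00 turns the timer off):
%windir%\system32\inetsrv\appcmd set apppool "DefaultAppPool" ^
    /recycling.periodicRestart.time:00:00:00

:: Schedule a recycle at a specific time of day instead (2:00 AM here):
%windir%\system32\inetsrv\appcmd set apppool "DefaultAppPool" ^
    /+recycling.periodicRestart.schedule.[value:'02:00:00']
```

The schedule entry corresponds exactly to the 'Specific times' checkbox in the GUI, so you can script this across servers.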

.NET Framework, Performance Optimization, Web Application Hosting

pdnsd – Decrease DNS response time and save bandwidth

December 25th, 2008

Sometimes, when you realize you could have improved a system with very little effort, you blush. That is what happened to me when I realized that most of my networking delays could have been avoided with one tiny but witty utility. I knew that a local caching DNS server or the like was the answer, but I did not want to run a full-fledged DNS server. Then I found pdnsd – a small proxy DNS server with permanent caching. Perfect!

In a nutshell, pdnsd is a small utility that caches DNS lookups locally on disk, so the next time the server queries the same address the response time is minimal. Normally, the server has to query your ISP's DNS or whatever DNS server you specified in /etc/resolv.conf, and on a busy web server those queries compete with everything else for network resources, so a local cache is a great advantage. By installing pdnsd you achieve the following:

  • Decrease the average DNS response time sharply!
  • Increase your server's performance, especially if the server needs to communicate externally a lot – like an eCommerce server that constantly talks to shipping and credit card gateways.
  • Save on bandwidth.

Here is how you go about setting up pdnsd on a CentOS server:

1. Download the latest stable rpm:
Go to the pdnsd download page and look for the rpm that matches your system. For CentOS 5.2 64-bit, the latest version at the time of writing was pdnsd 1.2.7:


2. Install the rpm:

rpm -i pdnsd-1.2.7-par_sl5.x86_64.rpm

3. Configure pdnsd to use your current DNS servers:

vi /etc/pdnsd.conf

Paste the following; of course, use your own DNS servers' IPs in place of the placeholders:

server {
    ip = <your primary DNS IP>, <your secondary DNS IP>;
}

4. Start pdnsd and test that it is actually working:

service pdnsd start
dig example.com @127.0.0.1

If you get the IP back, it is working. Notice the response time; if you try again you will see a sharp decrease. On my servers the second response time is almost always between 0 and 1 ms.

5. Set pdnsd to start automatically on boot

vi /etc/default/pdnsd

Enter the following and save:


Also make sure the daemon is set to auto-start on boot. I use 'ntsysv'; you can use chkconfig or whatever you are used to.
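The chkconfig equivalent of ticking pdnsd in ntsysv looks like this (run as root on the server itself):

```shell
# Enable pdnsd in the default runlevels so it starts on boot:
chkconfig pdnsd on
# Verify which runlevels it will start in:
chkconfig --list pdnsd
```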

6. Set your server to use pdnsd instead of your DNS servers

vi /etc/resolv.conf

Make sure that the first nameserver line points at 127.0.0.1, the local pdnsd instance.
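As a quick sanity check, here is a small sketch that reports the first nameserver entry in a resolv.conf-style file. It works on a sample copy so it does not touch the real /etc/resolv.conf; 127.0.0.1 stands for the local pdnsd instance (its default listen address):

```shell
# Sample resolv.conf contents; in practice read /etc/resolv.conf directly.
conf=./resolv.conf.sample
cat > "$conf" <<'EOF'
nameserver 127.0.0.1
nameserver 192.0.2.1
EOF

# The resolver tries nameserver lines in order, so the first one must be pdnsd.
first=$(awk '/^nameserver/ { print $2; exit }' "$conf")
echo "first nameserver: $first"
```

If the first entry is anything other than 127.0.0.1, your queries are bypassing the cache.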


7. Restart your network service:

service network restart

How do you know it is working? Try any script that needs to go out to the network, like 'yum update'. In most cases you will notice that the second run is much faster. Enjoy!

LAMP: Linux Apache MySQL PHP, Performance Optimization, Web Application Hosting, Web Development

Understanding MySQL Query Caching Process

November 5th, 2008

These days, websites are expected to perform – no excuses. I have also mentioned before that website performance is key for SEO. One easy win is turning on MySQL query caching: most servers ship with MySQL in its default configuration, which has the query cache turned off. There are many resources out there if you search enough, but I found a presentation by Baron Schwartz, MySQL Query Cache, which explains the advanced concepts of MySQL caching in detail. Here is the slide that summarizes the MySQL query caching process:
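For reference, turning the query cache on is a my.cnf change; the sizes below are illustrative starting points, not recommendations for every workload. You can confirm the settings afterwards with `SHOW VARIABLES LIKE 'query_cache%';` from the mysql client:

```ini
# my.cnf sketch: enable the MySQL query cache (tune sizes for your workload)
[mysqld]
query_cache_type  = 1      # 0 = off, 1 = on, 2 = on demand (SQL_CACHE hint)
query_cache_size  = 64M    # total memory reserved for cached result sets
query_cache_limit = 1M     # skip caching result sets larger than this
```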

LAMP: Linux Apache MySQL PHP, Performance Optimization, PHP/MySQL

3 Pitfalls to Avoid for a Faster Ektron CMS400 Website

March 17th, 2008

Server performance is one of the most important attributes of a website today. Users expect an immediate response when clicking around your site; even a 3.5-second delay may send them somewhere else. Search engine crawlers (like Google's) will also rank you lower as a result of high latency. So fast website delivery is not just good practice, it is a necessity. Recently we assisted a client with the performance of a site running Ektron CMS400 v7.0.4. Here are the three main issues we noted after testing and tracking down the causes of the delays using the Trace facility in .NET:

1. Avoid XML/XSLT Transformations for Control Output

After researching the cause of a huge latency – greater than 2 seconds on every page refresh – we discovered that about 50% of it occurred during the Page_Load event. More thorough research revealed that this delay happened during the XSLT transformations of all the controls on the page.
By caching these controls (a partial solution, and not one we recommend) and changing the way controls are rendered onto the page, we were able to reduce this latency to less than half. Therefore, we recommend building your Ektron site with the basic Ektron controls; if you need a special way to present the information, gather the data through the Ektron API and generate the display programmatically in the code-behind. In other words, avoid XSLT altogether.

2. Make Use of the Flex Menu Ektron Control

Most of the Ektron sites we have had the chance to work on were structured similarly: the main menu is a set of multi-level menus, all rendered by a style-specific XSLT. In some cases, before running the XSLT, a script would walk the menu items to find the one that needed to be 'selected'.
Why reinvent the wheel?
If you read Ektron's documentation, you will find a few menu controls that can be very handy: DHTMLMenu, Menu, SmartMenu, and FlexMenu. Each has its advantages and disadvantages. In short:

  • DHTMLMenu: My least favorite. It uses too much JavaScript and doesn't render nicely for SEO.
  • Menu: The simplest one to use for basic menu systems.
  • SmartMenu: I like this menu because it renders as a styled, nested unordered list. It also supports Section 508 and highlights the selected menu item with a client-side script, which is much more performance-friendly.
  • FlexMenu: Our tests indicate this is the fastest menu control if you have a sophisticated XSLT. It seems Ektron provided it specifically as a flexible control for XML transformations.

We recommend using the SmartMenu; if you insist on using XSLT to display a menu, use the FlexMenu instead.

3. Make Use of the .NET Caching Mechanism

A simple thing for developers to set, isn't it? Well, you can't imagine how many sites we have seen without any caching beyond the default settings. There is so much more to cache; it is almost a crime not to make use of it in our technology-driven age.
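As a minimal illustration (not Ektron-specific), ASP.NET's declarative output caching can keep a rendered page in memory for a fixed window with a single page directive; the values here are just examples:

```aspx
<%-- Cache the rendered page for 10 minutes, one entry for all query strings --%>
<%@ OutputCache Duration="600" VaryByParam="None" %>
```

Duration is in seconds, and VaryByParam controls whether different query strings get separate cache entries – set it to a parameter name (or "*") for pages whose output depends on the URL.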

Ultimately, the list above covers just a few of the main performance issues we have found on many Ektron sites. These items alone can improve a site's performance by up to 50%, but the list is far from complete. Hardware, paging, deadlocks, the server environment, and even bandwidth all need to be reviewed to improve performance further.

Fast Surfing!

.NET Framework, Ektron, Performance Optimization