Archive for the ‘Search Engine Optimization (SEO)’ Category

New Website for Activo:

November 23rd, 2010

I have good news and bad news to tell you. Well, it is not really bad news, but I just wanted to use that saying…

The bad news is that this website has just become obsolete, meaning we will no longer publish anything new or introduce any new content on this website. The good news is that we already have a new website – and notice that we got the domain name that we always wanted to have for Activo. Yeah!


For those of you who are interested, the new website was launched in late August of 2010, and the domain was purchased back in November of 2009. The site already has twice the traffic of this website. The site also reflects a few changes at Activo:

  1. We moved our offices from Santa Clara to Los Angeles – and we love it down here. In fact, LA is a great hub for techies and entrepreneurs, believe it or not.
  2. We now have a virtual team of developers, designers, and project managers. Yes – it was a decision we had to make back before we moved, and it turned out to work great. Our customers love it.
  3. We have decided to focus on Magento Development and Magento Extensions. I have got to say, what a great decision that was!

So, check out the new website and our new blog. I (Ron Peled) will continue to post regularly about our daily grind and share with you as much as possible of what I am doing at any given time. Drop me a line if you have any suggestions or recommendations.

.NET Framework, AJAX, Content Management Systems, eCommerce, Ektron, Joomla, LAMP: Linux Apache MySQL PHP, Magento, Performance Optimization, PHP/MySQL, Project Management, Search Engine Optimization (SEO), Web Application Hosting, Web Design, Web Development, Web-based User Interfaces, ZenCart

Google is on a diet!

December 1st, 2008

It seems the search giant is going through some housekeeping: laying off 3,000 employees and shutting down over ten of its services. We know from our clients that the first thing to go in slow times is the advertisement dollars. I guess it is easy to set up and easy to stop – all it takes is the click of a button.

Well, we are seeing the result. Here is a short list of the services that Google just stopped offering:

1. Lively – used to be a virtual world, kind of like Second Life. No more.

2. Google Answers – similar to Yahoo Answers. No mas.

3. SearchMash – a non-Google-branded experimental search engine used for design purposes. Gone with the dinosaurs.

4. Page Creator – a simple way to create a simple website. Kaput.

5. GDrive – allowed online storage on Google servers. Negative.

6. Browser Sync – an extension for Firefox to sync your browser settings across computers. Belly up.

7. Hello – online photo sharing integrated with Picasa. Dead.

8. SMS – send an SMS to a phone from online. Departed.

9. Click to Call – was used inside Google Maps. Extinct.

10. Send to Phone extension – a Firefox extension. Vanished.

11. Related Links – relating links between sites. Defunct.

Now, this is not to say that Google is no longer the search giant. I suspect this is just a shift in focus, and we will eventually see many more useful tools and systems come out of Mountain View, CA. Google is still one of the biggest think tanks in the world – and did I mention the largest cloud computing network?

Search Engine Optimization (SEO), Web Application Hosting, Web Development, Web-based User Interfaces

How to Use Mod_Rewrite to Set a Canonical URL

October 31st, 2008

The importance of the canonical URL is well known in the SEO world. However, for most web developers and website owners it is something that is often overlooked. The theory, in short: search engines rank each page individually and typically penalize multiple pages with duplicate content. So, if a site does not have a mechanism that identifies a canonical URL (in other words, a single unique URL for the home page), search engines may evaluate multiple links that lead to the home page separately. As a result, your site may get penalized altogether, or simply suffer from a lower PageRank, because the rank is now shared among multiple pages. An example:

The home page of a site can be displayed with any one of the following URLs (using example.com as a placeholder):

  http://example.com/
  http://www.example.com/
  http://www.example.com/index.html

To avoid the above, simply use Apache's mod_rewrite and include the following code in an .htaccess file located at the root folder of the web server (replace example.com with your real domain name):
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=permanent,L]

Now, you’ll need to make sure that all the links that direct traffic to your home page use the chosen canonical URL.
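Along the same lines, the index-file variant of the home page can be collapsed into the root URL. This is a sketch only – example.com and the index file names are placeholders to adapt to your own setup:

```apacheconf
# Redirect /index.html or /index.php to the bare root URL,
# so the home page is reachable under a single canonical address.
RewriteEngine On
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.(html|php)[\ ?] [NC]
RewriteRule ^index\.(html|php)$ http://www.example.com/ [R=permanent,L]
```

Matching against THE_REQUEST (the raw request line) rather than the rewritten URL avoids a redirect loop when the server internally serves / from index.html.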

Search Engine Optimization (SEO), Web Development

How to Use the Footer to Improve SEO and Increase Traffic

October 25th, 2008

The footer is often overlooked by web designers and web developers alike, but it can be the hidden gem of any well-thought-out website, delivering an SEO advantage and bringing in additional traffic. Lately, it seems, the footer is setting a new trend in web design altogether. Let’s look at how we can combine beauty and brains in the innocent footer.

Can the Footer improve my SEO?

Yes. The footer is one of the elements of a good website that can boost the overall amount of traffic coming to your site. It is where you work on boosting the ranking of your collection of pages, and even your collection of sites. The top of every page is optimized for a specific subject, which is supported by the page itself; the bottom of all your pages is optimized for the wide range of subjects covered by your entire site.

In the recent past, the footer was where you placed the copyright statement and perhaps the links to the site map and the privacy policy. Later, websites started showing links to other important web pages or categories within the same site, following the “all pages should link to all pages within a site” thinking. Today, the footer is used as a basket of all the important keywords and key phrases.

All you have to do to use the footer for your site’s SEO is add a few lists (<ul>, <li>) with relevant keywords. Ideally, you have a page for each keyword and the list is actually a list of links to these pages. This optimized footer shows up on every page of your site – and voilà! You are keeping the web crawlers happy.
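For illustration only – the keywords and URLs below are made up, but the structure is the kind of keyword-rich footer list described above:

```html
<!-- SEO footer: each keyword links to a dedicated page on the same site -->
<div id="footer">
  <ul>
    <li><a href="/gourmet-coffee/">Gourmet Coffee</a></li>
    <li><a href="/coffee-beans/">Coffee Beans</a></li>
    <li><a href="/espresso-machines/">Espresso Machines</a></li>
  </ul>
</div>
```

Because it lives in the shared page template, this block repeats on every page of the site automatically.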

Can the Footer be Pretty?

Of course. While this is not a new concept, lately I see a trend where footers seem to receive more and more love from their designers and webmasters. Here are some screenshots of footers I thought looked good while delivering a nice SEO advantage:

Footer screenshots (images not shown), including The New York Times’s footer and Daily SEO’s footer.

The footer has been, and still is, evolving all the time. The latest trend seems to show that the footer is gaining brains and beauty. Most importantly, the footer is often overlooked, but in fact it is a great SEO tool!

Additional SEO and Design in Footers Resources:

  Footers In Modern Web Design: Creative Examples and Ideas

Search Engine Optimization (SEO), Web Development

5 Peculiar SEO Tips You Should Know About

October 13th, 2008

The following are tips we have learned to address in any SEO project, in addition to the usual tasks:

1. Investment: the Length of your Domain Registration

Google will always value quality websites and quality content. The length of time a domain is registered for is one more indicator to Google that a site is here to stay and that its owners have put some effort into it. This factor is mentioned in Google’s PageRank patent. In simple words: register your domain for at least 3-5 years for improved recognition from Google.

See more: Does the Length of a Domain Registration Affect Your Rank?

2. Performance: Fast Sites Rank Higher

Again, if you are serious about your site, you will make sure your users can load your pages fast. In some cases it takes some effort to improve a site’s performance, particularly with dynamic content such as a CMS or a shopping cart. Make sure to use the right software, apply updates (they usually contain performance improvements), and consider tightening up the server’s configuration. A good rule of thumb is serving pages in an average of one second or less; at half a second or less, you are ahead of the pack.

See more: Landing page load time now affects keywords’ Quality Scores

3. History: the Age of Sites Linking to Yours Matters

Having incoming links to your site is what you want. However, it will take time before you see any significant result. This is because Google measures the age of incoming links to your site and gives older links a higher weight. There is not much that can be done here, except to make sure that links to your main site remain in place indefinitely and to avoid short-term incoming links. Also, if you were considering disposing of an old site that links to your main site – reconsider, leave it as is, and let the links mature like wine.

Many factors are at play here: the age of the domains from which links point to pages on your site, the age of the links themselves, and the age of your own domain. In short, the older the links and domains, the higher the ranking influence overall.

See more: The Age of a Domain Name

4. Uniqueness: Canonical URL

Every site can decide which full URL it will use. Most sites add the ‘www’ in front of the registered domain, and some omit it. The dangerous territory is when sites are dynamically generated, like a database-backed CMS, and links are relative. So if one third party linked to your site with the ‘www’ and another linked to your site without it, the site will produce identical content pages with different URLs. From Google’s perspective, ‘http://example.com/’ and ‘http://www.example.com/’ are two different URLs. To solve this issue, set up a redirect to your preferred URL – your preferred URL is what is known as the canonical URL.

See more: SEO Advice: URL Canonicalization; Cleaning Up Canonical URLs With Redirects

5. Relevance: Google Uses Geolocation to Serve Local Sites

Once more, Google is simply trying to do its job: serve pages that are most relevant to you – the search user. Hence, it will retrieve the geolocation of a site and rank local sites higher for local users. The geolocation data is derived from the IP address. So, to get more traffic from Canadian users, for example, host your site in Canada.

Peculiar but simple, isn’t it?

Search Engine Optimization (SEO), Web Development

3 Steps to Increase Your Website’s Traffic with Popular Keywords

September 21st, 2008

These days it is all about SEO (Search Engine Optimization) and SEM (Search Engine Marketing). Especially now, with the financial and real estate markets in turmoil, businesses seek to conserve resources and perhaps try the alternative to online advertising: SEO with keyword targeting.

Keywords from a Twitter user

The following three steps will help you refine content based on a list of selected keywords:

1. List Targeted Keywords

Make a small list (5-15 keywords) of keywords that relate to your industry. Only you will know which keywords relate best to your business and services. What you want to remember is to list keywords that you assume your target audience will search for, not necessarily keywords that describe your services directly. Note that keywords can also be key phrases, meaning 2 or 3 keywords joined together into a phrase.

2. Refine the List of Targeted Keywords to Targeted and Popular Keywords

Use one of the following free services (or all of them) to refine your list:

These free tools give you a list of related keywords and key phrases with their relative popularity and lots of other statistics. For example, we provide services for clients who power their ecommerce sites with ZenCart. I typed ‘zencart’ into the Google AdWords Keyword Tool, and it showed that some of the most popular key phrases are ‘zencart hosting’ and ‘zencart templates’. As a result, pages that relate to ZenCart should have these key phrases in their text. Perhaps I will then separate the zencart list of keywords from the rest of the keywords, and so on.

3. Develop Content Based on Targeted Keywords

Now that we have a list of refined keywords, it is time to do something about it. Develop or refine your content around these keywords.

Of course, on each website there will be hits and misses. Keep exploring for new keywords on a regular basis, and make sure to keep tracking the results and any changes that occur as a result of your refined content.

Do you want to share your methods of achieving high levels of targeted traffic?

Search Engine Optimization (SEO), Web Development, ZenCart

SEO vs PPC: is SEO the preference these days?

September 7th, 2008

Since the days when Overture mastered PPC, even before AdWords was born, I have advocated for SEO (Search Engine Optimization) and overall SEM (Search Engine Marketing) over PPC (Pay Per Click advertisement). Recently, blogs, news, client requests, and Google Trends have shown ever-increasing attention toward SEO at the expense of attention to PPC. Site owners are starting to realize that PPC is not the only solution, and certainly not the best one.

There are many reasons that lead businesses and individuals to shift their resources and efforts from online advertisement, PPC in particular, to SEO. Some of the reasons that come to mind are: a slowing economy, advertisement saturation, lack of ROI, and perhaps the realization that SEO has superior value. The facts are obvious: more businesses look for SEO than ever before. Here is a recent comparison of SEO and PPC in Google Trends (from today, 9/7/08):

It is true that with paid online marketing such as AdWords or Panama it is fairly easy to see results fast. However, once you analyze the ROI, in almost any business and on almost any product nowadays, the data will tell you that you did OK – and nothing more than OK. In other words, you will get results, but a simple glance at all your options will reveal similar or, in some cases, better ROI from other venues like press releases, public relations, good old-fashioned marketing, or even… SEO. In fact, you might discover that if you try using a newsletter to promote your products you may get better ROI. This has occurred more than once with our clients: their customers enjoyed the personal attention and the ongoing discounts so much that we saw a continuous boost of 30-50% in sales on the day of the newsletter compared to the rest of the month. The bottom line: you must try other venues, not only PPC advertisement!

As for the negative part of PPC, I will cover it very briefly just because I do not enjoy discussing the negatives. Watch for click fraud! Avoid paying for syndicated advertisement – it almost never shows results! Ok, I am done.

So why SEO? I have managed websites where the owner consistently spent north of $100K per month on paid advertisement with a single PPC vendor. While it worked and the results were there, the ROI compared to other solutions was never great. In comparison, take one month’s spend away from PPC and put it toward SEO at least once a year, and you shall see greater results over time. The upside to PPC is that it is immediate: once you turn it on, you see hits. With SEO you have to give it time and nourish the process. Typical results show within 3-6 months, and nothing is guaranteed. These are the main reasons businesses shy away from it, but they shouldn’t. Remember, once you gain momentum in SEO it is very difficult to take it away.

Never forget that SEO is only one of many tools or approaches that you need for any website. Marketing a website requires a mix of efforts, one of them being SEO. Efficient website marketing includes SEM (Search Engine Marketing), press releases (with links), working on raising the number of links into the site, and more.


The fact is out: a trend of increased attention to SEO over PPC is on. The reason for this trend is not fully understood, but it can be attributed to the slowing economy, lack of ROI, or better awareness of SEO’s value. Regardless, many SEO projects have shown that SEO can deliver better ROI than PPC over time. Do you prefer SEO over PPC?

Search Engine Optimization (SEO), Web Development

Flash and Search Engine Optimization (SEO)

July 5th, 2008

Those of you who have worked with Activo on SEO projects know that we have always opposed Flash. At Activo we have always valued traffic over look & feel, which translated into avoiding Flash technology altogether. Well, no more! If it is true that Flash sites can now receive ‘equal’ treatment, then we will give Flash its place in our web development practices.

In recent days, both Adobe and Google issued press releases and blog articles explaining how Google’s crawler will be able to read inside Shockwave (.swf) files. This means that all text, menus, and content embedded in a Flash object file will now be readable by search engines. Adobe published the Shockwave standards so search engines will be able to read the format, and Google was one of the first to respond and announce that it knows how to read Shockwave contents. What a welcome change!

What this means is that we now have additional parameters to take into account, especially for websites that have decided not to use Flash as their main platform but instead offer a small portion of their home page in Flash (such as a banner or a rotating main message). Additionally, if this holds true and Google is able to read inside Shockwave (Flash) files, then we will start seeing more Flash-based sites coming up in the organic search results from Google and other search engines.


Search Engine Optimization (SEO), Web Development

5 Points for Assessing Link Exchange Requests

June 8th, 2008

Out of nowhere you receive a friendly email from a webmaster who claims they added a link to your website on theirs, and requests that you do the same on your website. They even include a link in the email showing the page with your website’s name, description, and the link to your website. This is great! Now, we are only being asked to add the same link to their website somewhere – should I do it?

Graph representing a network of links

The art of Search Engine Optimization (SEO) is understanding how it works and what raises your site’s ranking in the various search engines. Links that point to your site from other sites are a big area of SEO and are very important for raising the traffic levels of your sites. However, sometimes links can also hurt your positioning, especially if your site is already established and has a certain level of traffic that you do not want to sacrifice.

Let’s explore five ways to assess the value of the link that is now pointing to your site, and whether or not you should add a link back on your own website:

1. Website Relevancy

An incoming link from a related (content-wise) website is of higher value than a link from a non-related website. So, according to this principle, go ahead and visit the website from which the link is pointing to yours and assess its relevance to your content and to what makes your site tick. Notice that in some cases industry proximity will not be enough; only the specific sub-industry within the industry is what you are looking for. For example: your site focuses on gourmet coffee and the linking website is about coffee in general. While it is more relevant than a website about teas, a link from this site might hurt your existing gourmet coffee traffic; to keep the momentum and grow your traffic, you would want additional links only from gourmet-coffee-related websites. This is also because coffee-related websites are a dime a dozen, and it is important to stay away from the crowd and inside your little search-engine-optimized sub-industry.

The reason this is the first rule is that it makes it really easy to dump the idea of a link exchange. Once you realize the site is not in your industry or related to your content, you do not need to proceed. Save your time!

2. Website Ranking

Following our Search Engine Optimization logic, a link from a site with a generally low ranking might hurt your site more than a link from a medium- or high-ranking one. This principle follows the logic that search engines, Google in particular, will rank your site higher if the links pointing to your site are from established and higher-ranked sites. Hence, the basic question you want to answer is: does the home page of the site linking to mine have a higher PageRank value than my site’s home page? Notice that PageRank is a ranking system offered by Google; if you do not wish to rely on Google alone, you can simply run some relevant keyword searches in all three major search engines and see whether your site shows up before the linking website or not.

Once you know which one is higher, the action should be obvious: if the site linking to yours is of lower ranking, you should not proceed with the link exchange. It will never hurt you to have an incoming link from the lower-ranked website, but it may lower your ranking if you link back. Next.

3. Location and the position of the link

Ok. Now we have passed the first two tests and we want to look at the specifics. Where is the link located? Is it on every page – that would be the best! Is it on an easy-to-find page (great!)? Is it on a hidden page (bad!)? Unless it is on a hidden page, you may want to proceed – but in most cases it will be located on a links page with a whole lot more links on it, and you may wonder if this is of any good to you. This is where you need to investigate further:

If the links page has over 100 links and seems like something put together very abruptly, with no real way for a user to find your site in the list easily, you might want to abandon the link exchange. If the page has about 10-40 links, the sites are clearly labeled and given a description, and your site can easily be identified or is located near the top of the list, it might not be a bad idea to work with this website and webmaster.

4. Automated or manual link exchange request?

Some sites pay a third party to enhance their SEO, and the third party develops a little utility to bombard every email address they come across with the link request. You can spot this if the links page is full of unrelated links and is overpopulated, or if the email comes from a third party and you have a feeling it may be automated. It is true, sometimes there is no real way to identify this, but here is where you can add a human element: call or email the person back with questions, or perhaps just ask, ‘Where is your business located?’

In general, you should treat the manual requests that come in with a lot more respect. Perhaps even be ready to take them to the next level, outside the boundaries of the web. Automatic requests obviously need not waste any more of your time. Next.

5. Intuition

I had to add this, since I have seen a lot of link exchange requests. Always have your site’s best interest at the back of your mind. The bottom line is: will this link exchange bring more business through my site, yes or no? You should always put your site’s business interests and your site’s user experience first, before approaching it with SEO tweaks. Remember that search engines work for the same users you want to serve, so their algorithms will favor better UI in most cases. What this means to you is that if you feel this link exchange addressed all the above items but you still feel uncomfortable with it – don’t do it! Go and work on something else that will bring additional hits to your sites.


As search engine optimization gets analyzed more and more, and the value of an incoming link gets higher, you will receive many link exchange requests. Stay on top of the game and work with the link exchanges that best fit your website, while avoiding the ones that benefit the other side only. Remember, linking is only one tool in your SEO arsenal.

Search Engine Optimization (SEO)

10 Key Search Engine Optimization Items – Feeding the Spiders

March 10th, 2008

Most of today’s internet traffic originates with a search. It’s no secret that most of that search traffic comes from Google. Hence, when you start analyzing and tweaking your site to get more traffic, you target Google’s policies on improved rankings.

Google looks at the basic HTML page from a standards point of view. In other words, Google’s crawler/spider tries to identify key information that is labeled and described correctly. Search engine spiders appreciate the extra information and consider well-labeled and well-described information extra ‘tasty’. The following are key items to consider when building any website in today’s search-engine-centric world.

Spider Web

1. Search Engine Friendly URLs (SEFs)

Try to use a good folder and file naming structure. The folder should describe the category or section of the site, and the file name should summarize the subject of the page in a few words (typically 2-5). Today, many websites are database-driven, whether eCommerce or Content Management Systems (CMS); however, any respected system has a way to make sure that the URLs are self-describing and conform to the site’s content logic. Make use of the mod_rewrite module for Apache or .NET’s URL aliasing mechanism. Additionally, many content management systems like Ektron CMS400, Joomla CMS, SiteCore, and SiteFinity allow the content editor to define the page URL or set up an automatic rule to generate the page URL when published.
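As a rough sketch (the URL pattern and the index.php parameter names are hypothetical), an Apache mod_rewrite rule can map a self-describing URL onto a database-driven page:

```apacheconf
# Map /coffee/espresso-machines.html onto the CMS's internal query string,
# so visitors and spiders only ever see the descriptive URL.
RewriteEngine On
RewriteRule ^([a-z0-9-]+)/([a-z0-9-]+)\.html$ index.php?section=$1&page=$2 [L,QSA]
```

The internal URL never appears in links or search results; the folder name carries the section and the file name carries the page subject, exactly as described above.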

2. Page Titles – the <title> html tag

The second place that the crawler/spider will look for additional information about the page will be in the page title. Similar to the SEF, we recommend using a reverse hierarchy of the page with respect to its section and category. Something like the following would work great:
{Page Subject} - {Section} - {Site Name} - {Site Slogan}
If the Page Title matches the SEF – even better!

3. Meta Tags – the description and the keywords

Yes, it is true that search engines no longer rely solely on these hidden pieces of information. However, it is noticeable that sites with well-written meta tags have an advantage over sites that disregard these tags completely. The key points are a short list of keywords for the page (not for the site!). Typically, you should have 2-5 main keywords per web page; ideally, each web page has its own dedicated keywords. The description is a short summary of the web page; usually 2-3 sentences will suffice.
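A minimal example of the two tags in a page’s <head> (the content values are placeholders):

```html
<!-- page-specific keywords: a short, dedicated list for this page only -->
<meta name="keywords" content="gourmet coffee, coffee beans, espresso" />
<!-- description: a 2-3 sentence summary of this page -->
<meta name="description" content="Fresh-roasted gourmet coffee beans and
espresso supplies, with tasting notes for every roast." />
```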

4. Breadcrumbs – another enforcement of the subject in the web page

Breadcrumbs are the little list of words describing the path to the item. They are usually located under the header, at the top of the page. The repetition of the information is what really assists with search engine optimization. We typically recommend the reverse order of the title, something like:
{Home Page - optional} - {Section} - {Category} - {Page}
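In markup, such a breadcrumb trail might look like this (URLs and labels are placeholders), with each step linking back up the hierarchy:

```html
<!-- breadcrumb trail: Home > Section > Category > current page -->
<div id="breadcrumbs">
  <a href="/">Home</a> &raquo;
  <a href="/coffee/">Coffee</a> &raquo;
  <a href="/coffee/gourmet/">Gourmet</a> &raquo;
  Ethiopian Yirgacheffe
</div>
```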

5. Performance Optimization – allowing the spiders to crawl faster

One of the reasons why many of the top-ranking sites are simple informational pages or basic HTML pages is that they are served extremely quickly. Even in today’s world of large bandwidth pipes, many internet users still (believe it or not) use dial-up modems or DSL with very limited bandwidth. While it is true that bandwidth has improved in recent years, the pace of improvement has not come close to the pace at which pages grow in size due to graphics and special effects. Hence, performance is critical. Make sure to use tools like YSlow (a Firefox extension) and Performance Analyzer to measure and improve your site’s performance.

6. Valid, Semantic XHTML and CSS

Crawlers, unlike browsers, depend on the validity of the HTML that builds the page. While most browsers tolerate bad web page structure, spiders are known to penalize such defects. In addition to the validity of the web page, it is important to use semantic HTML. Semantic HTML is a way to label, tag, and as a result style web pages so that the markup describes the content, not the placement or styling characteristics of the content. In other words, if a specific side box contains information about manufacturers, label the div tag with class=”manufacturers” or id=”manufacturers” instead of ”second_right” or ”brown_box”, etc.
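Side by side, the difference looks like this (the class and id names follow the manufacturers example above):

```html
<!-- semantic: the label describes the content -->
<div id="manufacturers">
  <!-- list of manufacturers goes here -->
</div>

<!-- non-semantic: the label only describes placement and styling -->
<div class="second_right brown_box">
  <!-- list of manufacturers goes here -->
</div>
```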

7. Avoid Nested Tables, or Tables for Layout as a Rule of Thumb

Table layouts and nested tables create significant overhead in unnecessary HTML tags and clutter. The clutter makes it hard for spiders to differentiate between important and unimportant information. Additionally, with well-formatted XHTML and CSS it is fairly easy to bring important information to the top of the page, giving it higher weight when indexed.

8. CSS and Ordered Lists Menus Instead of JavaScript or Flash Based Menus

Yes, it is cool to have a flashy animated menu or a smooth-transitioning JavaScript-based menu, but the bottom line will be affected: spiders cannot read these menus. A good main menu shows on all pages in the same place; a Flash or JavaScript menu will not only fail to guide the spiders but also force them to skip important pages. From the spider’s point of view, if there is no link to a page, the page doesn’t exist.
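A crawler-friendly main menu is just an ordered or unordered list of plain links, styled entirely with CSS (the pages listed here are placeholders):

```html
<!-- plain-link menu: every entry is a real, crawlable href -->
<ul id="main-menu">
  <li><a href="/">Home</a></li>
  <li><a href="/products/">Products</a></li>
  <li><a href="/about/">About Us</a></li>
  <li><a href="/contact/">Contact</a></li>
</ul>
```

Hover effects and dropdowns can then be layered on with CSS alone, so the links stay readable to spiders.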

9. Bottom Links – the Best Place to Remind Spiders of Additional and Important Pages

In addition to the main menu links, the bottom of the page is a great location for links that were left out. Bottom links are a great example of using page length that is unimportant from a UI perspective but key for SEO. Simply add or repeat links to the important pages at the bottom of the home page, and even at the bottom of every page on the site. This is also the place where you can try different variations of the links:

<a href="mylink">Wholesale</a>
is very different from:
<a href="mylink">Wholesale Coffee for Restaurants and Grocery Stores</a>

10. Reduce the Size of JavaScript and CSS Files

There is really no need to have inline JavaScript or CSS cluttering web pages anymore. The free, open source, and commercial tools available allow any professional web developer to collect, consolidate, minify, and compress both JavaScript and CSS code. An additional benefit of extracting this code into separate files is that once the browser has read the files, they are cached for a while (each browser has its own algorithms and default cache lengths), which speeds up the site and uses less bandwidth. One improvement, double the fun!
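The end result is that each page references one consolidated, minified file per type instead of inline blocks – the file names here are placeholders:

```html
<!-- one cacheable request each for CSS and JavaScript -->
<link rel="stylesheet" type="text/css" href="/css/site.min.css" />
<script type="text/javascript" src="/js/site.min.js"></script>
```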

So now, go back to your drawing board, and spice it up for the spiders!

Search Engine Optimization (SEO)