Although on-site SEO essentials are universal, it makes sense to pay special attention to specific items and related indicators depending on the type of website you are managing, because some issues – page accessibility to robots, low-quality content… – can reach particularly large proportions on certain website structures and have a huge impact on organic traffic.
E-commerce websites often face several types of issues related to their size, their navigation structure, and their typical organic traffic pattern: they tend to get either mainly long-tail traffic (a large product catalog, with a low number of visits spread over many product pages) or mainly middle-tail traffic (high brand recognition, significant direct traffic, and most organic traffic landing on top category pages through more generic, competitive search queries).
Typical issues include:
A significant portion of products is not explored by Google
This generally offers significant leverage: with a large product catalog, chances are only a portion of the products are known to search engines. Either the website already generates most of its traffic on product pages, and getting more products crawled will have a direct, mechanical effect on this long-tail traffic; or it doesn’t, and this is an important source of potential incremental traffic.
The goal is to make sure that Google explores all products.
What can be done:
Encourage product crawling
- Minimize website depth, as deeper pages are crawled less: for instance, reduce pagination by adding navigation subcategories and increasing the number of items per page. The arithmetic is straightforward: a 10,000-product category paginated 20 items at a time creates a 500-page list, while 100 items per page reduces it to 100 pages.
- Optimize internal “link juice” flow within the website to make sure all products receive more than just a couple of links. Typically, some products receive a single link, often from a long paginated list. Add complementary navigation criteria so a product can be listed in several lists, and add product-to-product links between similar products (a sketch for detecting weakly linked products follows this list).
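For instance, here is a minimal sketch in Python that flags products receiving a single internal link; the crawl export file name, its “source,target” format, and the product URL pattern are all assumptions to adapt:

```python
import csv
from collections import Counter

inbound = Counter()

# Assumed input: a crawl export of internal links as "source,target" rows.
with open("internal_links.csv", newline="") as f:
    for source, target in csv.reader(f):
        if "/product/" in target:  # adjust to your product URL pattern
            inbound[target] += 1

single_link = [url for url, n in inbound.items() if n == 1]
print(f"{len(single_link)} product pages receive exactly one internal link")
```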
Allow your Web server to deliver more content to Google within the same time frame or “crawl budget”: deliver content faster, and avoid re-delivering content that has not changed since the search engine last explored it:
- Optimize performance. This is very specific to each website.
- Implement HTTP 304 (Not Modified) status codes in response to requests that include an If-Modified-Since header. This will allow a search engine crawler to get a fast response, as no content is delivered, for product pages which didn’t change since the last exploration (see the sketch below).
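As an illustration, here is a minimal sketch of this mechanism using Python and Flask, with a hypothetical in-memory catalog; note that the server must also send a Last-Modified header so the crawler has a date to send back in If-Modified-Since:

```python
from datetime import datetime, timezone
from flask import Flask, request, make_response

app = Flask(__name__)

# Hypothetical catalog: product id -> (rendered HTML, last modification time).
CATALOG = {
    "123": ("<html>...product 123...</html>",
            datetime(2014, 6, 1, tzinfo=timezone.utc)),
}

@app.route("/product/<product_id>")
def product_page(product_id):
    html, last_modified = CATALOG[product_id]

    # Werkzeug parses the If-Modified-Since request header into a datetime.
    ims = request.if_modified_since
    if ims is not None and last_modified <= ims:
        # Nothing changed since the last crawl: an empty 304 response
        # is fast to produce and cheap to transfer.
        return "", 304

    response = make_response(html)
    response.last_modified = last_modified  # sets the Last-Modified header
    return response
```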
Make sure Google will explore a strategic subset of products:
- Pay special attention to these products’ depth and number of incoming links.
- Implement XML sitemaps for these strategic products (see the sketch below). Caveat: if the number of products in sitemaps is much larger than Google’s crawl budget, then instead of encouraging a higher crawl ratio for these products, the sitemaps will most likely introduce some unpredictable rotation in Google’s index.
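A minimal sketch of such a sitemap generation in Python, assuming the strategic subset has already been identified (URLs and dates are illustrative):

```python
from xml.sax.saxutils import escape

# Assumed input: the strategic subset, as (URL, last modification date) pairs.
strategic_products = [
    ("https://www.example.com/product/123", "2014-06-01"),
    ("https://www.example.com/product/456", "2014-06-03"),
]

entries = "\n".join(
    "  <url>\n"
    f"    <loc>{escape(url)}</loc>\n"
    f"    <lastmod>{lastmod}</lastmod>\n"
    "  </url>"
    for url, lastmod in strategic_products
)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>"
)
with open("sitemap-strategic-products.xml", "w") as f:
    f.write(sitemap)
```

Keep in mind that the sitemap protocol caps each file at 50,000 URLs; a larger subset must be split across several files listed in a sitemap index.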
Near duplicates among products
There can be many products which are almost the same, apart from a few details (color for clothing, minor technical characteristics for high-tech products) that are not differentiators internet users are likely to include in search queries. The goal is to make sure product pages present products that are differentiated enough to respond to different queries, while avoiding the negative impact undifferentiated content has on quality criteria.
What can be done:
Implement a notion of “meta product”
A master product page which gathers the common characteristics, and will be better positioned than the near-duplicate product pages that compete with each other. This will most certainly be justified only for a subset of products, which needs to be identified.
Multi-faceted navigation implementation prevents middle-tail organic traffic
Navigation pages are targets for top- to middle-tail SEO traffic queries (for instance, “Nike children’s shoes”). It’s an issue if they are not accessible to robots, or if too many are accessible through crawlable filter combinations. The right balance must be found so that search engines see all navigation pages with potential for organic traffic, but are not swamped by additional pages that will waste search engine crawl budget and degrade global website quality indicators.
What can be done:
Make sure target pages for middle-tail traffic are technically accessible to search engine robots.
For instance, they may not be accessible because they are displayed dynamically using Ajax. This can be solved by using HTML links (<a href>) for selected navigation pages, or by implementing crawlable Ajax (use URLs with a hash fragment and deliver HTML snapshots to crawlers, as sketched below).
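Here is a minimal sketch of the hash-fragment scheme with Flask (the rendering helper is a stand-in): a page reached by users at /shoes#!brand=nike is requested by crawlers as /shoes?_escaped_fragment_=brand=nike, and the server answers that request with a static HTML snapshot:

```python
from flask import Flask, request

app = Flask(__name__)

def render_static_listing(fragment):
    # Stand-in for a real server-side renderer of the filtered listing.
    return f"<html><body>Listing for filters: {fragment}</body></html>"

@app.route("/shoes")
def shoes():
    fragment = request.args.get("_escaped_fragment_")
    if fragment is not None:
        # Crawler request: serve the full HTML snapshot.
        return render_static_listing(fragment)
    # Regular visitor: serve the Ajax page (placeholder here).
    return "<html><body><script>/* Ajax navigation */</script></body></html>"
```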
Avoid creating a large number of low-quality pages resulting from too many filter combinations:
These include very similar pages created by filters which are not significant differentiators, pages with a very small number of products or none at all, and pages with filter combinations that don’t make sense for the user (when all possible combinations are generated automatically). Best practice: allow only one filter at a time, or a low number of filter combinations hand-picked by product managers (see the sketch below).
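A minimal sketch of such a whitelist, with illustrative filter names, deciding whether a filtered listing page should be exposed to crawlers:

```python
# Hand-picked combinations of two or more filters (illustrative).
WHITELISTED_COMBINATIONS = {
    frozenset({"brand", "category"}),   # e.g. "Nike children's shoes"
    frozenset({"category", "gender"}),
}

def is_crawlable(active_filters):
    """active_filters: set of filter names applied to the listing page."""
    if len(active_filters) <= 1:
        return True  # single filters are always crawlable
    return frozenset(active_filters) in WHITELISTED_COMBINATIONS

print(is_crawlable({"brand"}))                    # True
print(is_crawlable({"brand", "category"}))        # True (whitelisted)
print(is_crawlable({"brand", "color", "price"}))  # False: block or noindex
```

Non-crawlable combinations can then be kept out of search engines’ reach, for instance via non-crawlable links, robots.txt rules, or a noindex directive.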
A variant of this issue can be caused by internal search pages linked from the website, with too many search criteria and, very often, duplicates due to similar queries containing the same words in a different order (a normalization sketch follows).
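A minimal sketch of such a normalization, mapping word-order variants to a single canonical URL, for instance for use in a rel=canonical tag (the URL structure is illustrative):

```python
from urllib.parse import quote_plus

def canonical_search_url(query):
    # Lowercase and sort the words so that word-order variants collapse
    # to a single canonical search URL.
    words = sorted(query.lower().split())
    return "/search?q=" + quote_plus(" ".join(words))

# Both orderings produce the same canonical URL.
print(canonical_search_url("shoes nike children"))  # /search?q=children+nike+shoes
print(canonical_search_url("nike children shoes"))  # /search?q=children+nike+shoes
```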
These are just a few of the issues that can affect e-commerce websites and are far less common on other types of websites.