Posts made by JohnHillman
-
Keyword cannibalisation
We created a product blog page that was highly optimized for SEO based on a recommendation from a colleague. These are now our best-performing pages; however, they do not convert as well as the bona fide product pages. After further investigation we're concerned that we shouldn't have split our content across two pages: keyword cannibalisation. Is this correct, and should we 301 our product blog pages to the other, higher-converting pages?
-
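Not part of the original post, but a minimal sketch of how one might flag potential cannibalisation from ranking data before deciding on a 301. The `(keyword, url)` pairs below are hypothetical sample data, not taken from the poster's site:

```python
from collections import defaultdict

def find_cannibalised_keywords(rankings):
    """Group ranking URLs by keyword and flag keywords where more than
    one URL on the same site competes for the same query."""
    by_keyword = defaultdict(set)
    for keyword, url in rankings:
        by_keyword[keyword].add(url)
    return {kw: sorted(urls) for kw, urls in by_keyword.items() if len(urls) > 1}

# Hypothetical example: a blog page and a product page both ranking
# for the same query is the classic cannibalisation signature.
rankings = [
    ("espresso machine", "/blog/best-espresso-machines"),
    ("espresso machine", "/products/espresso-machine"),
    ("coffee grinder", "/products/coffee-grinder"),
]
print(find_cannibalised_keywords(rankings))
```

Keywords returned by this check are candidates for consolidation, whether by 301, canonical tag, or differentiating the pages' intent.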
301 redirects twice
We currently have some 301 redirects set up on our site; however, sometimes a page redirects twice before reaching its final location. Is it OK from an SEO perspective for a page to redirect twice, or should we concentrate on reducing each chain to a single hop?
-
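As a concrete illustration (not from the original post), a two-hop chain is usually fixed by pointing every source directly at its final destination. A minimal sketch, assuming the redirect rules can be exported as a simple old-URL-to-new-URL map:

```python
def collapse_redirects(redirects):
    """Rewrite a {source: target} redirect map so each source points
    straight to its final destination, eliminating multi-hop chains."""
    collapsed = {}
    for source in redirects:
        seen = {source}
        target = redirects[source]
        while target in redirects:       # follow the chain to its end
            if target in seen:           # guard against redirect loops
                raise ValueError(f"redirect loop involving {target}")
            seen.add(target)
            target = redirects[target]
        collapsed[source] = target
    return collapsed

# Hypothetical chain: /old-page -> /interim-page -> /final-page
chain = {"/old-page": "/interim-page", "/interim-page": "/final-page"}
print(collapse_redirects(chain))
```

After collapsing, both old URLs 301 directly to `/final-page` in a single hop.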
RE: Is use of javascript to simplify information architecture considered cloaking?
Does Googlebot follow pagination of search results? All our product pages are on the third tier, and their discovery would rely on Google following pagination if we cannot use our original approach to information architecture (i.e. using JavaScript to channel Googlebot to discover our tier-3 pages).
Thanks for your help!
-
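One way to see whether tier-3 pages remain reachable through pagination alone (an illustration, not from the original thread) is to compute each page's click depth over the internal link graph. The page names below are hypothetical:

```python
from collections import deque

def click_depths(links, start):
    """Breadth-first search over an internal link graph, returning the
    minimum number of clicks from `start` to each discoverable page."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for nxt in links.get(page, []):
            if nxt not in depths:
                depths[nxt] = depths[page] + 1
                queue.append(nxt)
    return depths

# Hypothetical site where some products are only linked from the
# second page of a paginated category listing.
links = {
    "/": ["/category"],
    "/category": ["/category?page=2", "/product-a"],
    "/category?page=2": ["/product-b"],
}
print(click_depths(links, "/"))
```

Pages that only appear deep in the result (or not at all) are the ones whose discovery depends entirely on the crawler following pagination links.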
RE: Is use of javascript to simplify information architecture considered cloaking?
Hi Ryan
We use a navigation bar in the header, which means there are a large number of on-page links and no clear way to determine our information architecture from our internal link structure; many pages at different levels of the hierarchy can be reached from every page on the site.
Is this an issue? Or will the URL structure be sufficient for the search engines to categorise our content? How can we help the search engines discover content at level 3 of our hierarchy if we insist on using a header navigation bar, which we believe gives a good user experience?
Thanks!!
-
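Whether the URL structure alone conveys the hierarchy can be checked mechanically. A sketch (the URLs are hypothetical, not from the original post) that derives each page's level from its path segments:

```python
from urllib.parse import urlparse

def hierarchy_level(url):
    """Infer a page's depth in the information architecture from the
    number of non-empty segments in its URL path (homepage = level 0)."""
    path = urlparse(url).path
    segments = [s for s in path.split("/") if s]
    return len(segments)

# Hypothetical URLs at different levels of the hierarchy.
for url in ["https://example.com/",
            "https://example.com/coffee/",
            "https://example.com/coffee/machines/deluxe-9000"]:
    print(url, hierarchy_level(url))
```

If the path depth matches the intended hierarchy level for every page, the URL structure is at least a consistent secondary signal alongside the flat header-navigation link graph.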
Is use of javascript to simplify information architecture considered cloaking?
We are considering using JavaScript to format URLs so as to simplify Googlebot's navigation through our site, while presenting a larger number of links to the user to ensure content is accessible and easy to reach from all parts of the site. In other words, the user will see all internal links, but the search engine will see only those links that form our information hierarchy.
We would therefore be showing the search engine different content from the user only in so far as the search engine would see a more hierarchical information architecture, by virtue of the fact that fewer links would be visible to it, with the aim of keeping our content well structured and discoverable.
Would this be considered cloaking by Google, and would we be penalised?
-
RE: My new site experienced a sudden drop in Google rankings
Various, but our coffee machine reviews dropped 37 places in a week.
-
RE: My new site experienced a sudden drop in Google rankings
No. We have focused purely on on-page work and content; no off-site activity at all as yet.
-
My new site experienced a sudden drop in Google rankings
We launched a new affiliate site at the beginning of April this year dealing in reviews, vouchers and price comparison for a niche product.
We had a pretty good start, with some of our target keywords ranking at #8, #10 and #7, but this week they have suddenly dropped off dramatically (by 36 or 42 places).
We haven't made any changes to the site apart from publishing new content pieces, such as news posts and reviews. Does anyone know why this would suddenly happen?
Thanks