We're looking to rank for competitor brand terms and have created competitor brand pages for some of our main competitors. My question is where would be most effective to place these pages on our site?
Also, would this be classed as grey hat?
I know that e-commerce sites usually have SSL certificates on their payment pages. A site I've come across uses the https:// prefix on every page of the site.
I'm just wondering if this makes any difference to the site in the eyes of search engines, and whether it could affect the site's rankings?
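One practical wrinkle with sitewide HTTPS is that if the same pages also respond over plain HTTP, search engines can see two versions of every URL. A minimal sketch of the usual fix, assuming an Apache server with mod_rewrite enabled (the directives are standard Apache; whether this site uses Apache is an assumption):

```apache
# Redirect every plain-HTTP request to its HTTPS equivalent with a 301,
# so search engines consolidate ranking signals onto a single protocol.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```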
We're building a new website platform and are using Ajax as the method for letting users select filters. We want to dynamically insert elements into the URL as the filters are selected, so that search engines will index multiple combinations of filters. We're struggling to see how this is possible using the Symfony framework. We've used www.gizmodo.com as an example of how to achieve SEO- and user-friendly URLs, but that only demonstrates the approach for static content.
We would prefer to go down a route that didn't involve hashbangs if possible.
Does anyone have any experience using hashbangs and how it affected their site?
Any advice on the above would be gratefully received.
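For what it's worth, the hashbang-free approach described above can be sketched with the HTML5 History API. This is an illustrative sketch, not Symfony-specific: `buildFilterPath` and the `/cars` base path are hypothetical names, and the server side would still need a Symfony route with wildcard parameters that renders the same filtered page for each such path.

```typescript
// Sketch: build a crawlable, hashbang-free URL path from selected filters.
type Filters = Record<string, string>;

function buildFilterPath(base: string, filters: Filters): string {
  // Sort keys so the same filter combination always yields one URL,
  // avoiding duplicate variants like /colour/red/doors/4 vs /doors/4/colour/red.
  const segments = Object.keys(filters)
    .sort()
    .map((key) => `${encodeURIComponent(key)}/${encodeURIComponent(filters[key])}`);
  return segments.length > 0 ? `${base}/${segments.join("/")}` : base;
}

// In the browser, push the new path without a full page reload, so the
// Ajax filtering keeps working while search engines see a real URL:
//   history.pushState(filters, "", buildFilterPath("/cars", filters));
```

Sorting the filter keys is the important design choice: it guarantees one canonical URL per filter combination, which limits the duplicate-content surface that indexing every combination would otherwise create.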
Thanks for the feedback, Matt.
We aren't aiming to outrank our competitors, just to appear on page 1 for those who have taken a significant amount of market share over the past few months, with the intention of capturing some of their traffic.
However, you do raise some valid points regarding the loss of customer trust, which we will definitely take on board and discuss.
I've been wondering for a while now, how Google treats internal duplicate content within classified sites.
It's quite a big issue, with customers creating their ads twice; I'd guess to avoid the cost of renewing, or perhaps to put themselves back at the top of the results. Out of 10,000 pages crawled and tested, 250 (2.5%) were duplicate adverts.
Similarly, the site structure allows the same advert(s) to appear in the search results pages under several unique URLs. A prime example would be this page: we have already filtered down to one result, yet each of the left-hand filters returns that same single advert under a different URL.
Tools like Siteliner and Moz Analytics flag these as urgent, high-priority issues, but I've always been sceptical.
On a large scale, would this count as Panda food in your opinion, or does Google understand the nature of classifieds is different, and treat it as such?
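The standard mitigation for the filtered-URL case is a canonical tag. A minimal sketch, assuming the filtered listing pages can emit their own head markup (the URL below is hypothetical):

```html
<!-- On every filtered URL that resolves to the same advert(s), point the
     canonical at the unfiltered listing page so engines consolidate them. -->
<link rel="canonical" href="https://www.example.com/category/listings/" />
```

The duplicate-advert case is harder, since the duplicates are separate user-created pages rather than alternate URLs for one page.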
Appreciate thoughts.
Thanks.