Yeah, I think you are right. I am glad I checked the 301s in WMT just to make sure there are no issues with the 301 redirects.
Cheers, C
Mihai and SpyderTrap,
The 301 redirects were implemented in mid-May. Great idea about using Google Fetch to check the redirect. I checked and the redirects are OK (301).
I guess it's possible that Google finds the new URLs via our directory (with updated URLs) while it still has the old URLs in its index - hence the "apparent" duplicate content.
Thanks for the input,
Christian
We had to 301 redirect a large number of URLs. Now Google WMT is telling me that we have tons of duplicate page titles. When I looked into the specific URLs, I realized that Google is listing an old URL and its 301-redirected new URL as the sources of the duplicate content.
I confirmed the 301 redirects by using a server header tool to check the correct implementation of the 301 redirect from each old URL to the new one.
Question: Why is Google Webmaster Tools reporting duplicate content for these pages?
Seems like Google does interpret POST:
"Googlebot may now perform POST requests when we believe it’s safe and appropriate."
http://googlewebmastercentral.blogspot.com/2011/11/get-post-and-safely-surfacing-more-of.html
Thanks for the input. I will check around to see if Google really does not interpret "POST".
I am an SEO for a people search site. To avoid potential duplicate content issues for common people searches such as "John Smith", we display the main "John Smith" result above the fold and add the "other John Smith" search results inside an iframe. This way search engines don't see the same "other John Smith" results on all the other "John Smith" profile pages on our site and conclude that we have lots of duplicate content.
We want to get away from using an iframe to solve this potential duplicate content problem.
Question:
Can we display this duplicate "John Smith" content using a delayed AJAX call and block the directory that contains the AJAX endpoint in robots.txt?