Welcome to the Q&A Forum
Browse the forum for helpful insights and fresh discussions about all things SEO.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will remain viewable), we have locked both new posts and new replies.
Yes, it is important to update your old links. Check out this post:
http://www.seomoz.org/blog/15-tips-to-speed-up-your-website
and read the part that says "1. Minimize redirects"
Unfortunately, robots.txt won't prevent your site from being indexed if there is a link from an external site pointing to yours. What you need to do is add `<meta name="robots" content="noindex,nofollow">` to the `<head>` of all your development pages. I don't know how big your site is, so this may or may not be a lot of work. Do this, and after the next Google crawl your pages will be dropped from the SERPs.
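A minimal sketch of what the `<head>` of each development page would contain (the title is just a placeholder):

```html
<head>
  <title>Development page</title>
  <!-- Tells compliant crawlers not to index this page or follow its links -->
  <meta name="robots" content="noindex,nofollow">
</head>
```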
Looking at your HTTP request and response headers, the root of your mothership site 301 redirects to www.crisisprevention.com/home. When I looked at the headers of your UK site, www.crisisprevention.co.uk/specialties, the HTTP response status is "200 OK", which means it's not redirecting to your US site.
Perhaps what you need to do is redirect these international pages on a page-to-page basis, so www.crisisprevention.co.uk/specialties would redirect to www.crisisprevention.com/specialties, and so on. That should pass the page juice successfully to the mothership site.
This may be a lot of work, depending on the number of pages that you have. The alternative is a wildcard 301 redirect.
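The wildcard version could be done with Apache's mod_rewrite (a sketch, assuming the .co.uk site runs on Apache; adjust for your server):

```apache
# .htaccess at the root of www.crisisprevention.co.uk

RewriteEngine On

# Wildcard: 301 every .co.uk URL to the same path on the .com site,
# which handles the page-to-page mapping in a single rule
RewriteCond %{HTTP_HOST} ^www\.crisisprevention\.co\.uk$ [NC]
RewriteRule ^(.*)$ http://www.crisisprevention.com/$1 [R=301,L]
```

Note that this only passes juice cleanly if the .com site actually has a matching page at each path; any .co.uk paths without a .com equivalent would need their own rule above the wildcard.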
Hi Jake,
After looking at your page, I have the feeling that what you want to do is eliminate any possibility of duplicate content, am I correct? It seems like every product on your site has its own unique URL anyway. You would want each of these pages crawled and indexed, so I wouldn't use noindex,nofollow on them. Only use noindex,nofollow on pages that you DON'T want to see in the search engine results.
Use the rel="canonical" on the pages that could be reached through multiple URLs.
On your page: http://www.1800doorbell.com/wireless-plug-in-door-chimes.htm?option=71,
I would use the rel="canonical", because these URLs
http://www.1800doorbell.com/wireless-plug-in-door-chimes.htm?option=1
http://www.1800doorbell.com/wireless-plug-in-door-chimes.htm?option=21
http://www.1800doorbell.com/wireless-plug-in-door-chimes.htm?option=31
http://www.1800doorbell.com/wireless-plug-in-door-chimes.htm?option=41
...etc., all take you to the exact same page.
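A sketch of the canonical link element that would go in the `<head>` of each of these variants (assuming the unparameterized URL is the version you want indexed):

```html
<!-- In the <head> of every ?option=... variant of the page -->
<link rel="canonical" href="http://www.1800doorbell.com/wireless-plug-in-door-chimes.htm" />
```

With this in place, the option-parameter URLs all consolidate their signals onto the one canonical URL instead of competing with each other as duplicates.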
Hope this helps!
It sounds like you are talking about "doorway pages". This practice can get their website penalized, or even de-indexed from Google’s search results.
You can send them to this link on Google Webmaster Central, which explains it all:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=2721311
Even if you use a Disallow directive in robots.txt, your pages can still get indexed if the crawler gets there via an external link to your page. The simple solution is to add `<meta name="robots" content="noindex">` to the `<head>` of all pages you want excluded from the SERPs.
Optimize for the visitor first, then the search engine. Make sure the titles and the headers read naturally. Everything that you put on the page should be relevant to the content of the page - that includes titles, headings, pictures, etc. If anything on your site sounds spammy or irrelevant, dump it and start over.
Be like Google and focus on providing the best user experience possible. The Google philosophy: "Focus on the user and all else will follow."
Hi iivgi,
If it's marked as duplicate content, it can definitely affect your rankings. You can add a Disallow directive in robots.txt to block crawling of those pages. The problem with that, though, is that a page can still be indexed if there is an external link to it. The way to prevent that is to add `<meta name="robots" content="noindex">` in the `<head>` section of the pages you don't want indexed. The pages may still get crawled, but they shouldn't show up in the SERPs.
Hi Letty,
That is also a great idea. I've already sent the client my recommendations. If they insist on doing it their way, then I will definitely suggest they split test the two pages, and go with the winner. Thanks!
Hi Brad,
I agree with you that it would be a poor decision to change the focus of the page. I would rather improve an existing page that ranks well despite poor on-page optimization than start over with entirely new and untested content. I wouldn't want to turn off regular visitors by giving them an entirely different user experience. I also don't want to be blamed for the drop in traffic if they go ahead and do this!
Thanks to both of you for your feedback!