Too many 301s?
-
Hi there, if a website has accidentally generated, say, 1,000 pages of duplicate content, would its SEO be hurt if all of those pages were redirected to the original source of the content?
There are no plans to rewrite the 1,000 duplicate pages; they are already cached and indexed by Google.
I thought about canonical tags, but since the pages have some traffic and a little SEO value, I thought a 301 redirect to the relevant pages would be more appropriate.
Am I also right in thinking you could remove the 301s from the .htaccess file once the index has updated?
And once the 301s are removed, could I reuse those URLs from scratch later if I wanted to?
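To be concrete, this is roughly what I had in mind for the .htaccess file (just a sketch with made-up paths, assuming Apache with mod_alias enabled):
# Hypothetical example: send each duplicate URL to the original source page
Redirect 301 /articles/duplicate-copy-1/ https://www.example.com/articles/original/
Redirect 301 /articles/duplicate-copy-2/ https://www.example.com/articles/original/
# Or, if the duplicates share a URL pattern, one rule can cover all 1,000
RedirectMatch 301 ^/duplicates/.* https://www.example.com/articles/original/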
Any info much appreciated.
-
Great insight, Highland!!!
-
If they have links, I would 301 the pages with links. Everything else I would 404.
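A rough .htaccess sketch of that split (hypothetical paths; assumes Apache with mod_alias enabled):
# Keep a 301 only for the duplicates that have earned inbound links
Redirect 301 /dup-with-links/ https://www.example.com/original-page/
# Explicitly 404 the rest so they drop out of the index
Redirect 404 /dup-without-links/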
-
How are these pages generating traffic? Are they being found in the search engines?
The real question: do these pages have links pointing to them?
There is little value in a 301 redirect if you are not moving link traffic in the direction you are pointing. If you are outranking the original content, then perhaps a 301 could help. How well does the original content rank?
-
Ha, yes you can, my friend.
-
But you can do it, yes?
-
Bringing back URLs that you didn't want, and then deciding that you do want them, is pretty annoying to Google...
-
I would see if they have links and get rid of the rest. It may look to Bing like you are trying to be tricky. It's not natural.
-
OK, I probably won't, but in what instance would you not recommend this?
I understand PA and PR etc. will be back to nothing, but it's the keyword URL I might want to use from scratch.
-
Yes, but I wouldn't really recommend this.
-
Also, last one: if I wanted to revive the 301'd URLs in a year, say, would that be allowed, and would the pages index again?
-
Thanks, Highland.
-
I would 301 the pages and get them out of your site's index. Even if you canonical all of them, Google will still have to crawl 1,000 pages instead of 1. The 301 will transfer most of your rank to the new page, and you'll improve your crawl budget.
Why take the 301s out? Just leave them in there in case there are links pointing to them.
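For comparison, the canonical route can even be done from .htaccess as an HTTP header instead of an in-page tag (a rough sketch, assuming Apache with mod_headers enabled and a hypothetical filename), but Google still has to fetch every duplicate to see it:
<Files "duplicate-copy.html">
  # Send rel=canonical as a Link header; Google treats it like the in-page tag
  Header set Link "<https://www.example.com/original-page/>; rel=\"canonical\""
</Files>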
-
Well, they seem to be generating traffic.
In principle, is what I intend to do OK? Will it harm the SEO, or be seen as fine, do you know?
Many thanks,
-
That sounds weird! If you generated 1000s of pages automatically, and these are all duplicate content, why don't you remove them? Google will end up removing them from its cache as well after a short period!