Too many 301s?
-
Hi there, if a website has accidentally generated, say, 1,000 pages of duplicate content, would SEO be hurt if all those pages were redirected to the original source of the content?
There are no plans to rewrite the 1,000 duplicate pages; they are already cached and indexed by Google.
I thought about canonical tags, but as the pages have some traffic and a little SEO value, I thought a 301 redirect to the relevant pages would be more appropriate?
Am I also right in thinking you would be able to remove the 301s from the .htaccess file once the index has updated?
Also, once a 301 is removed, could I use those URLs again from scratch if I wanted?
Any info much appreciated.
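For reference, this is the kind of .htaccess rule I mean (paths and domain are hypothetical examples, not my actual URLs):

```apache
# Hypothetical paths: each duplicate page 301s to the original source
Redirect 301 /dupes/page-1.html https://www.example.com/original-article/
Redirect 301 /dupes/page-2.html https://www.example.com/original-article/

# Or, if the duplicates share a common pattern, one mod_rewrite rule:
RewriteEngine On
RewriteRule ^dupes/.*$ https://www.example.com/original-article/ [R=301,L]
```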
-
Great insight, Highland!!!
-
If they had links, I would 301 the pages with links. Everything else I would 404.
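A minimal .htaccess sketch of that approach (paths hypothetical). Everything without links can simply be deleted so it returns a 404, or you can send an explicit 410 Gone, which Google treats much like a 404:

```apache
# Hypothetical: 301 only the duplicates that have inbound links
Redirect 301 /dupe-with-links.html https://www.example.com/original-article/

# The rest can simply be deleted (they'll 404), or marked 410 Gone explicitly:
RedirectMatch 410 ^/dupes/.*$
```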
-
How are these pages generating traffic? Are they being found in the search engines?
The real question is: do these pages have links to them?
There is little value in a 301 redirect if you are not moving link traffic in the direction you are pointing. If you are outranking the original content, then perhaps a 301 could help. How well does the original content rank?
-
Ha, yes you can, my friend.
-
But you can do it, yes?
-
Bringing back URLs that you didn't want, and then later deciding that you do want them, is pretty annoying to Google...
-
I would see if they had links and get rid of the rest. It may look to Bing like you are trying to be tricky. It's not natural.
-
OK, I probably won't, but in what instance would you not recommend this?
I understand PA and PR etc. will be back to nothing, but it's the keyword URL I might want to use from scratch.
-
Yes but I wouldn't really recommend this.
-
Also, last one: if I wanted to revive the 301'd pages, say in a year, would I be allowed to, and would the pages index again?
-
Thanks, Highland.
-
I would 301 the pages and get them out of your site's index. Even if you canonical all of them, Google will still have to index 1,000 pages instead of one. The 301 will transfer most of your rank to the new page, and you'll improve your crawl budget.
Why take the 301s out? Just leave them in there in case there are links pointing to them.
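For anyone weighing the two options: the canonical alternative would be a tag like this in the head of each duplicate page (URL hypothetical), but as noted above, Google still has to crawl every duplicate to see it:

```html
<link rel="canonical" href="https://www.example.com/original-article/" />
```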
-
Well, they seem to be generating traffic.
In principle, is what I intend to do OK? Will it hurt the SEO, or be seen as fine, do you know?
Many thanks,
-
That sounds weird! If you generated thousands of pages automatically, and they are all duplicate content, why don't you just remove them? Google will end up removing them from its cache as well after a short period!