Duplicate Page Content - 404s or 301s?
-
I deleted about 100 pages of stale content 6 months ago and they are currently returning 404s. The crawl diagnostics have flagged 77 duplicate pages because of this. Should I 301 redirect these to get rid of the error, or keep them as 404s? Most of the pages still have some page authority, but I don't want to get penalized. Just looking for the best solution. Thanks!
-
Hi Braunna,
It sounds like you deleted pages, then remade them. It's great that you're keeping up with the freshness of the site, but for search engine purposes you should have either updated the existing page with fresh content, or remade the page and then 301'd the old URL to the new one.
In general, avoid deleting pages or recreating the same page at a new URL unless something bigger than content is driving the decision, such as a new CMS (content management system: Joomla, WordPress, osCommerce, etc.), switching server-side scripting languages (PHP to ASP), or overhauling your navigation and site architecture.
If the page was simply useless and you are removing it completely, then a 404 is correct. If you are still seeing duplicate content issues in that case, it is likely because the search engines have not yet de-indexed the old page, or it is still in their cache. Google's URL removal tool in Webmaster Tools can help you remove cached versions and force de-indexing.
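To make the decision above concrete (301 where a relevant replacement exists, leave as 404 where nothing replaces the page), here is a minimal Python sketch that turns a hypothetical old-to-new URL map into Apache-style redirect rules. The URLs and the mapping are invented for illustration; they are not from this thread.

```python
# Map deleted URLs to their most relevant live replacement.
# None means no good replacement exists, so the URL should keep returning 404.
redirect_map = {
    "/old-widgets-guide": "/widgets-guide",
    "/2017-sale-page": None,          # obsolete promo: let it 404
    "/blue-widget-specs": "/widgets/blue",
}

def build_rules(mapping):
    """Return Apache .htaccess 'Redirect 301' lines for URLs that have a replacement."""
    rules = []
    for old, new in sorted(mapping.items()):
        if new is not None:
            rules.append(f"Redirect 301 {old} {new}")
    return rules

for rule in build_rules(redirect_map):
    print(rule)
```

The point of keeping `None` entries explicit is that a 404 (or 410) is a deliberate choice, not an oversight: only pages with a genuinely relevant destination get a redirect rule.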
I hope that answered your question.
Happy New Year
-
I do not believe 404 pages will result in a penalty, and eventually they will be deindexed by search engines.
Like SanketPatel said, 301 redirects are best used on a one-to-one basis, where the old page is closely related to the page it is redirected to.
If some of those pages have great links pointing to them, I would first make an effort to get those links changed to point at your existing URLs. If that doesn't work, it might be worth creating a relevant new page to 301 redirect a high-authority old page to.
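One way to triage 100 deleted URLs along the lines suggested above is to redirect only the ones that still carry authority and inbound links, and leave the rest as 404s. This is a hedged sketch: the threshold, field names, and example data are assumptions for illustration, not anything prescribed in the thread.

```python
from dataclasses import dataclass

@dataclass
class DeletedPage:
    url: str
    page_authority: int   # e.g. Moz PA, 0-100
    linking_domains: int  # external domains still linking to the URL

def triage(pages, pa_threshold=30):
    """Split deleted pages into 301 candidates and pages to leave as 404."""
    redirect, leave_404 = [], []
    for p in pages:
        if p.page_authority >= pa_threshold and p.linking_domains > 0:
            redirect.append(p.url)
        else:
            leave_404.append(p.url)
    return redirect, leave_404

pages = [
    DeletedPage("/old-guide", 42, 5),
    DeletedPage("/stale-news-2011", 12, 0),
]
redirect, leave = triage(pages)
print(redirect)  # ['/old-guide']
```

The threshold of 30 is arbitrary; the useful part is auditing the list once instead of blanket-redirecting everything, which avoids the one-to-many redirects warned against above.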
-
Hi Braunna,
Do you have related pages for those 100 pages? A 301 redirect is the best solution in your case only if you redirect those pages to the most relevant live pages, so that some authority gets transferred to them.