Duplicate Page Content - 404s or 301s?
-
I deleted about 100 pages of stale content six months ago, and they currently return 404s. Crawl diagnostics have flagged 77 duplicate pages as a result. Should I 301-redirect these URLs to clear the errors, or keep them as 404s? Most of the pages still have some page authority, but I don't want to get penalized. Just looking for the best solution. Thanks!
-
Hi Braunna,
It sounds like you deleted pages, then remade them. It is great that you're keeping the site fresh, but for search engine purposes you should either have updated the existing pages with fresh content, or remade each page and then 301-redirected the old URL to the new one.
In general, avoid deleting pages or recreating the same page at a new URL unless something bigger than content is driving the decision, such as a new CMS (content management system: Joomla, WordPress, osCommerce, etc.), a switch in server-side scripting (PHP to ASP), or an overhaul of navigation and site architecture.
If a page was simply useless and you are removing it for good, then a 404 is correct. Any duplicate content issues you see in that case are likely because the search engines have not yet de-indexed the old pages or still have them in their cache. Google's URL removal tool can help you clear cached versions and force de-indexing.
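The distinction above — 301 when a relevant replacement exists, 404 when the page is gone for good — can be sketched as a tiny request handler. This is a minimal illustration, not anyone's actual server setup, and the URL map is invented:

```python
from http.server import BaseHTTPRequestHandler

# Hypothetical map of deleted URLs to their closest live replacements.
REDIRECTS = {
    "/old-widgets-guide": "/widgets-guide",
    "/2010-pricing": "/pricing",
}

def resolve(path):
    """Return (status, location) for a requested path."""
    if path in REDIRECTS:
        # A relevant replacement exists: pass authority along with a 301.
        return 301, REDIRECTS[path]
    # The page was removed outright: a 404 is the honest answer.
    return 404, None

class LegacyURLHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        status, location = resolve(self.path)
        self.send_response(status)
        if location:
            self.send_header("Location", location)
        self.end_headers()
```

Wiring the handler into an `HTTPServer` is omitted; the point is only that each old URL gets exactly one of the two outcomes, never a soft redirect to the homepage.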
I hope that hit the correct answer for you.
Happy New Year
-
I do not believe 404 pages will result in a penalty, and eventually they will be deindexed by search engines.
As SanketPatel said, 301 redirects are best used on a one-to-one basis, where the old page is closely related to the page it is being redirected to.
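A one-to-one mapping like this is usually maintained as a simple list of old/new pairs and then turned into server rules. Here is a rough sketch that emits Apache mod_alias rules from such a list; the URLs are invented, and whether you deploy via .htaccess or the main server config depends on your setup:

```python
# Hypothetical one-to-one mapping; each old URL points at its single
# most relevant live counterpart, never at a catch-all like the homepage.
PAIRS = [
    ("/stale/blue-widgets", "/widgets/blue"),
    ("/stale/red-widgets", "/widgets/red"),
]

def apache_rules(pairs):
    """Emit one RedirectPermanent rule per pair for an Apache config."""
    return ["RedirectPermanent {} {}".format(old, new) for old, new in pairs]

for rule in apache_rules(PAIRS):
    print(rule)
```

Keeping the pairs in one place makes it easy to audit later which deleted URLs were redirected and which were deliberately left as 404s.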
If some of those pages have great links pointing to them, I would first try to get those links updated to point at existing URLs. If that does not work, it may be worth creating a relevant new page and 301-redirecting each high-authority URL to it.
-
Hi Braunna,
Do you have related pages for those 100 deleted pages? A 301 redirect is the best solution in your case only when you redirect each old page to its most relevant counterpart, so that some of its authority transfers over.