Duplicate Content
-
Hi, I have a website with over 500 pages.
The website is a home service website that serves clients in different areas of the UK.
My question is: can I take the pages down from my URL and leave them down for, say, a week, so that when Google's bots crawl the pages, they no longer exist?
Can I then re-upload them to a different website URL without Google penalising me for duplicate content?
I know I would have lost link juice and PageRank, but that doesn't really matter, because the site has taken a knock since the Google update.
Thanks for your help.
Chris,
-
Just to second what Mike said - it's always tough to speak in generalities, but I can't think of any benefit to this approach. Typically, 301s are the preferred method for changing URLs. If you simply kill the old pages and introduce new ones with the same content, you not only risk some short-term duplicate content issues, but you also lose the inbound links and ranking signals pointing to those old URLs.
Are you concerned about transferring a penalty via 301s? I'm just not clear on what the goal is here.
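If you do go the 301 route, it pays to sanity-check the redirect map before flipping the switch. Here's a minimal sketch (the URLs are hypothetical, not from Chris's site) that flags old URLs with no mapping and mappings that chain through another old URL instead of pointing straight at the final destination:

```python
def validate_redirect_map(redirect_map, old_urls):
    """Sanity-check a 301 redirect map before a URL move.

    redirect_map: dict of old URL -> new URL
    old_urls: every URL on the old site that should be redirected
    Returns (missing, chains): old URLs with no mapping, and mappings
    whose target is itself another old URL (a redirect chain).
    """
    missing = [u for u in old_urls if u not in redirect_map]
    chains = [u for u, target in redirect_map.items() if target in redirect_map]
    return missing, chains

# Hypothetical example data:
redirects = {
    "https://old.example.co.uk/london-plumbing": "https://new.example.co.uk/plumbing/london",
    # This one chains through another old URL instead of the new site:
    "https://old.example.co.uk/leeds-plumbing": "https://old.example.co.uk/london-plumbing",
}
pages = [
    "https://old.example.co.uk/london-plumbing",
    "https://old.example.co.uk/leeds-plumbing",
    "https://old.example.co.uk/about",
]
missing, chains = validate_redirect_map(redirects, pages)
```

Redirect chains and unmapped pages are exactly where link signals leak during a move, so catching them up front is cheap insurance.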
-
In good conscience, I can't think of how that would be a good ongoing process to move forward with. What I would do is start an overhaul-and-merge scenario: identify the most valuable unique content within the site and start merging less valuable content pages into stronger ones. Also 301 any old pages to like items or the home page, and make sure 404s and 410s are in place as well.
With that many pages I know it can be a daunting task to think about, but if you truly care for the site, and it seems that you do, I would not take this approach, as it can lead to further damage down the road. I would be trying to make sure I do not get hit again by P & P.
In the end, if it seems like a spammy or fishy tactic, it probably is, or will be at some point.
Good luck, my friend
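The overhaul-and-merge advice above boils down to a mapping exercise: moved pages get a 301 to their new home, and merged-away pages with no close equivalent get a 410. A minimal sketch (hypothetical paths and domain, assuming an Apache host where mod_alias `Redirect` directives are available) that generates the .htaccess lines from those two lists:

```python
def build_htaccess_rules(moved, gone):
    """Emit Apache mod_alias lines for a content merge:
    301s for pages moved to a stronger equivalent, and
    410 Gone for pages removed with no close replacement.
    """
    lines = [f"Redirect 301 {old_path} {new_url}"
             for old_path, new_url in moved.items()]
    # Redirect with status 410 takes no target URL: the page is gone.
    lines += [f"Redirect 410 {path}" for path in gone]
    return "\n".join(lines)

# Hypothetical example:
rules = build_htaccess_rules(
    moved={"/services/london.html": "https://example.co.uk/services/"},
    gone=["/services/old-offer.html"],
)
print(rules)
```

With 500+ pages, generating the rules from a spreadsheet of decisions beats hand-editing the config, and makes the merge auditable.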
Related Questions
-
Help finding website content scraping
Hi, I need a tool to help me review sites that are plagiarising / directly copying content from my site. The tools I'm aware of, such as Copyscape, appear to work with individual URLs rather than a root domain. That's great if you have a particular post or page you want to check, but in this case, some sites are scraping thousands of product pages, so I need to submit the root domain rather than an individual URL. In some cases, other sites are being listed in SERPs above, or even instead of, our site for product search terms, but so far I have stumbled across this rather than proactively researched offending sites. So I want to enter my root domain and have the tool review all my internal site pages before providing information on other domains where an individual page has a certain amount of duplicated copy, working the same way as Moz crawls the site for internal duplicate pages. I need a list of duplicate content by domain and URL, externally, so that I can contact the offending sites to request they remove the content, and send it to Google as evidence if they don't. Any help would be gratefully appreciated. Terry
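The core of what such a tool does can be sketched with word-shingle comparison: split each page's text into overlapping word n-grams and measure the Jaccard overlap between two pages. A score near 1.0 means one page is a copy of the other. This is a minimal illustration of the technique, not a crawler (fetching your pages and the candidate domains is the part a real tool adds):

```python
def shingles(text, n=5):
    """Overlapping word n-grams ('shingles') used for near-duplicate detection."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a, b, n=5):
    """Jaccard overlap of two texts' shingle sets; ~1.0 means copied text."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# A verbatim scrape scores 1.0; unrelated text scores near 0.0.
copied = similarity("the quick brown fox jumps over the lazy dog",
                    "the quick brown fox jumps over the lazy dog")
```

Running this between each of your product pages and candidate pages on a suspect domain, then sorting by score, gives exactly the "duplicate content by domain and URL" list described above.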
White Hat / Black Hat SEO | MFCommunications
-
Duplicate content site not penalized
Was reviewing a site, www.adspecialtyproductscatalog.com, and noted that even though automated crawls find over 50,000 total issues, including 3,000 pages with duplicate titles and 6,000 with duplicate content, this site still ranks high for primary keywords. The same essay's worth of content is pasted at the bottom of every single page. What gives, Google?
White Hat / Black Hat SEO | KenSchaefer
-
Duplicate content warning: Same page but different urls???
Hi guys, a friend of mine has a site, and once I tested it with Moz I noticed there are 80 duplicate content warnings. For instance, page 1 is http://yourdigitalfile.com/signing-documents.html and the warning page is http://www.yourdigitalfile.com/signing-documents.html. Another example: page 1 is http://www.yourdigitalfile.com/ and the same second page is http://yourdigitalfile.com. I noticed that nearly every page on the website has another version at a different URL. Any ideas why the dev would do this? Also, the pages that have received the warnings are not redirected to the newer pages; you can go to either one. Thanks very much
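What's described here is the classic www vs non-www duplication: the same page resolving at two hostnames. The usual fix is a server-side 301 to one canonical host (plus a rel=canonical tag), not anything in page code. But as a sketch of how a crawler groups these variants, the URLs can be normalised to one host form so the duplicates compare equal (this example uses the URLs from the question and arbitrarily picks the www form as canonical):

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url):
    """Normalize a URL to one canonical form (here: force the www host,
    collapse an empty path to '/') so www/non-www variants compare equal."""
    parts = urlsplit(url)
    host = parts.netloc if parts.netloc.startswith("www.") else "www." + parts.netloc
    path = parts.path.rstrip("/") or "/"
    return urlunsplit((parts.scheme, host, path, parts.query, ""))

a = canonical_url("http://yourdigitalfile.com/signing-documents.html")
b = canonical_url("http://www.yourdigitalfile.com/signing-documents.html")
# a and b now compare equal: the two warning pages are one logical page.
```

This is why Moz reports 80 warnings: each page exists once per hostname, so the count roughly doubles the real page count.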
White Hat / Black Hat SEO | ydf
-
Wordpress Category Archives - Index - but will this cause duplication?
Okay, something I am struggling with using Yoast on a recipe blog: the category archives are being optimised and indexed, as I am adding custom content to them and then listing the recipes below. My question: if I index the category archives, adding custom content above and letting the recipe excerpts from the category be listed underneath, will those recipe excerpts be picked up as duplicate content?
White Hat / Black Hat SEO | Kelly3330
-
Duplicate keywords in URL?
Is there such a thing as keyword stuffing URLs? Such as a domain name of turtlesforsale.com having a directory called turtles-for-sale that houses all the pages on the site. Every page would start out with turtlesforsale.com/turtles-for-sale/. Good or bad idea? The owner is hoping to capitalize on the keywords of turtles for sale being in the URL twice and ranking better for that reason.
White Hat / Black Hat SEO | CFSSEO
-
Top authors for ecommerce content
Hello, what are some tips you'd recommend for someone looking to hire an expert to write, or consult on, a piece of content? It's as general a keyword as our niche has, and it's the only keyword actually inside the niche that has any decent level of backlinks. We're considering seeking out an expert in our field who knows more about the subject than our people do, even though our people are knowledgeable. We're trying to come from authority. What are your recommendations for the process of coming up with a great piece of content from a good authority?
White Hat / Black Hat SEO | BobGW
-
Content website of the year 2009 ....
I own a network of travel sites. After all the changes that have happened over the past 12 months or so, I am really wondering whether my sites are worthless. I mean, let's be honest here: I understand what Google is doing. So I ask myself, if I wasn't trying to make a living with Google AdSense and affiliate sites, would I still have these travel sites? Well, the truth is no. Therefore, should I forget about my content sites? It is a bunch of useless content; well, some interesting information, but it is a travel guide like many others online. What do you think? Is it now better to focus on your product site, or to create one good website, rather than a network of sites that worked very, very well for the past 10 years?
White Hat / Black Hat SEO | sandyallain
-
Article Re-posting / Duplication
Hi Mozzers! Quick question for you all. This is something I've been unsure of for a while: when a guest post you've written goes live on someone's blog, is it then okay to post the same article to your own blog, as well as Squidoo, for example? Would the search engines still see it as duplication if I have a link back to the original?
White Hat / Black Hat SEO | Webrevolve