Duplicate Content
-
Hi, I have a website with over 500 pages.
It is a home services website that serves clients in different areas of the UK.
My question is: can I take the pages down from my current site and leave them down for, say, a week, so that when Google's bots crawl those URLs they no longer exist?
Could I then re-upload them to a different website URL without Google penalising me for duplicate content?
I know I would have lost link juice and PageRank, but that doesn't really matter, because the site has taken a knock since the Google update.
Thanks for your help.
Chris,
-
Just to second what Mike said - it's always tough to speak in generalities, but I can't think of any benefit to this approach. Typically, 301s are the preferred method for changing URLs. If you just kill the old pages and introduce new ones with the same content, you may not only run into some short-term duplicate content issues, but you also lose the inbound links and ranking signals pointing to those old URLs.
Are you concerned about transferring a penalty via 301s? I'm just not clear on what the goal is here.
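That said, if you do end up moving pages to a new domain, it's worth spot-checking that the old URLs really do return a 301 pointing at their new homes. Below is a minimal sketch in Python using the requests library; the domain names and paths are placeholders for illustration, not your actual site.

```python
# Minimal sketch: spot-check that old URLs return a 301 to the new domain.
# OLD_DOMAIN, NEW_DOMAIN and PATHS are made-up placeholders -- swap in your own.
import requests

OLD_DOMAIN = "https://old-example.co.uk"
NEW_DOMAIN = "https://new-example.co.uk"
PATHS = ["/", "/boiler-repair-london", "/plumbing-manchester"]

for path in PATHS:
    old_url = OLD_DOMAIN + path
    # Don't follow the redirect automatically; we want to inspect it.
    resp = requests.get(old_url, allow_redirects=False, timeout=10)
    status = resp.status_code
    location = resp.headers.get("Location", "")
    ok = status == 301 and location.startswith(NEW_DOMAIN)
    print(f"{old_url} -> {status} {location} {'OK' if ok else 'CHECK THIS'}")
```

A crawl of the old domain with your tool of choice will do the same job at scale; the point is simply to confirm the redirects are permanent (301, not 302) and point where you expect.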
-
In good conscience, I can't see how that would be a good ongoing process to move forward with. What I would do instead is start an overhaul-and-merge project: identify the most valuable, unique content within the site and start merging the less valuable pages into the stronger ones. Then 301 any old pages to their closest equivalents (or to the home page), and make sure proper 404s and 410s are in place for anything that is gone for good.
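To make the 301 / 404 / 410 distinction concrete, here is a rough sketch using Python and Flask (just one possible setup; the URL mappings are made-up examples): merged pages 301 to their stronger equivalents, deliberately retired pages return 410 Gone, and anything unknown falls through to a 404.

```python
# Rough sketch (Flask): 301 merged pages to their stronger equivalents,
# return 410 Gone for pages removed on purpose, 404 for everything else.
# The paths in REDIRECTS and GONE are hypothetical examples.
from flask import Flask, redirect, abort

app = Flask(__name__)

REDIRECTS = {
    "/old-thin-page": "/strong-merged-page",
    "/duplicate-area-page": "/",  # or the closest matching page
}
GONE = {"/retired-service-page"}

@app.route("/<path:path>")
def legacy(path):
    path = "/" + path
    if path in REDIRECTS:
        return redirect(REDIRECTS[path], code=301)  # permanent move
    if path in GONE:
        abort(410)  # gone for good, not just missing
    abort(404)      # not found
```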
With that many pages I know it can be a daunting task to think about, but if you truly care about the site, and it seems that you do, I would not take the take-down-and-re-upload approach, as it can lead to further damage down the road. I would be focused on making sure I don't get hit again by Panda and Penguin.
In the end, if a tactic seems spammy or fishy, it probably is, or it will be at some point.
Good luck my friend
Related Questions
-
What would be the best course of action to nullify the negative effects of our website's content being duplicated (negative SEO)?
Hello, everyone. About 3 months ago I joined a company that manufactures transportation and packaging items. Once I started digging into the website, I noticed that a lot of their content was "plagiarized". I use quotes because it really wasn't: they seem to have been hit with a negative SEO campaign last year in which their content was scraped and posted across at least 15 different websites. Literally every page on their website had the same problem, and some of the content was even company specific (going as far as using the company's very unique name). In all my years of working in SEO and marketing I have never seen something at this scale. Sure, there are always spammy links here and there, but this seems very deliberate. In fact, some of the duplicate content was posted on legitimate websites that may have been hacked or compromised (some examples include charity websites). I am wondering if there is anything I can do besides contacting the webmasters of these websites and nicely asking for the content to be removed? Or does this kind of duplicate content not hold as much weight as it used to, especially since our content was published years before the duplicates started popping up? Thanks,
White Hat / Black Hat SEO | Hasanovic
-
Impact of wiping content on a subdomain
Hi, I've been asked to look at the impact of bulk-deleting content on a blog subdomain and how it could affect the SEO of a linked www subdomain. Can deleting content on one subdomain have a negative impact on other linked subdomains? Thanks
White Hat / Black Hat SEO | think-web
-
Internal Links & Possible Duplicate Content
Hello, I have a website which has kept losing positions since February 6. I have not received any manual actions in Search Console. However, I read the following article a few weeks ago and it looks a lot like my case: https://www.seroundtable.com/google-cut-down-on-similar-content-pages-25223.html I noticed that Google has removed 44 of my website's 182 pages from its index. The pages that have been removed can be considered similar, like those on the website mentioned in the article above. The problem is that there are about 100 pages similar to these. They are pages that describe the cabins of various cruise ships, each containing one picture and one sentence of at most 10 words. To a human this is not duplicate content, but what about the engine, bearing in mind that sometimes that little sentence can be the same? And let's say I remove all these pages and present the cabin details dynamically on one page instead of 15, for example, reducing the size of the website from 180 pages to 50 or so: how will this affect the SEO, particularly with regard to internal links? Thank you for your help.
White Hat / Black Hat SEO | Tz_Seo
-
Having a Size Chart and Personalization Descriptions on each page - Duplicate Content?
Hi everyone, I am currently coding a Shopify store theme and we want to show customers the size comparisons and personalization options for each product. It will be a great UX addition, since these are the number one and two things asked about via customer support. But my only concern is that Google might flag it as duplicate content, since it will be visible on every product page. What are your thoughts and/or suggestions? Thank you so much in advance.
White Hat / Black Hat SEO | MadeByBrew
-
Separating the syndicated content because of Google News
Dear MozPeople, I am working on rebuilding the structure of a "news" website. For various reasons, we need to keep syndicated content on the site, but at the same time we would like to apply for Google News again (we were accepted in the past but got kicked out because of the duplicate content). So I am facing the challenge of separating the original content from the syndicated content, as requested by Google, and I am not sure which option is better. A) Put all syndicated content into "/syndicated/", Disallow /syndicated/ in robots.txt, and set a NOINDEX meta tag on every page. In this case I am not sure what will happen if we link to these articles from other parts of the website; we will waste our link juice, right? Also, Google will not crawl these pages, so it will not see the noindex. Is this OK for Google and Google News? B) Set a NOINDEX meta tag on every page. Google will crawl these pages but will not show them in the results. We will still lose link juice from links pointing to these pages, right? So... is there any difference? And we should put a "nofollow" attribute on all the links pointing to the syndicated pages, right? Is there anything else important? This is the first time I am attempting this kind of "hack", so I am not exactly sure what to do or how to proceed. Thank you!
White Hat / Black Hat SEO | Lukas_TheCurious
-
Multiple domains, different content, same keywords
What would you advise in my case? Is it bad for Google if I have the four domains? I don't link between them, as I don't want any association or any loss of rankings for the branded page. Is it bad if I link between them, or from the non-branded domains to the branded domain? Is it bad if I have all of them in my Webmaster Tools account? I only have the branded one there. My Google page is all about the new, non-penalised domain, although Google gave a unique domain, +propdental, to the one that was manually penalised (which doesn't make sense). So, what are the things I should not do with my domains in order to follow and respect Google's guidelines? I want to stay white hat and not do something wrong without knowing it.
White Hat / Black Hat SEO | maestrosonrisas
-
XML feeds in regard to Duplicate Content
Hi everyone, I hope you can help. I run a property portal in Spain and am looking for an answer to an issue we are having. We are in the process of uploading an XML feed to our site which contains 10,000+ properties relating to our niche. Although this is great for our customers, I am aware this content is going to be duplicated from other sites, as our clients advertise across a range of portals. My question is: are there any measures I can take to safeguard our site from being penalised by Google? Manually writing 10,000+ unique descriptions is sadly out of the question. I really hope somebody can help. Thanks Steve
White Hat / Black Hat SEO | buysellrentspain