Duplicate Content
-
Hi, I have a website with over 500 pages.
The website is a home service website that serves clients in different areas of the UK.
My question is: can I take the pages down from my URL and leave them down for, say, a week, so that when Google's bots crawl them the pages no longer exist?
Could I then re-upload them to a different website URL without Google penalising me for duplicate content?
I know I would have lost link juice and PageRank, but that doesn't really matter, because the site has taken a knock since the Google update.
Thanks for your help.
Chris,
-
Just to second what Mike said - it's always tough to speak in generalities, but I can't think of any benefit to this approach. Typically, 301s are the preferred method for changing URLs (see the sketch below). If you just kill the old pages and introduce new ones with the same content, not only might you run into some short-term duplicate content issues, but you also lose the inbound links and ranking signals pointing at those old URLs.
Are you concerned about transferring a penalty via 301s? I'm just not clear on what the goal is here.
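For reference, a minimal sketch of what those 301s look like in an Apache .htaccess file - the domain and paths below are hypothetical placeholders, and other servers (nginx, IIS) have their own equivalents:

```apache
# Hypothetical example - swap in your own domain and paths.
# A single page, permanently moved:
Redirect 301 /services/london/ https://new-domain.example/services/london/

# Or move a whole section with one pattern-based rule (mod_rewrite):
RewriteEngine On
RewriteRule ^services/(.*)$ https://new-domain.example/services/$1 [R=301,L]
```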
-
In good conscience, I can't think of how that would be a good ongoing process to move forward with. What I would do instead is start an overhaul-and-merge project: identify the most valuable unique content within the site and start merging the less valuable content pages into the stronger ones. Also 301 any old pages to their closest equivalents or to the home page, and make sure proper 404s and 410s are in place for anything removed for good (a rough sketch follows below).
With that many pages I know it can be a daunting task to think about, but if you truly care about the site - and it seems that you do - I would not take the approach you describe, as it can lead to further damage down the road. I would be focused on making sure I don't get hit again by Panda and Penguin.
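A minimal, hypothetical .htaccess sketch of that merge-and-prune cleanup - the paths and domain here are made-up placeholders:

```apache
# Merged page: 301 the weaker URL to the stronger page it was folded into.
Redirect 301 /services/area-foo/ https://www.example.co.uk/services/area-hub/

# Removed outright, no replacement: 410 Gone tells Google the removal is
# deliberate, so the URL drops out of the index faster than with a 404.
Redirect gone /services/area-bar/
```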
In the end, if a tactic seems spammy or fishy, it probably is, or will be at some point.
Good luck, my friend.
Related Questions
-
Help finding websites that are scraping my content
Hi, I need a tool to help me review sites that are plagiarising / directly copying content from my site. The tools I'm aware of, such as Copyscape, appear to work with individual URLs rather than a root domain. That's great if you have a particular post or page you want to check, but in this case some sites are scraping thousands of product pages, so I need to submit the root domain rather than an individual URL. In some cases, other sites are being listed in SERPs above, or even instead of, our site for product search terms - and so far I have stumbled across this rather than proactively researched the offending sites. Ideally I want to enter my root domain and have the tool review all my internal site pages, then report the other domains where an individual page shares a certain amount of duplicated copy - working the same way Moz crawls a site for internal duplicate pages. I need a list of external duplicate content by domain and URL, so that I can contact the offending sites to request they remove the content, and send it to Google as evidence if they don't. Any help would be gratefully appreciated. Terry
-
Duplicate Content Product Descriptions - Technical List Supplier Gave Us
Hello, our supplier gives us a small paragraph and a list of technical features for our product descriptions. My concern is duplicate content. Here's my current plan:
1. Write at least as much unique content (rewriting the supplier's paragraph and adding to it) as there are words in the technical description list - half unique content, half duplicate content.
2. Reword the technical descriptions (though this is not always possible).
3. Give each page a custom H1, title tag, and meta description.
My question is: will the list of technical specifications create a duplicate content issue? In other words, how much unique content does a page need before a list that is identical across the internet stops hurting us? Or do we need to rewrite every technical list? Thanks.
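As a rough illustration of point 3, here is a hypothetical product page skeleton that wraps unique elements around the supplier's shared spec list (every name below is made up):

```html
<head>
  <!-- Unique, hand-written title and meta description per product: -->
  <title>Acme X200 Cordless Drill - Review &amp; Specs | Example Shop</title>
  <meta name="description" content="Our hands-on take on the Acme X200, plus the full spec sheet.">
</head>
<body>
  <h1>Acme X200 Cordless Drill</h1>
  <p>Unique rewritten description and commentary go here...</p>
  <ul>
    <li>Voltage: 18 V</li> <!-- the supplier's duplicated technical list -->
  </ul>
</body>
```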
-
Without prerender.io, is Google able to render & index geographical dynamic content?
One section of our website is built as a single-page application and serves dynamic content based on geographical location. Before I got here, we used prerender.io so Google could see the page, but now that prerender.io is gone, is Google able to render and index geographically dynamic content? I'm assuming no. If the answer is no, what are some solutions other than converting everything to HTML (which would be a huge overhaul)?
-
Separating syndicated content because of Google News
Dear MozPeople, I am working on rebuilding the structure of a "news" website. For various reasons, we need to keep syndicated content on the site, but at the same time we would like to apply for Google News again (we were accepted in the past but got kicked out because of the duplicate content). So I am facing the challenge of separating the original content from the syndicated content, as requested by Google, and I am not sure which option is better:
A) Put all syndicated content under "/syndicated/", Disallow /syndicated/ in robots.txt, and set a NOINDEX meta tag on every page. But in this case I am not sure what happens when we link to these articles from other parts of the website - we will waste our link juice, right? Also, Google will not crawl these pages, so it will never see the noindex. Is this OK for Google and Google News?
B) A NOINDEX meta tag on every page only. Google will crawl these pages but will not show them in the results. We will still lose our link juice from links pointing to these pages, right?
So ... is there any difference? And should we put a "nofollow" attribute on all the links pointing to the syndicated pages? Is there anything else important? This is the first time I am attempting this kind of "hack", so I am not exactly sure what to do or how to proceed. Thank you!
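For what it's worth, the crux of option A is that a robots.txt Disallow stops crawling entirely, so Googlebot never gets to read a meta noindex on the blocked pages. A minimal illustration, assuming the /syndicated/ path from the question (everything else is generic):

```html
<!-- Option B: let Googlebot crawl each syndicated page but keep it out
     of the index, while still letting link equity flow onward: -->
<meta name="robots" content="noindex, follow">

<!-- Option A's robots.txt would block crawling of the section outright,
     meaning the tag above is never seen - and a blocked URL can still
     appear in results as a bare, title-less listing:

     User-agent: *
     Disallow: /syndicated/
-->
```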
-
Content website of the year 2009 ....
I own a network of travel sites, and after all the changes that have happened over the past 12 months or so, I am really wondering whether my sites are worthless. Let's be honest here: I understand what Google is doing. So I ask myself, if I weren't trying to make a living with Google AdSense and affiliate sites, would I still have these travel sites? The truth is no. So should I forget about my content sites? It's a bunch of mostly useless content - some interesting information, but a travel guide like many others online. What do you think? Is it better now to focus on a product site, or to create one good website, rather than a network of sites that worked very well for the past 10 years?
-
Copied Content / Copied Website
Hello guys, I was checking my product descriptions and found a website that is using my descriptions word for word. They also use my company name, my product images, a link that sends you to my site, and my contact form. I tried to purchase something and the order came through to our email, but when I made an inquiry it didn't come through. They also have a sub-folder with my company name, and URLs with my company name - that isn't right, is it? I am confused and honestly don't know what to do. We don't take part in any affiliate programme or anything like that, and we don't ship outside Europe. This is a Chinese website. Out of curiosity, I noticed that one of our competitors is on there as well, which seems weird. Here is the link: www.everychina . com/company/repsole_limited-hz1405d06.html
-
Links via scraped / cloned content
I've just been looking at some backlinks on a site - a good proportion of them come via scraped Wikipedia links, or via sites with directories similar to DMOZ's (just under different names). To be honest, many of these sites look pretty dodgy to me, but if they're doing illegal stuff there's absolutely no way I'll be able to get the links removed. Should I just sit and watch the backlinks from these questionable sources increase, report the sites to Google, or do something else? Advice please.
-
"take care about the content" is it always true?
Hi everyone, I keep reading answer ,in reference to ranking advice, in wich the verdict is always the same: "TAKE CARE ABOUT THE CONTENT INSTEAD OF PR", and phrases like " you don't have to waste your time buying links, you have first of all to engage your visitors. ideally it works but not when you have to deal with small sites and especially when you are going to be ranked for those keywords where there's not too much to write. i'll give you an example still unsolved: i've got a client who just want to be ranked first for his flagship store, now his site is on the fourth position and the first ranked is a site with no content and low authority but it has the excact keyword match domain. tell me!!! what kind of content should i produce in order to be ranked for the name of the shop and the city?? the only way is to get links.... or to stay forth..... if you would like to help me, see more details below: page: http://poltronafraubrescia.zenucchi.it keyword: poltrona frau brescia competitor ranked first: http://turra.poltronafraubrescia.it/ competiror ranked second: http:// poltronafraubrescia.com/