Help with Duplicate Content Issue for pages...
-
I have pages with duplicate content. I want to put them on hold while I write unique content, as I do not want to be marked down for it. I also want to keep the URLs and use them again.
There are about 300 pages currently affected by duplicate content. Am I best doing 302 redirects to the original source of the content, as the situation is temporary, or using canonical tags / noindex?
The pages are currently indexed and cached by Google, and I want to use the URLs in the future for unique content so that Google values them.
Any advice much appreciated.
Kind Regards,
-
Can this work as a temporary measure?
If I put a canonical tag in and then take it out, will Google value the page as unique content once it is crawled again?
I just don't want to do anything that makes Google think the change is permanent.
-
I would suggest using the canonical tag on the duplicate pages until you come up with new content.
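For reference, a canonical tag is a single line in the `<head>` of each duplicate page pointing at the page that holds the original copy. The URL below is a placeholder, not one of the actual pages:

```html
<!-- In the <head> of each duplicate page; example.com/destination/ is a placeholder URL -->
<link rel="canonical" href="https://www.example.com/destination/" />
```

Once the unique content is live, remove the tag and let the page be recrawled.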
-
Thanks for the advice so far everyone.
Some of the pages are getting traffic, and all seem to be indexed; up to 1,000 of them, it seems.
Each has 3-4 paragraphs of text, so a little more than a product description.
It's a difficult one to call, as I need to keep the URLs but not get penalised for duplicate content.
-
Perhaps 302 to the folder (/destination/) until you're ready to use the content on a page. Implementing the canonical tag here probably won't make a difference as there will only be a small number of 'live' pages.
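As a sketch, a temporary redirect like that can be a couple of lines in .htaccess (assuming an Apache server; the keyword paths here are made-up placeholders, not your actual URLs):

```apache
# Temporary (302) redirects from duplicate pages to the folder index.
# Remove each rule once its page has unique content and should be re-indexed.
Redirect 302 /destination/keyword-one /destination/
Redirect 302 /destination/keyword-two /destination/
```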
Good luck Paul
Rob
-
I would just focus your efforts on creating unique, optimised content for these pages. If they are just duplicate product descriptions, Google will know that, and I doubt they are having a major impact on the rest of the site.
What percentage of your site do these 300 pages account for? How quickly can you rewrite the content on the pages?
You could noindex them, but unless there is a proven impact on your unique pages, I think you are likely wasting your time.
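If you did go the noindex route, it's one meta tag per page (shown here with `follow` so links on the page are still crawled; remove the tag once the rewritten content is live):

```html
<!-- In the <head> of a duplicate page you want kept out of the index -->
<meta name="robots" content="noindex, follow" />
```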
-
Thanks for the advice.
On further inspection, it's more like 100 pages which have duplicate content.
The URLs are similar, for example:
/destination
and then
/destination/keyword
I have been told to keep the URLs and add unique content as I progress the site. I just don't know the best way to do it: 302, canonical, etc.
-
Sounds like a lot of work!
If there's a lot of dupe content are the URLs also quite closely matched in terms of keyword use? Are you going to end up just consolidating a lot of the content on a smaller number of pages? If that's going to be the case then perhaps a 301 to the root domain, or to the best of the current pages, would be better.
Cheers
Rob
-
Well, I might need the redirect to be in place for up to 12 months; there's quite a lot of content to write.
-
A 302 redirect does indicate a temporary move, but how temporary is it? If you're going to have this content sorted quite quickly, then you might just leave the pages as they are for now.
Are you falling short on ranking currently because of dupe issues?
Don't forget about using internal anchor text to inform Google which pages are relevant for certain keywords.
Cheers
Rob
-
Well, yes: I have a few pages with duplicated content and I don't want to be penalised for it. However, I eventually want to write unique content for them and keep using the URLs. As they are already cached and indexed, I am wondering what the best solution is. I don't want a permanent redirect, as I want to use the URLs again.
-
Are there 300 pages that you're considering a 302 for?