Help with Duplicate Content Issue for pages...
-
I have pages with duplicate content. I want to put them on hold while I write unique content, as I don't want to get marked down for it. I also want to keep the URLs and use them again.
There are about 300 pages currently affected by duplicate content. Am I best doing 302 redirects to the original source of the content, since it's temporary, or canonical tags / noindex?
The pages are currently indexed and cached by Google, and I want to use the URLs for unique content in the future so that Google values them.
Any advice much appreciated.
Kind Regards,
-
Can this work as a temporary thing?
So if I put a canonical tag in and then take it out, will Google value the page as unique content once the tag is removed and the page is recrawled?
I just don't want to do anything that makes Google think it's permanent.
-
I would suggest using the canonical tag on the duplicate pages until you come up with new content.
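For what it's worth, the canonical tag is a single line in the `<head>` of each duplicate page pointing at the original; the URLs below are placeholders for your own pages:

```html
<!-- In the <head> of the duplicate page, e.g. /destination/keyword -->
<!-- Tells Google which version to treat as the original; the page stays live for visitors -->
<link rel="canonical" href="https://www.example.com/destination/" />
```

Once you publish unique content on the page, you would remove the tag (or point it at itself) and let Google recrawl.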
-
Thanks for the advice so far everyone.
Some of the pages are getting traffic, and all seem to be indexed, up to 1,000 of them it seems.
It's 3-4 paragraphs of text, so a little more than a product description.
It's a difficult one to call, as I need to save the URLs but not get penalised for duplicate content.
-
Perhaps 302 to the folder (/destination/) until you're ready to use the content on a page. Implementing the canonical tag here probably won't make a difference as there will only be a small number of 'live' pages.
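On Apache, a temporary redirect like that might look like this in `.htaccess` (a sketch only, assuming the duplicates live under /destination/ as in this thread; adjust the pattern to your own paths):

```apache
# Temporarily (302) send the duplicate sub-pages to the folder page.
# Remove these rules once each URL has its own unique content.
RewriteEngine On
RewriteRule ^destination/.+$ /destination/ [R=302,L]
```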
Good luck, Paul.
Rob
-
I would just focus your efforts on creating unique, optimised content for these pages. If they are just duplicate product descriptions, Google will know that, and I doubt it's having a major impact on the rest of the site.
What percentage of your site do these 300 pages account for? How quickly can you rewrite the content on the pages?
You could noindex them, but unless there is a proven impact on your unique pages, I think you are likely wasting your time.
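If you did go the noindex route, it's one meta tag per duplicate page; pairing it with `follow` keeps the page's links crawlable while it's out of the index (a sketch, not a recommendation either way):

```html
<!-- In the <head> of each duplicate page -->
<!-- noindex: drop the page from results; follow: still crawl the links on it -->
<meta name="robots" content="noindex, follow" />
```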
-
Thanks for the advice.
On further inspection, there are more than 100 pages which have duplicate content.
The URLs are similar, for example:
/destination
and then
/destination/keyword
I have been told to keep the URLs and put unique content on them as I progress the site. I just don't know what the best way to do it is, whether it's a 302, canonical, etc.
-
Sounds like a lot of work!
If there's a lot of dupe content, are the URLs also quite closely matched in terms of keyword use? Are you going to end up consolidating a lot of the content on a smaller number of pages? If that's going to be the case, then perhaps a 301 to the root domain, or to the best of the current pages, would be better.
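If consolidation is the route, a permanent redirect from a retired duplicate to the strongest surviving page is a one-liner in `.htaccess` (paths here are hypothetical examples):

```apache
# Permanent (301) consolidation: retired duplicate -> best surviving page
Redirect 301 /destination/old-keyword /destination/best-page
```

Note that a 301 tells Google the old URL is gone for good, so this only fits pages you won't reuse.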
Cheers
Rob
-
Well, I might need the redirect to be there for up to 12 months; there's quite a lot of content to do.
-
A 302 redirect does indicate a temporary move, but how temporary is it? If you're going to have this content sorted quite quickly, then you might leave the pages as they are for now.
Are you falling short on ranking currently because of dupe issues?
Don't forget about using internal anchor text to inform Google which pages are relevant for certain keywords.
Cheers
Rob
-
Well, yeah, I have a few pages with duplicated content and I don't want to get penalised for it. However, I want to eventually write unique content for them and use the URLs. As they are already cached and indexed, I am wondering what the best solution is. I don't want a permanent redirect as I want to use the URLs again.
-
Are there 300 pages that you're considering a 302 for?