How can I prevent duplicate page content errors generated by the tags on my WordPress on-site blog?
-
When I add metadata and a canonical reference to the tags on my on-site blog, which runs on a wordpress.org template, Roger reports duplicate content errors. How can I avoid this problem? I want to use up to 5 tags per post, each with the same canonical reference, but every campaign scan generates errors/warnings for me!
-
Hi Steve,
Remember that the WordPress SEO plug-in also has a feature to redirect duplicate pages. Let's say you have tags such as tag1 and tag2: you can 301 redirect the tag archive pages you aren't using back to the original page. That's only necessary if noindexing them doesn't clear up your problem.
Is there anything else I can help you with? I am very happy I could help.
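In case it helps, here's a rough sketch of what such a 301 redirect could look like in an Apache .htaccess file. The tag slugs (tag1, tag2) and the /blog/ target are placeholders — substitute the archives and destination from your own site:

```apache
# Hypothetical example: permanently redirect unused tag archives
# back to the main blog page (adjust slugs and target to your site).
RewriteEngine On
RewriteRule ^tag/tag1/?$ /blog/ [R=301,L]
RewriteRule ^tag/tag2/?$ /blog/ [R=301,L]
```

The plug-in's built-in redirect feature achieves the same result without editing server config, so only reach for .htaccess if you prefer managing redirects there.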
Sincerely,
Thomas
-
Thomas,
Thank you for this recommendation. I already had the Yoast plug-in, so I will now check the noindex, follow box for my tags and see what that does. Previously I had left categories, tags, posts, etc. all unchecked. I am only going to noindex the tags at present and see what results I get!
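For reference, once a tag archive is set to noindex, follow, the plug-in outputs a robots meta tag in the page head along these lines (illustrative markup, not copied from any specific Yoast version):

```html
<!-- Emitted in the <head> of each tag archive page:
     search engines won't index the page, but will still
     follow its links to the individual posts. -->
<meta name="robots" content="noindex,follow">
```

You can confirm the setting took effect by viewing the source of one of your tag pages and looking for this tag.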
Many thanks again,
Steve
-
I would install a plug-in like WordPress SEO (http://yoast.com/wordpress/seo/) and set the tags not to be indexed — that will solve your issue. Here is a SEOmoz link regarding the same problem:
http://www.seomoz.org/q/solving-link-and-duplicate-content-errors-created-by-wordpress-blog-and-tags
The fact that you're showing the same canonical reference for each tag link tells Google that it is not duplicate content. Still, I would follow the advice above, and I wish you the best.
Hope I have been of help,
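For anyone reading along, a canonical reference is just a link element in the page head pointing at the preferred URL. On a tag archive it might look like this — example.com and the /blog/ path are stand-ins for your own domain and preferred page:

```html
<!-- On /tag/tag1/, tell search engines which URL is canonical -->
<link rel="canonical" href="http://www.example.com/blog/">
```

The Yoast plug-in generates this element for you, so you normally don't need to add it by hand.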
Thomas