Duplicate Content on WordPress.com
-
Hi Mozers!
I have a client with a blog on wordpress.com.
http://newsfromtshirts.wordpress.com/
It just had a ranking drop because of a new Panda update, and I know it's a duplicate content problem.
There are 3,900 duplicate pages, basically because there is no use of the noindex or canonical tag, so archive and category pages are all indexed by Google.
If I could install my usual SEO plugin this would be a piece of cake, but since WordPress.com is a closed environment I can't.
How can I put a noindex on all category, archive and author pages on WordPress.com?
I think this could be done by writing a proper robots.txt, but I am not sure about the syntax I should use to achieve that.
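Something like this is what I had in mind, just a rough sketch, and I'm not even sure a Disallow rule is really equivalent to a noindex:

```
# Rough sketch of what I was thinking - please correct me if this is the wrong approach
User-agent: *
Disallow: /category/
Disallow: /tag/
Disallow: /author/
```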
Thank you very much,
DoMiSol Rossini
-
Hi Mike,
I have been looking for a solution for almost a week.
My website http://mysay.in has over 3,000 duplicate content issues and an equal number of warnings and notices. I am very new to the Moz world and my SEO knowledge is close to negligible. My site is hosted on wordpress.com as well. Now, after reading the solution, I did remove the Tag Cloud widget. Will that help? And could you suggest how many tags per post are optimal? Will removing tags from previous posts and removing the Tag Cloud remove these duplicate pages on their own, or is there anything else I am supposed to do?
I am just confused. Please assist!
Thanks
Vikash
-
Depending on your theme, you can sometimes change Tag Archive and Category pages to show a post summary instead of the full article, which can help but will not fully solve your problem.
If you could convince them to self-host instead of using WordPress.com's hosting, you'd be able to install an SEO-related plugin that might do more to fix the problem.
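For what it's worth, what those plugins generally do on a self-hosted install is add a robots meta tag to the head of archive-type pages, roughly like this (an illustration of the typical output, not something you can add on WordPress.com itself):

```html
<!-- Typical SEO-plugin output on category, tag, author and date archive pages -->
<meta name="robots" content="noindex, follow">
```

The noindex keeps the archive pages out of the index, while follow still lets crawlers pass link equity through them to the posts.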
Tag things less. There are a number of tags that have only one post associated with them. Take, for instance, the post from February 5th: it has 8 tags, 5 of which go to Tag Archives that contain only that one post.
Also, consider getting rid of the Tag Cloud. It just adds unnecessary and irrelevant links pointing to the tag pages that are causing your duplicate content problem, and it probably harms the flow of link equity through the site.
Related Questions
-
Duplicate content: domain alias issue
Hello there! Let's say my client has two webshops (both have existed for a long time, so there are many backlinks and good authority on both):
individuals.nl: for individuals (has 200 backlinks, let's say)
pros.nl: exact same products, exact same content, but with different branding aimed at professionals (has 100 backlinks, let's say)
So both websites are 99% identical, and it has to remain like that! Obviously, this creates duplicate content issues.
Goal: I want individuals.nl to get all the ranking value, while pros.nl should remain accessible through direct access and appear for its own brand queries.
Solution? Implement canonical tags on pros.nl pointing to individuals.nl. That way, individuals.nl will get all the ranking value, while pros.nl will still be reachable through direct access. However, individuals.nl will then replace pros.nl in the SERPs in the long term. The only thing I want is to keep pros.nl visible for its own brand queries; since that won't be possible through organic search results, I'm just going to buy those "pros" queries through paid search.
Another option: put links on all pages of pros.nl to individuals.nl (but not the other way around), so that pros.nl passes some ranking value to individuals.nl (but only a small part of it; ideally, I would like to pass all link value to that domain).
Could someone advise me? (I know it sounds a bit complicated... but I don't have much choice ^^)
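For clarity, the canonical I have in mind on each page of pros.nl would point at the matching page on individuals.nl, something like this (the page path is just a made-up example):

```html
<!-- On https://pros.nl/some-product-page -->
<link rel="canonical" href="https://individuals.nl/some-product-page">
```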
Technical SEO | | Netsociety0 -
Multiple Sites Duplicate Content Best Practice
Hi there, I have one client (atlantawidgets.com) who has a main site, but also has duplicate sites with different URLs targeting specific geo areas, e.g. widgetmakersinmarietta.com.
1) Would it be best to go ahead and create a static home page at these additional sites and have the rest of the site noindexed?
2) Or should I go in, allow more pages to be indexed, and change the content? If so, how many: 3, 5, 8? I don't have tons of time at this point.
3) If I change content within the duplicate sites, what % do I need to change? Does switching the order of the sentences of the content count, or does it need to be 100% fresh?
Thanks everyone.
Technical SEO | | greenhornet770 -
Shopify duplicate content issue
We recently moved our site to Shopify but now have a duplicate content issue, as we have the same products in different collections. I have added canonical code to get rid of this, but my Webmaster Tools still shows hundreds of duplicate pages. How can I tell if the code I added is working? How long will it take for Google to recognise this and drop the duplicates from their index, and is this likely to have a significant impact on SERPs? Our web page is www.devoted2vintage.co.uk. Thanks Paul
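For reference, my understanding is that if the canonical is working, viewing the source of a collection-scoped product URL should show it pointing at the plain product URL, along these lines (the collection and product handle below are just made-up examples):

```html
<!-- Source of /collections/dresses/products/1950s-tea-dress -->
<link rel="canonical" href="https://www.devoted2vintage.co.uk/products/1950s-tea-dress">
```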
Technical SEO | | devoted2vintage1 -
Fix duplicate content caused by tags
Hi everyone, TGIF. We are getting hundreds of duplicate content errors on our WP site from what appear to be our tags. For each tag and each post we are seeing a duplicate content error. I thought I had this fixed, but apparently I do not. We are using the Genesis theme with Yoast's SEO plugin. Does anyone have the solution to what I imagine is an easy fix? Thanks in advance.
Technical SEO | | okuma0 -
How do I stop www.mysite.com/ showing as a duplicate of www.mysite.com
I have run the campaigns software over a site and it is showing that www.mysite.com/ is a duplicate of www.mysite.com. How do I correct this? Is it a genuine duplicate page? My first thought was to use rel canonical, but there is no page called / to put it on. Your suggestions are welcomed. Sean
Technical SEO | | ske110 -
404's and duplicate content.
I have real estate websites that add new pages when new listings come on the market and then delete those pages when the property is sold. My concern is that a significant number of 404s are created, and the listing pages that are added are going to be the same as those of others in my market who use the same IDX provider. I could go with a different IDX provider that uses an iframe, which doesn't create new pages, but when I used an iframe before my time on site was 3 min with 2.5 pages per visit, and now it's 7.5 pages per visit with 6+ minutes on the site. The new pages create fresh content daily, so which is better: fresh content and better on-site metrics (with the 404s), or fewer 404s, no duplicate content, and weaker on-site metrics? Any thoughts on this issue? Any advice would be appreciated.
Technical SEO | | AnthonyLasVegas0 -
Duplicate content, how to solve?
I have about 400 duplicate content errors on my SEOmoz dashboard, but I have no idea how to solve them. I have two main scenarios of duplication on my site:
Scenario 1: http://www.theprinterdepo.com/catalogsearch/advanced/result/?name=64MB+SDRAM+DIMM+MEMORY+MODULE&sku=&price%5Bfrom%5D=&price%5Bto%5D=&category= shows 3 products with the same title but different product models; as you can see, they have the same price as well. Some printers use a different memory module, so I just can't delete 2 of the products.
Scenario 2 (toners): http://www.theprinterdepo.com/brother-high-capacity-black-toner-cartridge-compatible-73 and http://www.theprinterdepo.com/brother-high-capacity-black-toner-cartridge-compatible-75. In this scenario the products have a different title but the same price. Again, the 2 products are different.
Thank you
Technical SEO | | levalencia10 -
Complex duplicate content question
We run a network of three local websites covering three places in close proximity. Each site has a lot of unique content (mainly news), but there is a business directory that is shared across all three sites. My plan is that the search engines only index the businesses in the directory that are actually located in the place each site is focused on, i.e. listing pages for businesses in Alderley Edge are only indexed on alderleyedge.com, and businesses in Prestbury only get indexed on prestbury.com, but every business has a listing page on each site. What would be the most effective way to do this? I have been using rel canonical, but Google does not always seem to honour it. Would using meta noindex tags where appropriate be the way to go? Or would changing the URL structure to include the place name and using robots.txt be a better option? As an aside, my current URL structure is along the lines of: http://dev.alderleyedge.com/directory/listing/138/the-grill-on-the-edge – would changing this have any SEO benefit? Thanks Martin
Technical SEO | | mreeves0