Issue with duplicate content
-
Hello guys, I have a question about duplicate content. Recently I noticed that Moz's system reports a lot of duplicate content on one of my sites. I'm a little confused about what I should do with it, because this content is created automatically. All the duplicate content comes from a subdomain of my site where we share cool images with people. This subdomain actually points to our Tumblr blog, where people re-blog our posts and images a lot.
I'm really confused about how all this duplicate content is created and what I should do to prevent it. Please tell me whether I need to "noindex"/"nofollow" that subdomain, or suggest something better to resolve the issue.
Thank you!
-
Peter, I'm trying to PM you but I have no idea what to put in the "recipient" field. Thank you for your assistance.
-
We only crawl your own site, so we wouldn't surface a duplicate with Tumblr, unless something really, really weird is going on. This is why I need to look at the campaign - what you're describing shouldn't happen, in theory, so I have a feeling this is a pretty unusual situation.
-
Hello Peter, thank you for helping!
Peter, why do you say that "neither Moz nor Webmaster Tools are going to detect the duplicates" between my subdomain and Tumblr? Moz is detecting it now. Can you elaborate on that?
Thanks
-
My gut feeling is that you have 2+ issues going on here. Neither Moz nor Webmaster Tools is going to detect duplicates between your subdomain and Tumblr, so we/they must be seeing duplicates within the subdomain itself. This sounds like an overly complex setup that is likely causing you some harm, but without seeing it in play, it's really hard to diagnose.
Could you PM me with the domain? I can log into Moz Analytics as you and check, but you have a few campaigns set up.
(Sorry, I originally was logged in as Marina and my reply posted - I apologize for the confusion)
-
Hello Kane,
Thank you for trying to help me!
I added a link to three screenshots. The first two are from my Moz account, showing the exponential increase in duplicate content and the subdomain that the duplicate content is coming from. The third screenshot is from my Gmail account, showing a notification from GWT about a deep-links issue. I'm not sure whether these two issues have anything in common, but I feel that they do. Please let me know what you think.
Thanks
-
Hi Marina, a few questions for you:
Can you possibly post screenshots of the Google Webmaster Tools warning that you're seeing?
Does your website have an app associated with it?
Assuming your Tumblr content isn't reposted somewhere on your main domain, it doesn't seem like a duplicate content issue to me; it seems like the GWT message is related to deep linking for a mobile app. I can't imagine why you'd get that if you don't have an app.
-
Thank you for your help!
Answering your questions:
My subdomain looks like this: photos.domain.com, and it is pointed at the Tumblr platform (our blog on Tumblr) because Tumblr is a very image-friendly platform and they host all the images.
-
We use this subdomain only for posting images. We don't use this content on our root domain at all.
I'm really confused about what Android app they are talking about. Do they consider Tumblr an Android app?
Thanks
-
Hi there
Do you have a web development team or a web developer? What I would do is pass this notification over to them, along with your notifications from Moz, and see if they have a means to correct these issues. I am assuming that Google passed along resources in their notification; I would look into those and see what your options are.
If you do not have a web development team, I would check out the Recommended List to find a company that does web development as well as SEO that can assist in this. What it sounds like to me is that you are linking off to an app with a subdomain and it's creating a different user experience than the one generated by your website.
If I were you, I would find a suitable blogging platform that you can bring your sharing capabilities onto, and create a consistent and seamless experience for your users. Two questions:
- Is your subdomain blog.domain.com? Or is it named differently?
- Do you have your blog posts on your website and copied word for word on your subdomain?
Here are a couple more resources to review with your team:
- App Indexing for Google Search Overview: What is App Indexing?
- App Indexing for Google Search Technical Details: Enabling Deep Links for App Content
Let me know if any of this helps or if you have any more comments - good luck!
-
Thank you for replies!
I'm fairly well aware of duplicate content issues in general, but I have never faced this particular one. As Lesley said, I don't have access to the head sections of each post, because those posts are practically not on my property but on Tumblr's. And I have no idea how the duplication is created; I assume it is caused by Tumblr's feature that allows users to re-blog my posts.
Moreover, I've just received a warning from Google Webmaster Tools pertaining specifically to this subdomain. I'm really confused. Please help:
Fix app deep links to ....com/ that don't match the content of the web pages
Dear webmaster,
When indexing the deep links to your app, we detected that the content of 1 app page doesn't match the content of the corresponding web page. This is a bad experience for your users because they won't find what they were looking for on your app page. We won't show deep links for these app pages in our smartphone search results. This is an important issue that needs your immediate attention.
Take these actions to fix this issue:
- Check the Android Apps section of the Crawl Errors report in Webmaster Tools to find examples of app URIs whose content doesn't match their corresponding web page.
- Use these examples to debug the issue:
- Open the corresponding web page to have it ready.
- Use Android Debug Bridge to open the app page.
- Make sure the content on both your web page and your app page is the same.
- If necessary, change the content in your app (or change your sitemap or rel=alternate element associations to make sure each app page is connected to the right web page).
- If necessary, change your robots.txt file to allow crawling of relevant resources. This mismatch might also be due to the fact that some of the resources associated with the app page are disallowed from crawling through robots.txt.
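For reference, the app-to-web association this message describes is typically declared with a rel="alternate" element in the head of each web page. A hypothetical example (the package name, host, and path below are placeholders, not your actual app or site):

```html
<!-- In the <head> of the web page: ties this page to the matching
     screen in an Android app via a deep-link URI. The package name
     (com.example.photos) and URL are illustrative placeholders. -->
<link rel="alternate"
      href="android-app://com.example.photos/http/photos.example.com/post/12345" />
```

If you have never added markup like this (or its sitemap equivalent), the warning is worth raising with whoever manages the subdomain, since something must be declaring an app association.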
-
I am not very experienced with Tumblr personally, but I am pretty sure it cannot be done, because they don't give you access to what you would need. You would need access to the head section of each page so that you could put the canonical tag in.
One thing that MIGHT work, but would be tricky (and I would consult with someone else about it too), is that if the URLs are the same minus the subdomain, you could get Apache to add a canonical to the actual response header and send it over. I do not know if Google would respect this, so I would ask others' advice.
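As a rough sketch of that server-level idea, a canonical can be sent as an HTTP `Link` header with mod_headers. This only applies to a server you control (so not a Tumblr-hosted subdomain), and it assumes Apache 2.4 with mod_headers and mod_setenvif enabled; the domain names are placeholders:

```apache
# Hypothetical .htaccess sketch for photos.example.com:
# send a canonical Link header pointing at the same path on the
# main domain. Domain names are placeholders.
<IfModule mod_headers.c>
    # Capture the request path into an environment variable...
    SetEnvIf Request_URI "^(.*)$" CANONICAL_PATH=$1
    # ...and emit it as an HTTP Link header with rel="canonical".
    Header set Link "<https://www.example.com%{CANONICAL_PATH}e>; rel=\"canonical\""
</IfModule>
```

Google has documented support for rel="canonical" in HTTP headers, but as noted above, this whole approach is moot if you cannot touch the server configuration.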
-
Hi there
The only time you should noindex a site is if it's not supposed to be seen by search engines - if that's the case, then noindex it.
However, if this content is supposed to be seen by search engines, I would make use of canonical tags on the subdomain and point them to the original content on the main domain.
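For illustration, a canonical tag on a subdomain page would look like this (both URLs are placeholders):

```html
<!-- In the <head> of http://photos.example.com/post/12345,
     telling search engines which copy is the original. -->
<link rel="canonical" href="https://www.example.com/post/12345" />
```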
I would also think of a way to build a community on your website - it sounds like you have opportunities to do so and are getting some attention, judging by how your audience is sharing your posts and information.
Also, look into sitemap opportunities with your images and how you can help crawlers understand the information on your website.
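As a sketch of the image-sitemap idea, Google's image sitemap extension lets each page entry list the images it contains (the URLs here are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>http://photos.example.com/post/12345</loc>
    <image:image>
      <image:loc>http://photos.example.com/images/sunset.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```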
You can read more about duplicate content here.
Hope this helps a bit! Let me know if you have any questions or comments!
-
Hello Lesley,
Thank you for the response! Well, the subdomain is pointing to our Tumblr blog. I have access to both the main domain and Tumblr. Where should I add the canonical? Thanks
-
Do you have control over the subdomain, so you can add a canonical to it and point it to the original content?
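If you do have that control, one quick way to confirm whether a canonical is already being set is to parse a page's HTML for it. A minimal Python sketch (the HTML string below is a stand-in; in practice you would feed in the fetched page source):

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collects the href of any <link rel="canonical"> tag."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")

# Stand-in HTML; a real check would fetch the live page first.
page = '<html><head><link rel="canonical" href="https://www.example.com/post/1"></head></html>'
finder = CanonicalFinder()
finder.feed(page)
print(finder.canonical)  # -> https://www.example.com/post/1
```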