Issue with duplicate content
-
Hello guys, I have a question about duplicate content. Recently I noticed that Moz's system reports a lot of duplicate content on one of my sites. I'm a little confused about what I should do, because this content is created automatically. All the duplicate content comes from a subdomain of my site where we share cool images with people. This subdomain actually points to our Tumblr blog, where people re-blog our posts and images a lot.
I'm really confused about how all this duplicate content is created and what I should do to prevent it. Please tell me whether I need to "noindex" or "nofollow" that subdomain, or suggest something better to resolve the issue.
Thank you!
-
Peter, I'm trying to PM you but I have no idea what to put in the "recipient" field. Thank you for your assistance.
-
We only crawl your own site, so we wouldn't surface a duplicate with Tumblr, unless something really, really weird is going on. This is why I need to look at the campaign - what you're describing shouldn't happen, in theory, so I have a feeling this is a pretty unusual situation.
-
Hello Peter, thank you for helping!
Peter, why do you say that neither Moz nor Webmaster Tools is going to detect the duplicates between my subdomain and Tumblr? Moz is detecting it now. Can you elaborate on that?
Thanks
-
My gut feeling is that you have 2+ issues going on here. Neither Moz nor Webmaster Tools is going to detect duplicates between your subdomain and Tumblr, so we (and they) must be seeing duplicates within the subdomain itself. This sounds like an overly complex setup that is likely causing you some harm, but without seeing it in play, it's really hard to diagnose.
Could you PM me the domain? I can log into Moz Analytics as you and check, but you have a few campaigns set up.
(Sorry, I originally was logged in as Marina and my reply posted - I apologize for the confusion)
-
Hello Kane,
Thank you for trying to help me!
I added a link to three screenshots. Two of them are from my Moz account: the first shows the exponential increase in duplicate content, and the second shows the subdomain that the duplicate content is coming from. The third screenshot is from my Gmail account and shows a notification from GWT about a deep links issue. I'm not sure whether these two issues have anything in common, but I feel that they do. Please let me know what you think.
Thanks
-
Hi Marina, a few questions for you:
Can you possibly post screenshots of the Google Webmaster Tools warning that you're seeing?
Does your website have an app associated with it?
Assuming your Tumblr content isn't reposted somewhere on your main domain, it doesn't seem like a duplicate content issue to me; it seems like the GWT message is related to deep linking for a mobile app. I can't imagine why you'd get that if you don't have an app.
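For context, that warning only fires when Google believes a web page is associated with an app page - usually via a rel="alternate" link element on the page or an equivalent sitemap annotation. Purely as an illustration (the package name and URL below are made up), the association looks something like this:

    <!-- In the <head> of the web page; "com.example.android" and the path are hypothetical -->
    <link rel="alternate"
          href="android-app://com.example.android/http/www.example.com/some-page" />

If nothing like that exists anywhere on your site or in your sitemaps, the warning is even stranger.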
-
Thank you for your help!
Answering your questions:
My subdomain looks like this: photos.domain.com, and it was pointed to the Tumblr platform (our blog on Tumblr) because it is a very image-friendly platform and they host all the images.
-
We use this subdomain only for posting images. We don't use this content on our root domain at all.
I'm really confused about what Android app they are talking about. Do they consider Tumblr an Android app?
Thanks
-
Hi there
Do you have a web development team or a web developer? What I would do is pass this notification over to them, along with your notifications from Moz, and see if they have a means to correct these issues. I am assuming that Google passed along resources in their notification; I would look into those and see what your options are.
If you do not have a web development team, I would check out the Recommended List to find a company that does web development as well as SEO that can assist in this. What it sounds like to me is that you are linking off to an app with a subdomain and it's creating a different user experience than the one generated by your website.
If I were you, I would find a suitable blogging platform that you can bring your sharing capabilities onto, and create a consistent and seamless experience for your users. Two questions:
- Is your subdomain blog.domain.com? Or is it named differently?
- Do you have your blog posts on your website and copied word for word on your subdomain?
Here are a couple more resources to review with your team:
- App Indexing for Google Search Overview: What is App Indexing?
- App Indexing for Google Search Technical Details: Enabling Deep Links for App Content
Let me know if any of this helps or if you have any more comments - good luck!
-
Thank you for the replies!
I'm fairly well aware of duplicate content issues, but I have never faced this particular one. As Lesley said, I don't have access to the head sections of each post, because those posts are effectively not on my property but on Tumblr's. And I have no idea how the duplication is created; I assume it is caused by Tumblr's feature that allows users to re-blog my blog posts.
Moreover, I've just received a warning from Google Webmaster Tools specifically pertaining to this subdomain. I'm really confused. Please help:
Fix app deep links to ....com/ that don't match the content of the web pages
Dear webmaster,
When indexing the deep links to your app, we detected that the content of 1 app page doesn't match the content of the corresponding web page. This is a bad experience for your users because they won't find what they were looking for on your app page. We won't show deep links for these app pages in our smartphone search results. This is an important issue that needs your immediate attention.
Take these actions to fix this issue:
- Check the Android Apps section of the Crawl Errors report in Webmaster Tools to find examples of app URIs whose content doesn't match their corresponding web page.
- Use these examples to debug the issue:
- Open the corresponding web page to have it ready.
- Use Android debug bridge to open the app page.
- Make sure the content on both your web page and your app page is the same.
- If necessary, change the content of your app (or change your sitemap or rel=alternate element associations to make sure that each app page is connected to the right web page).
- If necessary, change your robots.txt file to allow crawling of relevant resources. This mismatch might also be caused by some of the resources associated with the app page being disallowed from crawling through robots.txt.
-
I am not very experienced with Tumblr personally, but I am pretty sure it cannot be done, because they don't give you access to what you would need. You would need access to the head section of each page so that you could put the canonical tag in.
One thing that MIGHT work, though it would be tricky and I would consult with someone else about it too, to see what they thought: if the URLs are the same apart from the subdomain, you could get Apache to add a canonical to the response headers and send it over (a rough sketch of that idea is below). I do not know if Google would respect this, so I would ask for others' advice.
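To make that concrete, here is a minimal .htaccess-style sketch of serving rel="canonical" as an HTTP Link header. It assumes the subdomain's URLs are actually served by an Apache box you control with mod_rewrite and mod_headers enabled - which may not be the case here, since Tumblr hosts the subdomain - and the example.com hostname is a placeholder:

    # Sketch only: assumes your own Apache server sits in front of these URLs.
    RewriteEngine On
    # Copy the request path into an environment variable so mod_headers can read it.
    RewriteRule .* - [E=REQ_URI:%{REQUEST_URI}]
    # Send an HTTP Link header pointing at the equivalent URL on the main domain.
    Header set Link "<http://www.example.com%{REQ_URI}e>; rel=\"canonical\""

Google does document support for canonical hints delivered via the Link HTTP header, but as noted above, whether it would be honored in this particular setup is uncertain.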
-
Hi there
The only time you should noindex a site is if it's not supposed to be seen by search engines - if that's the case, then noindex it.
However, if this content is supposed to be seen by search engines, I would make use of canonical tags on the subdomain and point them to the original content on the main domain.
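To illustrate both options (all URLs below are placeholders, and on Tumblr you would need theme-level access to the head to add either tag):

    <!-- Option A: keep the subdomain out of search results entirely -->
    <meta name="robots" content="noindex">

    <!-- Option B: keep it indexed, but point engines at the original copy on the main domain -->
    <link rel="canonical" href="http://www.example.com/gallery/12345">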
I would also think of a way to build a community on your website - it sounds like you have opportunities to do so and are getting some attention from your audience, given how they are sharing your posts and information.
Also, look into sitemap opportunities with your images and how you can help crawlers understand the information on your website.
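On the image sitemap idea, this is roughly the format Google supports - a standard sitemap entry with image extensions (the URLs here are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
      <url>
        <loc>http://photos.example.com/post/12345</loc>
        <image:image>
          <image:loc>http://photos.example.com/images/photo-12345.jpg</image:loc>
        </image:image>
      </url>
    </urlset>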
You can read more about duplicate content here.
Hope this helps a bit! Let me know if you have any questions or comments!
-
Hello Lesley,
Thank you for the response! Well, the subdomain is pointing to our Tumblr blog. I have access to both the main domain and Tumblr. Where should I add the canonical? Thanks
-
Do you have control over the subdomain to add a canonical to it and point the canonical to the original content?