Crawl Diagnostics returning duplicate content based on session id
-
I'm just starting to dig into crawl diagnostics and it is returning quite a few errors. Primarily, the crawl is indicating duplicate content (page titles, meta tags, etc), because of a session id in the URL.
I have set up a URL parameter in Google Webmaster Tools to help Google recognize the existence of this session id. Is there any way to tell the SEOMoz spider the same thing? I'd like to get rid of these errors since I've already handled them for the most part.
-
You the man! Thanks!
-
Hi Cody,
The best way is to block Rogerbot in your robots.txt from crawling those specific pages of your site — in your case, keeping Rogerbot from seeing the pages with a session ID.
More information can be found here on Rogerbot. Be cautious and test it out, but the lines you would have to add to your robots.txt are probably:
User-agent: rogerbot
Disallow: /*sessionid
Hope this helps!
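As a rough way to sanity-check the wildcard pattern before deploying it, a minimal sketch like the following mimics how a Googlebot-style crawler matches `Disallow: /*sessionid`. This is a simplification (it ignores Allow rules and longest-match precedence), and the sample URLs are hypothetical:

```python
import re

def robots_pattern_to_regex(pattern):
    """Convert a robots.txt path pattern with '*' wildcards
    (and an optional '$' end anchor) into a compiled regex."""
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Escape everything, then turn the escaped '*' back into '.*'
    regex = re.escape(pattern).replace(r"\*", ".*")
    return re.compile("^" + regex + ("$" if anchored else ""))

def is_blocked(path, disallow_patterns):
    """True if any Disallow pattern matches the URL path (incl. query)."""
    return any(robots_pattern_to_regex(p).match(path) for p in disallow_patterns)

disallow = ["/*sessionid"]

# Hypothetical URLs for illustration:
print(is_blocked("/products/widget?sessionid=abc123", disallow))  # True
print(is_blocked("/products/widget", disallow))                   # False
```

Running a handful of real URLs from the crawl report through a check like this makes it easier to confirm the rule catches the session-ID pages without accidentally blocking anything else.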
Related Questions
-
What's my best strategy for Duplicate Content if only www pages are indexed?
The Moz crawl report for my site shows duplicate content with both www and non-www pages on the site. (Only the www pages are indexed by Google, however.) Do I still need to use a 301 redirect, even if the non-www pages are not indexed? Is rel=canonical less preferable, as usual?
Facts: the site is built using ASP.NET; the homepage has multiple versions which use 'meta refresh' tags to point to 'default.asp'; most links already point to www.
Current strategy: set the preferred domain to 'www' in Google's Webmaster Tools; set the WordPress blog (which sits in a /blog subdirectory) with rel="canonical" to point to the www version; ask the programmer to add 301 redirects from the non-www pages to the www pages; ask the programmer to use 301 redirects as opposed to meta refresh tags and point all homepage versions to www.site.org.
Does this strategy make the most sense? (Especially considering the non-indexed but existing non-www pages.) Thanks!!
Moz Pro | kimmiedawn
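The 301 step in the strategy above amounts to mapping any non-www host onto its www equivalent while preserving the path and query string. A minimal sketch of that mapping, assuming the www.site.org hostname from the question and a hypothetical helper name:

```python
from urllib.parse import urlsplit, urlunsplit

def www_redirect_target(url, canonical_host="www.site.org"):
    """Return the 301 target for a non-www URL, or None if the
    URL is already on the canonical www host."""
    parts = urlsplit(url)
    if parts.netloc.lower() == canonical_host:
        return None  # already canonical; serve the page normally
    # Preserve path, query string, and fragment when redirecting
    return urlunsplit((parts.scheme, canonical_host, parts.path,
                       parts.query, parts.fragment))

print(www_redirect_target("http://site.org/blog/post?p=1"))
# -> http://www.site.org/blog/post?p=1
print(www_redirect_target("http://www.site.org/blog/"))
# -> None
```

In practice the same logic would live in an IIS rewrite rule or ASP.NET handler rather than application code, but the important property is the one shown: every non-www URL gets a single permanent redirect to its exact www counterpart, not to the homepage.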
Duplicate Content
Crawl Diagnostics is returning duplicate content/title tag errors for every product image on the listing pages of my classified site, because each image is on a separate URL. So this page, for example, http://marketplace.myclassicgarage.com/cars/all/Chevrolet-Bel-Air/24481/ has, among other things, the same title tag as this page, http://marketplace.myclassicgarage.com/cars/all/Chevrolet-Bel-Air/24481/media/151968 which is one of many different images that are all child pages in the folder /media. In this particular case there are over 140 pages with the same title tag, because there are over 140 images for this particular car. That is just one listing, and there are over 1,000 listings (vehicles), and that number will grow. Is this really a problem? With limited resources, what real positive effect would making all these images have unique title tags really have from a SERP perspective? Keep in mind that this being user-generated content, there is no way to descriptively update the title tags to something like <title>Bel Air Passenger Side Profile</title>. That is not feasible.
Moz Pro | MyClassicGarage
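The question rules out hand-written descriptive titles, but titles for media child pages can still be differentiated mechanically from data the site already has. A minimal sketch — the title format is an assumption for illustration, not something from the thread:

```python
def media_page_title(listing_title, photo_index, photo_count):
    """Build a distinct title for each media child page using only
    data the listing already has (no manual description needed)."""
    return f"{listing_title} - Photo {photo_index} of {photo_count}"

# Hypothetical listing from the question's example URL:
print(media_page_title("Chevrolet Bel-Air", 23, 140))
# -> Chevrolet Bel-Air - Photo 23 of 140
```

Whether this is worth doing at all is exactly the question being asked; the sketch only shows that "unique" does not have to mean "hand-written".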
Duplicate content error?
I am getting a duplicate content error for the following pages: http://www.bluelinkerp.com/products/accounting/index.asp http://www.bluelinkerp.com/products/accounting/ But, of course, the second link is just an automatic redirect to the index file, is it not? Why is it treated as a different URL? See image. NJfxA.png
Moz Pro | BlueLinkERP
Is there a way to perform a crawl diagnostics without creating a campaign?
If you wanted to perform crawl diagnostics but your campaigns are at full capacity, are you able to do this, and how? (Or does this mean you will have to remove one campaign to make space for another?)
Moz Pro | SarahAhmed379
Duplicate page errors
I have 102 duplicate page title errors and 64 duplicate page content errors. They are almost all from the email-a-friend forms on each product in my online store. I looked, and the pages are identical except for the product name. Is this a real problem? If so, is there a workaround, or should I see if I can turn off the email-a-friend option? Thanks for any information you can give me. Cingin Gifts
Moz Pro | cingingifts
Issue: Duplicate page title
Hello, I have run the "Crawl Diagnostics" report using SEOmoz Pro and it says that I have a total of 56 errors: 18 of those errors are duplicate content and another 38 are duplicate title tags. I have looked at both reports in detail, and the reason I am getting these errors is that the crawler is checking both "http" and "https". So for example: my website is http://www.widgets.com. On the crawl diagnostics report, it also checks https://www.widgets.com. So it looks like I have duplicate content and duplicate title tags because of this. Now my question is this: is this really duplicate content? If so, how do I fix it? Any help is greatly appreciated.
Moz Pro | threebiz
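The pattern described above — every page reported twice, once per scheme — is easy to confirm from a crawl export by normalizing the scheme away before comparing URLs. A minimal sketch, with hypothetical URLs:

```python
from urllib.parse import urlsplit, urlunsplit

def strip_scheme(url):
    """Normalize a URL so its http and https variants compare equal."""
    parts = urlsplit(url)
    return urlunsplit(("", parts.netloc.lower(), parts.path,
                       parts.query, ""))

crawled = [
    "http://www.widgets.com/about",
    "https://www.widgets.com/about",   # scheme-only duplicate of the line above
    "http://www.widgets.com/contact",
]

seen, duplicates = set(), []
for url in crawled:
    key = strip_scheme(url)
    if key in seen:
        duplicates.append(url)
    seen.add(key)

print(duplicates)  # -> ['https://www.widgets.com/about']
```

If every duplicate pair in the report collapses under this normalization, the errors really are http/https pairs of the same page rather than genuinely distinct content.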
Sub-domain not crawled
One of our sites was recently redesigned. The home page is a landing page (www.labadieauto.com), and I moved the blog to this domain (labadieauto.com/blog/) and put a link in the bottom left of the home page. Since the change, the SEOmoz campaign overview is showing only 1 page crawled. This is not set up as a sub-domain, so why isn't it showing in the crawl? Help!
Moz Pro | LabadieAuto
SEOmoz crawl is not updating
When I check our SEO campaign, I can see that the report has not been updated. It still shows that the next crawl is Nov 1, but it is already Nov 3.
Moz Pro | shebinhassan