Re : Duplicate Content
-
Hello,
I am a Pro member, and in my campaign it reports duplicate content for a few URLs, which I am not able to understand: both URLs are the same, so why is it showing under duplicate content? Here is an example of the URLs.
-
Hi Justin,
Ow... I just had a quick look, and if I am not mistaken, SEOmoz might actually have gotten better at picking up the duplicates in my case. I just went through the list manually and I can't really fault the results... in hindsight.
Note to self: manually check the data next time before replying to any posts!
Cheers
Greg
-
That's interesting; I've not noticed anything on any of my campaigns.
Out of curiosity, does the list of duplicate URLs give any clues as to what may be causing the duplicate pages error? Could Moz have updated their crawler to be case sensitive, or something along those lines?
-
I am also seeing sudden spikes in Duplicate Content for several campaigns, and I keep seeing questions about this in Q&A. This leads me to assume that there is an issue on SEOmoz's side...
There is no reason that I would suddenly see an increase without having made any changes.
Just checked the data manually, and in my case the Duplicate warnings are actually valid...
-
Agree with Tom's comments.
If you want to tidy up this error, go through the site and make the links consistent (i.e. change http://domain.com/page to http://domain.com/page/ throughout), and that should solve the problem.
This is particularly worthwhile if you share the reports with your customers, as customers hate seeing errors.
Hope that helps
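For anyone who would rather automate the consistency fix than edit every link by hand, here is a minimal, hypothetical .htaccess sketch (assuming Apache with mod_rewrite enabled; test on a staging copy first) that 301-redirects non-slash URLs to their trailing-slash versions:

```apache
# Hypothetical sketch: enforce trailing slashes site-wide with a 301.
RewriteEngine On
# Skip real files (e.g. /logo.png) so only "directory-style" URLs are rewritten.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*[^/])$ /$1/ [L,R=301]
```

Either convention (with or without the slash) works; what matters is picking one and redirecting the other to it.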
-
I wouldn't worry about this error in the campaign.
It may be that the Moz crawler has tried to index the second URL (with the slash) because a site might be linking to it, or it could be in your sitemap; it's worth checking whether this is the case.
But I can see that you have a canonical tag set up on your website that will prevent Google from indexing the second URL and potentially seeing duplicate content. Indeed, after searching for the second URL in Google, it does not appear to be indexed, so there should be no duplicate content issue.
Check that no one is linking to your site in that way, and that it's not in your sitemap. If neither is the case, chalk it up to a Moz crawler error and don't worry about it!
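For readers who don't yet have the canonical tag mentioned above, a hypothetical sketch of what it looks like (using the placeholder URL from earlier in the thread; the href should be whichever version of the page you prefer):

```html
<!-- Placed in the <head> of both URL variants, this tells search engines
     which version to treat as the original. -->
<link rel="canonical" href="http://domain.com/page/" />
```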
Related Questions
-
Feedback on Content Ideation / "Skyscraper" Spreadsheet Template
Hi All - I've been getting a ton of use out of the Moz API for discovering the popularity of content, which I'm using for content ideation and to implement the "Skyscraper" concept. I built a spreadsheet template that combines Moz with some other APIs to apply this to new topics of my choosing, and my friends encouraged me to clean it up a bit and share it with the broader community. So, here it is - fire away! I'd love any and all feedback on the spreadsheet; it's still a prototype, so it could stand to pull back more results. For example: would you want to include Domain Authority in the results? Focus more or less on the social sharing elements, or let you choose the thresholds? I'd also love to know if there are other methodologies for which you'd be interested in seeing spreadsheet templates produced. Cheers! skyscraper-template.png
Moz Pro | | paulkarayan0 -
Problems with duplicate contents...
Hi folks, how's it going? I started using SEOmoz, and the first crawl reports that I have 11 pages with duplicate content... but this is not true; they are different pages with different URLs, content, and tags. Any ideas on how to solve the problem? Alessandro, MusicaNueva.es
Moz Pro | | musicanueva0 -
Duplicate Content, Canonicalization may not work in our scenario.
I'm new to SEO (so please excuse the lack of terminology) and will be taking over our company's inbound marketing completely. I previously just did data analysis and managed our PPC campaigns within Google and Bing/Yahoo; now I get all three, yippee! But I digress. Before I get started here, I did read http://moz.com/community/q/new-client-wants-to-keep-duplicate-content-targeting-different-cities?sort=most_helpful and I found both the answers there to be helpful, but indirect for my scenario.
I'm conducting our company's first real SEO audit (thanks, Moz, for the guide there), and duplicate content is going to be our number one problem to tackle. Our company's website was designed back in 2009 with the file structure /city-name/product-name. The problem with this is that we are open in over 50 cities now (and headed to 100 fast), and we are starting to amass duplicate content: five products (and expanding), times the locations... you get it.
My questions: How should I deal with this? The pages are almost identical, except that each lists different information for its product depending on the location. However, for one of our products, Moz's own (Pro) tools did not find all the duplicate content, though they did find some (I'm assuming it's because the pages have different course options and course addresses; it boils down to a different address at the very bottom of the body and different course options in the right sidebar). The other four products' duplicate content was found and marked extensively.
If I choose to use canonicalization to link all the pages to one main page, I believe that would pass all the link juice to that one page, but we would no longer show in a Google search for the other cities, e.g. "washington DC example product name". Correct me if I'm wrong here. Should I worry about the product whose duplicate content was only marked four times out of fifty cities? I feel as if this question answers itself, but I still would like someone who knows more than me to shed some light on the issue.
The other four products are not going to be a problem, as they are only offered online, but they still follow the same file structure, with /online in place of /city-name. These will be canonicalized together under the /online location.
One last thing I will mention: having the city name in the URL gives us a nice advantage (I think) when people search for products in cities where we offer them (correct me again). If this is not the case, I believe I could talk our team into restructuring the files (if you think that's our best option).
Some things you need to know about our site: we use a cookie for the location. Once you land on a page that has a location tied to it, the cookie is updated and saved. If the location does not exist, then you are redirected to a page to choose a location. I'm pretty sure this can cause some SEO issues too, but once again I'm not sure.
I know this is a wall of text, but I cannot tell you enough how appreciative I am in advance for your informative answers. Thanks a million, Trenton
Moz Pro | | PM_Academy0 -
Advice for 4000+ duplicate errors on 1st check
Hi, my first use of the SEOmoz scan has thrown up a lot of duplicate errors. It looks like my site has a .com.au/ and a .com.au/default version of the same pages. We had the domain on a hosted CMS solution and have now migrated to Magento. We duplicated the pages, but had to redirect all of the old URLs to the new Magento structure. This was done via a developer adding a 301 wildcard rule to the .htaccess. Would that many errors be normal for a first scan? Where should I look for someone to fix them? Thanks
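For reference, a wildcard 301 in .htaccess typically looks something like the following. This is a hypothetical sketch (the actual patterns depend on the old CMS's URL structure), including a rule that would collapse the /default duplicates onto their canonical URLs:

```apache
# Hypothetical sketch of migration redirects in .htaccess (Apache, mod_rewrite).
RewriteEngine On
# Collapse the "/default" duplicate onto the canonical URL with a 301.
RewriteRule ^(.*)/default$ /$1/ [L,R=301]
# Wildcard-map an old CMS path onto the new Magento structure
# (placeholder paths for illustration).
RewriteRule ^old-shop/(.*)$ /catalog/$1 [L,R=301]
```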
Moz Pro | | Paul_MC0 -
Seomoz duplicate rel="next" pages
Hello, my page has this. Although with the SEOmoz crawl it says that these pages have duplicate titles: if my blog has 25 pages, I have, according to SEOmoz, 25 duplicate titles. Can someone tell me if this is correct, whether the SEOmoz crawl cannot recognize rel="next", or whether there is a better way to tell Google when the blog generates pages that share the same title? Should I ignore these SEOmoz errors? Thank you,
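Since the markup from the post didn't come through, here is a hypothetical sketch of what rel="next"/"prev" pagination tags look like in the <head> of page 2 of a paginated blog (URLs are placeholders):

```html
<!-- Hypothetical pagination hints on page 2 of a series. -->
<link rel="prev" href="http://example.com/blog/page/1/" />
<link rel="next" href="http://example.com/blog/page/3/" />
```

Note that these tags only hint that the pages form a series; they do not change the <title> element of each page, so a crawler that compares titles can still flag the series as duplicates if every page shares one title.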
Moz Pro | | maestrosonrisas0 -
Does SEOmoz realize that duplicated URLs are blocked in robots.txt?
Hi there: Just a newbie question... I found some duplicated URLs in the "SEOmoz Crawl Diagnostics reports" that should not be there; they are intended to be blocked by the site's robots.txt file. Here is an example URL (Joomla + VirtueMart structure): http://www.domain.com/component/users/?view=registration and here is the blocking content in the robots.txt file: User-agent: * Disallow: /components/ My questions: Will this kind of duplicated URL error be removed from the error list automatically in the future? Should I keep track of which errors should not really be in the list? What is the best way to handle this kind of error? Thanks and best regards, Franky
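One detail worth double-checking in the pasted snippet: robots.txt rules match by URL prefix, so a rule for /components/ (plural) does not block /component/users/... (singular). A sketch of the distinction:

```text
User-agent: *
# Blocks /components/..., but NOT /component/users/?view=registration
Disallow: /components/
# This singular rule would be needed to block the example URL above
Disallow: /component/
```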
Moz Pro | | Viada0 -
Why does my crawl diagnostics show duplicate content
My crawl diagnostics show duplicate content at mysite.com and mysite.com/index.html, which are essentially the same file.
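A common fix (a hypothetical sketch for Apache; adapt to your server) is to 301-redirect the explicit index file back to the bare directory URL, so only one address exists:

```apache
RewriteEngine On
# Redirect an explicitly requested /index.html back to the root with a 301.
# Matching THE_REQUEST (the raw request line) avoids looping when the server
# internally rewrites / to index.html.
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.html
RewriteRule ^index\.html$ / [L,R=301]
```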
Moz Pro | | MSSBConsulting0 -
What is the best method to solve duplicate page content?
The issue I am having is that an overwhelmingly large number of pages on cafecartel.com show duplicate page content. But when I check the errors on SEOmoz, it shows that the duplicate content is from www.cafecartel.com, not cafecartel.com. So first of all, does this mean that there are two sites? And is this a problem I can fix easily (i.e. by redirecting the URL and deleting the extra pages)? Is this going to make all other SEO useless, given that it shows nearly every page as having duplicate content? Or am I just completely reading the data wrong?
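To a crawler, www.cafecartel.com and cafecartel.com are two different hostnames, so each page effectively exists at two addresses. This is usually fixed with a host-level 301; a hypothetical Apache sketch (pick whichever hostname you prefer as canonical):

```apache
RewriteEngine On
# Force the www hostname with a 301 so both hosts collapse into one site.
RewriteCond %{HTTP_HOST} ^cafecartel\.com$ [NC]
RewriteRule ^(.*)$ http://www.cafecartel.com/$1 [L,R=301]
```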
Moz Pro | | MarkP_0