Duplicate Content/Missing Meta Description | Pages DO NOT EXIST!
-
Hello all,
For the last few months, Moz has been showing us that our site has roughly 2,000 duplicate content errors. Pages that were actually duplicate content, I took care of accordingly using best practice (301 redirects, canonicalization,etc.). Still remaining after these fixes were errors showing for pages that we have never created.
Our homepage is www.primepay.com. An example of pages that are being shown as duplicate content is http://primepay.com/blog/%5BLink%20to%20-%20http:/www.primepay.com/en/payrollservices/payroll/payroll/payroll/online-payroll with a referring page of http://primepay.com/blog/%5BLink%20to%20-%20http:/www.primepay.com/en/payrollservices/payroll/payroll/online-payroll. Some of these are even now showing up as 403 and 404 errors.
The only real pages on our site under that URL path are primepay.com/payroll and primepay.com/payroll/online-payroll. Therefore, I am not sure where Moz is getting these pages from.
Another issue we are having in relation to duplicate content is that Moz is showing old campaign URLs tacked onto our blog page, e.g. http://primepay.com/blog?title=&page=2&utm_source=blog&utm_medium=blogCTA&utm_campaign=IRSblogpost&qt-blog_tabs=1.
As of this morning, our duplicate content went from 2,000 to 18,000. I exported all of our crawl diagnostics data and looked to see what the referring pages were, and even they are not pages that we have created. When you click on these links, they take you to a random point in time from the homepage of our blog; some dating back to 2010.
I checked our crawl stats in both Google's and Bing's Webmaster Tools, and there are no duplicate content or 400-level errors being reported from their crawls. My team is truly at a loss with trying to resolve this issue, and any help with this matter would be greatly appreciated.
-
Thanks Dirk. Very insightful tip about not using campaign tracking to check internal links. There was an old blog post with anchor text containing campaign tracking that was causing many SEO issues. As for the latter part, we still don't know why a string of gibberish placed after /blog/ (and on our locations page) returns a valid page. Our team's web developer is looking further into this issue. If anyone has any more advice on the matter, it would be greatly appreciated.
-
Hey there
Dirk pretty much hit upon the issue, which I'll reiterate with a visual. If you enter any gibberish /blog URL (like this: http://primepay.com/blog/jglkjglkjg) in the browser, it returns a 200 OK, but it should return a 404 --> http://screencast.com/t/cStpPB5zE
Otherwise pages that are really broken will look to crawlers like they are supposed to exist.
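If you want to audit this yourself, here's a minimal sketch using only the Python standard library (the helper names `probe_status` and `is_soft_404` are made up for illustration, not part of any tool mentioned here):

```python
import urllib.request
import urllib.error

def probe_status(url: str) -> int:
    """Return the final HTTP status code for a URL (redirects are followed)."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code

def is_soft_404(status: int) -> bool:
    # A made-up URL that answers 200 is a "soft 404": crawlers will treat
    # the broken page as one that is supposed to exist.
    return status == 200
```

Probing a nonsense URL like http://primepay.com/blog/jglkjglkjg should come back 404 once the fix is in place; any 200 means the soft-404 problem is still there.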
-
You shouldn't use campaign tracking to check internal links - you have to use event tracking. Check http://cutroni.com/blog/2010/03/30/tracking-internal-campaigns-with-google-analytics/ . Apart from the reporting issue, it's also generating a huge number of URLs that need to be crawled by Googlebot, which is just wasting its time (most of these tagged URLs have a correct canonical version). You mention these tags are old - but they are still present on a lot of pages.
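To see why every tagged URL counts as a duplicate, here's a small sketch (a hypothetical helper, not Moz or Google code) that strips the utm_* parameters - every tagged variant collapses back to the same canonical URL:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def canonical_url(url: str) -> str:
    """Drop utm_* campaign parameters so all tagged variants map to one URL."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if not k.startswith("utm_")]
    return urlunsplit(parts._replace(query=urlencode(kept)))

tagged = ("http://primepay.com/blog?title=&page=2&utm_source=blog"
          "&utm_medium=blogCTA&utm_campaign=IRSblogpost&qt-blog_tabs=1")
# Collapses to http://primepay.com/blog?title=&page=2&qt-blog_tabs=1
print(canonical_url(tagged))
```

This is exactly what a rel=canonical tag tells the crawler; the point is that without it (or with the tags still in internal links), each utm variant is crawled as a separate page.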
For cases like this it's better to check with a local tool like Screaming Frog, which gives you a much better view of which pages are generating these links. The other issue you have is probably related to a few pages that have a badly formatted (relative) URL in a link. The way your site is configured, it just renders a page on your site anyway, so the bots crawl your site over and over again, each time encountering the same bad relative link and each time appending the bad path to the URL. It's an endless loop - the best way to avoid this is to use absolute internal links rather than relative links. Not sure if it's the only one, but one of the pages with this error is http://primepay.com/blog/7-ways-find-right-payroll-service-your-company - it contains a link to
[Your payroll service is no different.]([Link to - http://www.primepay.com/en/payrollservices/] "Your payroll service is no different.")
This page should generate a 404 but is generating a 200, and the loop starts here.
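The compounding effect described above is easy to reproduce with the standard library's relative-URL resolution (a sketch; the `bad_href` string mimics the malformed anchor, it is not the exact source markup):

```python
from urllib.parse import urljoin

# Malformed href, mimicking the broken "[Link to - ...]" anchor text
bad_href = "[Link to - http:/www.primepay.com/en/payrollservices/"
page = "http://primepay.com/blog/7-ways-find-right-payroll-service-your-company"

# The crawler resolves the malformed href relative to the page it found it on,
# because the string has no valid scheme and so is treated as a relative path.
hop1 = urljoin(page, bad_href)
# That URL also returns 200 and serves the same bad href, so the next
# resolution nests the garbage one level deeper - and so on, forever.
hop2 = urljoin(hop1, bad_href)
```

Each hop produces a "new" crawlable URL, which is why the duplicate count can jump from 2,000 to 18,000 between crawls.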
Again - with Screaming Frog you can generate a crawl path report for each of these bad URLs, which shows you exactly on which page the error is generated.
Hope this helps,
Dirk
-
Example:
http://primepay.com/blog/hgehergreg
Status:
My site as an example:
https://caseo.ca/blog/hgehergreg
If I put random gibberish into this URL, it should display a 404 page, not the blog page.
-
Getting you some help for direct advice on your problem, but I wanted to leave a comment about the tool itself. The Moz crawl tool only updates once a week, so if the last crawl ran before you did the work, your fixes won't be reflected until the next one. Here's more info.