Duplicate page errors from pages that don't even exist
-
Hi,
I am having an issue with SEOmoz's Crawl Diagnostics report: it is reporting a lot of crawl errors for pages that don't even exist.
My website has around 40-50 pages, but the report shows that 375 pages have been crawled.
My guess is that the errors have something to do with my recent .htaccess change: I configured it to add a trailing slash to the end of my URLs. There is no internal linking issue such as an infinite loop when navigating the website, yet looping is reported in SEOmoz's report.
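For reference, trailing-slash rules like this usually follow the pattern sketched below (a simplified sketch, not necessarily my exact rule):

    # Simplified sketch of a typical trailing-slash rule in .htaccess.
    # Skip real files so assets like .css and .js are not redirected,
    # then 301 any URL that does not already end in a slash.
    RewriteEngine On
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteRule ^(.*[^/])$ /$1/ [L,R=301]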
Here is an example of a reported link:
By the way, there are no crawl errors showing in my Google Webmaster Tools.
Any help is appreciated.
-
Thanks Mohammad. I am glad it is all resolved. Yes, broken or malformed links can often be like gremlins and cause all kinds of odd things to show up in crawl reports.
Ciao!
Dana
-
@Dana Tan,
Thanks. It is all sorted now; a broken link caused the problem. Thanks anyway.
Mohammad
-
Hi Mohammad,
Are you using absolute URLs for any canonical tags you have in place? Also, are you using absolute URLs for all of your internal links? You probably are, but I thought I'd better ask before diving in deeper. Hope I can help a little.
Dana
-
Related Questions
-
Will a Robots.txt 'disallow' of a directory keep Google from seeing 301 redirects for pages/files within the directory?
Hi - I have a client who had thousands of dynamic php pages indexed by Google that shouldn't have been. He has since blocked these php pages via a robots.txt disallow. Unfortunately, many of those php pages were linked to by high-quality sites multiple times (instead of the static URLs) before he put up the php disallow. If we create 301 redirects for some of these php URLs that are still showing high-value backlinks and send them to the correct static URLs, will Google even see these 301 redirects and pass link value to the proper static URLs? Or will the robots.txt keep Google away, so we lose all these high-quality backlinks? I guess the same question applies if we use the canonical tag instead of the 301: will the robots.txt keep Google from seeing the canonical tags on the php pages? Thanks very much, V
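For illustration, the redirects being described might look something like this in .htaccess (hypothetical URLs; the key point is that a URL blocked by robots.txt is never fetched, so any 301 or canonical tag on it is never seen):

    # Hypothetical paths for illustration only.
    # Plain path-to-path 301 with mod_alias:
    Redirect 301 /old-dynamic-page.php /new-static-page/
    # If the php URLs differ only by query string, mod_rewrite is needed;
    # the trailing '?' drops the old query string from the target:
    RewriteEngine On
    RewriteCond %{QUERY_STRING} ^id=123$
    RewriteRule ^catalog\.php$ /new-static-page/? [R=301,L]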
Technical SEO | Voodak
-
How to handle pages I can't delete?
Hello Mozzers, I am using WordPress and I have a small problem. I have two pages I don't want, but the dev of the theme told me I can't delete them: /portfolio-items/ /faq-items/ The dev said he can't find a way to delete them because these pages just list FAQ/portfolio posts. I don't have any of these posts, so basically what I have are two pages with just the titles "Portfolio Items" and "FAQ Items". Furthermore, the dev said these pages are auto-generated, so he can't find a way to remove them. I don't believe it's impossible, but if it is, how should I handle them? They are indexed by search engines; should I remove them from the index and block them in robots.txt? Thanks in advance.
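If deleting them really is off the table, one common workaround is to serve a noindex header for those two paths instead of blocking them in robots.txt (a robots.txt block would stop crawlers from ever seeing the noindex). A sketch, assuming Apache with mod_setenvif and mod_headers enabled:

    # Sketch: mark the two auto-generated archive pages as noindex at the
    # server level.  Requires mod_setenvif and mod_headers.
    SetEnvIf Request_URI "^/(portfolio-items|faq-items)/" NOINDEX_PAGE
    Header set X-Robots-Tag "noindex, follow" env=NOINDEX_PAGE

An SEO plugin that can set archive pages to noindex would accomplish the same thing from within WordPress.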
Technical SEO | grobro
-
Issue: Duplicate Page Content > Wordpress Comments Page
Hello Moz Community, I've created a campaign in Moz and received hundreds of errors regarding "Duplicate Page Content". After some review, I've found that 99% of the errors in the "Duplicate Page Content" report occur because WordPress creates a new comment page (with the original post content) when a comment is made on a blog post. The comment is displayed on the original blog post but is also viewable at a second URL created by WordPress: http://www.Example.com/example-post http://www.Example.com/example-post/comment-page-1 Anyone else experience this issue in WordPress, or see this same type of report in Moz? Thanks for your help!
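A pattern often used for this, assuming paginated comments aren't actually needed, is to untick "Break comments into pages" under Settings > Discussion and then 301 the existing comment-page URLs back onto the post; a sketch in .htaccess:

    # Sketch: collapse /example-post/comment-page-N/ back onto /example-post/
    # so only one URL per post remains.  Only appropriate if paginated
    # comments are not needed.
    RewriteEngine On
    RewriteRule ^(.+)/comment-page-[0-9]+/?$ /$1/ [R=301,L]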
Technical SEO | DomainUltra
-
Pages Linking to Sites that Return 404 Error
We have just a few 404 errors on our site. Is there any way to figure out which pages are linking to the pages that create 404 errors? I would rather fix the links than create new 301 redirects. Thanks!
Technical SEO | jsillay
-
Why are pages linked with URL parameters showing up as separate pages with duplicate content?
Only one page exists, yet I link to it with different URL parameters for tracking purposes, and for some reason each parameter variation is showing up as a separate page with duplicate content. Help?
Technical SEO | BlueLinkERP
-
Is there an easy solution for duplicate page content on a drupal CMS?
I have a Drupal 7 site, www.australiacounselling.com.au, that has over 5,000 crawl errors (!). The main problem, accounting for close to 3,000 of the errors, is duplicate page content. When I create a page I can create an SEO-friendly URL alias for it, but every time I do this, the report registers two pages with the same content. Is there a module I can install that would let me specify the canonical page? My developers seem stumped and have given up trying to find a solution, but I'm not convinced it should be that hard. Any ideas from those familiar with Drupal 7 would be greatly appreciated!
Technical SEO | ClintonP
-
How do I eliminate duplicate page titles?
Almost... I repeat, almost all of my duplicate page titles show up as such because the page is being seen twice in the crawl. How do I prevent this?
| www.ensoplastics.com/ContactUs/ContactUs.html | Contact ENSO Plastics |
| ensoplastics.com/ContactUs/ContactUs.html | Contact ENSO Plastics |
This is what is in the CSV; there are many more just like this. How do I cut out all of these duplicate URLs?
Technical SEO | ENSO
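The two rows above are the www and non-www versions of the same page being crawled as separate URLs. A common fix, sketched below on the assumption that the www host is the preferred one, is a site-wide 301 in .htaccess:

    # Sketch: force the bare domain onto the www host so every page has a
    # single URL.  Assumes www.ensoplastics.com is the preferred version.
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^ensoplastics\.com$ [NC]
    RewriteRule ^(.*)$ http://www.ensoplastics.com/$1 [R=301,L]
-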
Deep Page Link - URL no longer exists
I used Open Site Explorer and found a link to our site on http://www.business.com/guides/bedding-supplies-3639/ The link was set up to go to an important, deep page on my website, but the structure of our URLs changed and that URL no longer exists. The link (anchor text 'National Hospitality Supply') does direct to our homepage, www.nathosp.com. My question is, am I receiving full link juice? Or would I be better served to create a 301 redirect to the revised/new page URL? In case it matters, if I had my choice I'd prefer the link to go to the intended deep page. Thanks in advance for your insight. -Josh Fulfer
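For illustration, a 301 from the old deep URL to its current equivalent would look something like this (hypothetical paths):

    # Sketch with hypothetical paths: point the old URL that business.com
    # still links to at the page's new location instead of the homepage.
    Redirect 301 /old-bedding-supplies-page /products/bedding-supplies/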
Technical SEO | mhans