SeoMoz crawler giving false positives?
-
The SeoMoz crawler indicated a few times that my site has a duplicate home page error (http://mysite.com and www.mysite.com).
I eliminated the couple of remaining internal links on a couple of pages that pointed to http://mysite.com (all other internal links point to http://www.mysite.com).
I ran the crawl again and it said no errors this time. I naturally thought the duplicate page error problem was fixed.
However, this morning I got the regularly scheduled crawl report from SeoMoz, and it again said I have those duplicate page errors. No changes were made to any of my site's pages between the crawls.
That makes me wonder whether the crawler sometimes produces false positives, or whether the crawl from a couple of days ago was wrong when it reported no errors (no duplicate page error).
Now, I don't know what to think.
-
Hey,
Our crawler actually requests the page http://mysite.com first, but then finds all of your links to www.mysite.com.
You will want to contact the person responsible for hosting or developing your site in order to make these changes.
Have a great day!
Kenny
-
Thanks for the explanation. Could you answer a couple questions?
1 - If all internal site links go to www.mysite.com (none link to http://mysite.com), how does a duplicate page even happen? I don't understand how this occurred in the first place without any such internal link.
2 - Can you recommend a service that can edit the .htaccess file for me to create the 301 redirect? I'm not sure I want the hosting service doing it and making a mistake.
Thanks!
-
Hey,
That third campaign is actually a subdomain campaign set up to crawl the non-www version. No duplicate content errors were presented because there are no links to follow, since all the links contain the www subdomain.
Root domain campaigns are distinguished by an asterisk before the domain name.
-
Thanks - I initially thought that was it.
But if you look at my 3rd campaign of the crawl, it runs on the root domain and shows no duplicates.
-
Hey,
I just looked into the issue that you are experiencing with our crawler. The reason for the discrepancy is that you actually have two separate campaigns running for the same site: one is set to crawl the root domain and one the subdomain.
The root domain campaign actually still presents these errors, and has week over week. The subdomain campaign is set up for the www version of your site, and that's why these errors are not present there: the crawler won't even attempt to crawl off of www.
It is advisable to perform a 301 redirect as the other commenters mention.
Hope that helps!
Kenny
-
My point is the inconsistency in the SeoMoz crawler reports.
I got two SeoMoz crawl reports today - one was the regularly scheduled one which said I have duplicate home pages (as noted) and the crawl I started a couple hours ago said there are no errors.
So... how do you tell which one is right? Both cannot be right, since there were no changes to my website's pages between the crawls.
thx
-
Hi,
If needed, this is the .htaccess code to help fix the issue (make sure to back up .htaccess before making any changes):
Options +FollowSymLinks
RewriteEngine on
RewriteCond %{HTTP_HOST} ^yourdomainhere\.com$ [NC]
RewriteRule ^(.*)$ http://www.yourdomainhere.com/$1 [L,R=301]
The above code redirects all traffic from the non-www to the www version of your site, fixing duplicate content issues in that regard.
Source: http://www.webconfs.com/how-to-redirect-a-webpage.php
Hope this helps
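For anyone who finds rewrite rules hard to read, here is a small Python sketch (the function name and the `yourdomainhere.com` placeholder are illustrative, not anything Moz ships) of exactly what those two lines do: any URL on the bare host is mapped to the same path and query on the www host, and www URLs pass through untouched.

```python
# Sketch of the canonicalization the rewrite rule performs.
# [NC] in the rule means the host match is case-insensitive,
# so the hostname is lowercased before comparing.
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url, bare_host="yourdomainhere.com"):
    parts = urlsplit(url)
    if parts.netloc.lower() == bare_host:
        # Redirect target: same scheme/path/query on the www host.
        parts = parts._replace(netloc="www." + bare_host)
    return urlunsplit(parts)

print(canonical_url("http://yourdomainhere.com/page?x=1"))
# http://www.yourdomainhere.com/page?x=1
print(canonical_url("http://www.yourdomainhere.com/page"))
# unchanged: http://www.yourdomainhere.com/page
```

The path and query string survive the rewrite, which matches the `$1` back-reference in the rule.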
-
You need to redirect one of your home pages to the other. To the crawl robot, www.mysite.com is a different page than mysite.com. In addition to the issue with SeoMoz, you are losing SERP value for your home page because you are dividing up its SEO value. Do a 301 redirect from one to the other and voila... problem solved.
Please make sure you give me the thumbs up for the help!! Thanks
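As a quick sanity check once the redirect is live (a hypothetical helper, not part of any Moz tooling), a few lines of Python can confirm the bare domain now answers with a 301 whose Location header points at the www host:

```python
# Verify the non-www host 301-redirects to the www host.
import http.client

def redirect_ok(status, location, bare_host="yourdomainhere.com"):
    # The fix is in place if the bare host answers 301 (not 302)
    # and the Location header points at the www version.
    return (status == 301
            and bool(location)
            and location.startswith("http://www." + bare_host))

def check_site(bare_host="yourdomainhere.com"):
    # Network call -- run this against your own domain.
    conn = http.client.HTTPConnection(bare_host, timeout=10)
    conn.request("HEAD", "/")
    resp = conn.getresponse()
    return redirect_ok(resp.status, resp.getheader("Location"), bare_host)
```

A 302 deliberately fails the check here, since only a permanent (301) redirect consolidates the SEO value as described above.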