Why so many Crawl Diagnostics errors? Are they false?
-
Hi,
my first question as I try to understand how SEOmoz Pro works.
I have about 680 crawl errors, but when I check the details, I find this:
1. Many 403 errors (I think for all links).
> I have tested my website (telcelsolcom.com) in other tools, and all of them report a 200 OK response.
2. Many duplicate titles and duplicate content, but the system is reporting pages with and without the www as duplicates of each other.
> I have a 301 redirect from non-www to www, and it is working fine; a quick way to verify it is sketched below.
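One way to double-check the redirect is to request the bare (non-www) host without following redirects and inspect the raw response. Here is a minimal sketch in Python, using only the standard library; the hostname comes from the post above, so adjust it for your own site:

```python
import http.client

# http.client never follows redirects, so the raw status code and
# Location header are visible; HEAD avoids downloading the body.
conn = http.client.HTTPConnection("telcelsolcom.com")
conn.request("HEAD", "/")
resp = conn.getresponse()

print(resp.status)                  # 301 expected if the redirect is in place
print(resp.getheader("Location"))   # the www version of the URL expected
conn.close()
```

If this prints 301 with the www URL in the Location header, the redirect really is working, and the duplicate reports may come from a crawl that ran before the redirect was in place.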
Do I have false errors? What am I doing wrong?
Thanks.
-
Thanks for your time, Brent.
As far as I know, I don't have any configuration that could block spiders. I have other sites on the same server without any issues; there is nothing special in .htaccess, and only some directories are blocked in robots.txt. That's it (a quick robots.txt check is sketched below).
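Since the blocked directories live in robots.txt, one quick sanity check is to parse the live file and ask whether a crawler is allowed to fetch the flagged URLs. A minimal sketch with Python's standard library; "rogerbot" is the name of SEOmoz's crawler, and the sample path is a placeholder for pages the crawl report flagged:

```python
from urllib.robotparser import RobotFileParser

# Fetch and parse the live robots.txt, then ask whether a given
# user-agent may fetch specific URLs on the site.
rp = RobotFileParser()
rp.set_url("http://www.telcelsolcom.com/robots.txt")
rp.read()

# "/flagged-page/" is a placeholder; substitute URLs from the crawl report.
for path in ["/", "/flagged-page/"]:
    url = "http://www.telcelsolcom.com" + path
    print(path, "->", "allowed" if rp.can_fetch("rogerbot", url) else "blocked")
```

If pages that should be crawlable come back "blocked", an overly broad Disallow rule would explain missing pages, though it would not by itself explain 403 responses.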
In fact, Google is indexing the website very well (about 50% of the URLs submitted two days ago).
I am sorry if the next question sounds like blasphemy here in the community:
do these tools really work?
I don't want to spend time solving false errors. Also, my free month is about to expire, and I would like to know whether it is worth paying for the next month.
-
I'm not really sure, but as I was trying to diagnose it, the site broke on me, and only after I pointed my crawl tool at it.
So I am wondering whether your server detects incoming spiders and then blocks them. That would explain why both SEOmoz and my crawl tool came back with odd errors; a quick way to test this is sketched below.
At first glance, I didn't see any issues. I'm not really sure.
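One quick way to test that theory is to fetch the same page twice, once with a bot-style User-Agent and once with a browser-style one, and compare the status codes. A minimal sketch in Python; the User-Agent strings below are illustrative, not the exact strings real crawlers send:

```python
import urllib.request
import urllib.error

URL = "http://www.telcelsolcom.com/"

# Illustrative User-Agent strings. If the bot-style request gets a 403
# while the browser-style one gets a 200, the server (or a firewall or
# security module in front of it) is filtering by User-Agent.
USER_AGENTS = {
    "bot-style": "rogerbot/1.0",
    "browser-style": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
}

for label, ua in USER_AGENTS.items():
    req = urllib.request.Request(URL, headers={"User-Agent": ua})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            print(label, "->", resp.status)
    except urllib.error.HTTPError as err:
        print(label, "->", err.code)
```

A mismatch between the two responses would point at User-Agent filtering (some security modules and hosting firewalls do this out of the box) rather than anything wrong in the SEOmoz report itself.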
Related Questions
-
Help with a 404 error
Is it OK to use a hash (#) in a URL? Right now it is showing a 404 error when I check broken links using the Check My Links Chrome extension, e.g. http://instafrsh.com/index.php#aboutUs. If not, please suggest how to resolve this issue.
On-Page Optimization | Obbserv
-
Internal and Link Juice Analysis - Too Many Links Error
Howdy! I have an analysis question related to internal links and link juice. Here is the general link setup of our site:
1. All site pages (including the home page): We have drop-down "mega" menus in the header of every page linking to various sub-categories on the site. Because of this, our header contains a few hundred links to various pages, and these show up on every page of the site.
2. Product pages: The header links mentioned above, but on top of that, we list out the keywords for that particular product, and each keyword is linked back to our search results page for that keyword.
In general, Moz is telling us we have between 200 and 300 links on each product page. Currently, our search results pages are ranking higher and showing up in search more often than our actual product pages. So, based on the above, here are some thoughts:
1. Should we load the header links via Ajax so that they aren't visible to the search engines? Or should we do that on all pages except the home page?
2. Should we get rid of the keyword links back to the search results pages on the product pages?
What effect would these changes actually have? Would they just improve crawling, or are there other positive results that would come of changes like these? We have hundreds of thousands of products, so if we were to make changes like these, could we experience negative results? Thanks for your help! Craig
On-Page Optimization | TheCraig
-
How many words per page should we have?
Hi, how many words per page should we have? And how many keywords should be in there for optimal ranking? Thanks, Andrew
On-Page Optimization | Studio33
-
Too many on-page links created by filters
I have an ecommerce site, and SEOmoz's "Crawl Diagnostics Summary" points out that I have too many hyperlinks on most of my pages. The most recent thing I've done that could be the culprit is the creation of a number of product filters. Each filter I put on the page creates a hyperlink off that page. As an example, there is a filter available for manufacturers; under it, there are 8 new filter links, and thus 8 new hyperlinks. On one category page there are 60 new links created because of filters. I feel like these filters have made the user experience on the site better BUT they have dramatically increased the number of outbound links off the page. I know keeping it to under 100 is a rule of thumb, but at the same time there must be some validity to trying to limit them. Do you have any recommendation on how I can "have my cake and eat it too"? Thanks for any help!
On-Page Optimization | jake372
-
Too many on-page links in sitemap.html
My crawl report is flagging an issue with too many links on one of my pages: my sitemap.html. However, I have coded the page so that if the XML format is requested it generates the .xml version of the page, and if not, the HTML version is displayed. What is the best way to stop the crawl from finding the HTML version while keeping it on the site for clients' navigation?
On-Page Optimization | SamPenno
-
Too many on-page links warning + JavaScript menu
We have JavaScript menus on each page. These are used by visitors to contact a specific office in a specific city. Could this be where all these links are being counted? I don't see them elsewhere. What about the links in the footer? They link to the same pages as the menus, but are just straight links.
On-Page Optimization | Stevej24
-
How long after a URL starts showing a 404 does Google stop crawling?
Before hiring me to do SEO, a client re-launched their site and did not 301 the old URLs to the new ones. Only the home page URL stayed the same. For a month after the re-launch, the old URLs returned a 404. For the next month, all 404 pages (basically any non-existent URL) were 301'd to the home page. Finally, two months after launching, they properly 301'd the old URLs to the new ones. Now, the new URLs are not ranking well. I assume it's too late to realize any benefit from the 301s; I'm just checking to see if anybody has any insight into how long Google keeps trying to crawl old, 404ing, or improperly 301'd URLs. Thanks!
On-Page Optimization | AndrewMiller
-
What is the effect of too many internal links on a page?
Hi there! We have been working hard over the last year, but our main competitor is still above us in the search rankings. The main difference now comes down to the number of internal links, especially on our homepage. We have more than 200 and they only have around 100, so I think we are wasting too much link power on irrelevant pages. What could be the effect of this?
On-Page Optimization | bodaclick