SEOmoz Cannot Crawl My Site
-
Hello there,
SEOmoz cannot crawl my site. It's been three days now and not a single page has been crawled.
I deleted the campaign and tried again; still no crawl, not a single page.
Any solutions?
-
Mozbot usually crawls once every 7 days and takes a day or two to complete the crawl.
Give it at least 7 days.
At the bottom right under "Crawl Diagnostics" for your campaign, it should say when the next crawl is scheduled.
Greg
Related Questions
-
Can you keep your old HTTP XML sitemap when moving to HTTPS site-wide?
Hi Mozers, I want to keep the HTTP XML sitemap live on my HTTP site to keep track of indexation during the HTTPS migration. I'm not sure if this is doable, since once our tech team forces the redirects, every HTTP page will become HTTPS. Any ideas? Thanks
Technical SEO | | znotes0 -
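One way to monitor the question above during a migration is to fetch the old sitemap URL with redirects disabled and classify what came back. The helper below is a hypothetical sketch (not a Moz feature); the status code and Location header are whatever your HTTP client reports:

```python
from typing import Optional

def sitemap_status(http_status: int, location: Optional[str]) -> str:
    """Classify what happened to the old HTTP sitemap URL, given the
    status code and Location header from a fetch that does NOT follow
    redirects. Hypothetical helper for illustration only."""
    if http_status == 200:
        # The sitemap is still served directly over HTTP.
        return "still live over HTTP"
    if http_status in (301, 302, 307, 308):
        if location and location.startswith("https://"):
            # The site-wide redirect caught the sitemap URL too.
            return "swallowed by the site-wide HTTPS redirect"
        return "redirected elsewhere"
    return f"unreachable (HTTP {http_status})"
```

If the sitemap URL reports as "swallowed", the tech team would need to add an exception for it to the site-wide redirect rule for the tracking idea to work.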
On our site, by mistake, some wrong links were entered and Google crawled them. We have fixed those links, but they still show up as Not Found errors. Should we just mark them as fixed, or what is the best way to deal with them?
Some parameter was not sent, so the link was read as null/city and null/country instead of cityname/city.
Technical SEO | | Lybrate06060 -
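Beyond marking the errors as fixed, when the malformed URLs follow a recognizable pattern like the null/ prefix described above, one option is to answer them with 410 Gone so crawlers drop them faster than a plain 404. A minimal sketch of that routing decision, using an illustrative path pattern:

```python
def response_for(path: str) -> int:
    """Return the HTTP status code to serve for a requested path.
    Legacy malformed URLs like /null/city get 410 (Gone) so search
    engines stop retrying them; everything else is served normally.
    The /null/ prefix matches the broken-parameter pattern described
    above and is illustrative only."""
    if path.startswith("/null/"):
        return 410  # permanently gone: stronger signal than 404
    return 200
```

In practice this check would live in the web server or application router; the pattern match is the important part.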
Can an AJAX framework (using HTML5 + pushState) on your site impact your ranking?
Hello everybody, I am currently investigating a website rendered by an AJAX framework (AngularJS) using the HTML5 History API and pushState.
Recently Google announced that it is able to execute JavaScript and can therefore see the content and links needed to discover all pages in the structure. However, it seems that it doesn't run the JavaScript at ALL times (based on some internal testing). So technically it is possible that it arrives on a page without seeing any content or links, while another time it can arrive, run the JavaScript, and read/discover the content and links generated by AJAX.
Could the fact that Google can't always interpret or read the website correctly have a negative SEO impact (not on the indexation process, but on ranking)? We are aware that it is better to create a snapshot of the page, but in its announcement Google stated that the method currently used should be sufficient. Does anybody have any experience with this, AND what is the impact on the ranking process? Thanks!
Technical SEO | | Netsociety0 -
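A quick way to approximate what a crawl without JavaScript execution sees is to count the visible text in the raw HTML the server sends, before any client-side rendering. The sketch below uses only the standard library; the 200-character threshold is an arbitrary assumption for illustration, not a Google figure:

```python
from html.parser import HTMLParser

class TextCounter(HTMLParser):
    """Counts visible text characters in raw (pre-JavaScript) HTML,
    ignoring anything inside <script> and <style>."""
    def __init__(self):
        super().__init__()
        self.chars = 0
        self._skip = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.chars += len(data.strip())

def looks_like_empty_shell(raw_html: str, threshold: int = 200) -> bool:
    """If the HTML served to a non-JS client has almost no text,
    a crawler that skips JavaScript sees an empty page."""
    parser = TextCounter()
    parser.feed(raw_html)
    return parser.chars < threshold
```

Feeding this the server response for a pushState page (fetched with any plain HTTP client) shows whether the content depends entirely on the AJAX framework running.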
Can I redirect a link even if the link is still on the site?
Hi Folks, I've got a client who has duplicate content because they actually create duplicate content and store the same piece of content in 2 different places. When they generate this duplicate content, it creates a 2nd link on the site going to the duplicate content. Now they want the 2nd link to always redirect to the first link, but for architecture reasons, they can't remove the 2nd link from the site navigation. We can't use rel=canonical because they don't want visitors going to that 2nd page. Here is my question: are there any adverse SEO implications to maintaining a link on a site that always redirects to a different page? I've already gone down the road of "don't deliberately create duplicate content" with the client. They've heard me, but won't change. So, what are your thoughts? Thanks!
Technical SEO | | Rock330 -
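An internal link that always 301s mainly costs visitors an extra hop per click; crawlers follow it and consolidate signals on the target. The mapping itself can be as simple as the sketch below (the paths are hypothetical, not from the question):

```python
# Hypothetical map from each duplicate URL to its primary copy.
REDIRECTS = {
    "/archive/widget-guide": "/guides/widget-guide",
}

def resolve(path: str):
    """Return (status_code, path_to_serve). Duplicate paths get a
    permanent 301 to the primary URL so visitors and link equity
    always land on one copy; everything else is served in place."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 200, path
```

Keeping the map in one place also makes it easy to audit that no redirect chains (A to B to C) creep in over time.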
What steps can you take to help a site that does not change?
Hi, I am working on a product and services website, www.clairehegarty.co.uk, but the problem I have is that the site does not really change. The home page stays the same, and the only time it changes is when a new course is advertised. The most important page on the website is http://www.clairehegarty.co.uk/virtual-gastric-band-with-hypnotherapy, but we have seen the site drop in rankings because the page is not being updated. This page has all the information you could want on weight loss, but we have seen it drop from number one in Google to number four. I would like to know what steps we should take to increase our rankings in Google and would be grateful for your suggestions. If I put articles on the site and had a section where we add a new article every week, would that get Google to visit the whole site more and move our pages back up the rankings, or should we be looking at doing other things?
Technical SEO | | ClaireH-1848860 -
Should canonical be used if your site does not have any duplicate content?
Should canonical be used site-wide even if my site is solid and no duplicate content is generated? Please explain your answer.
Technical SEO | | ciznerguy0 -
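A self-referencing canonical on unique pages is generally harmless and guards against accidental duplicates (tracking parameters, trailing-slash variants). A stdlib sketch for checking that a page's canonical points at itself, with made-up example URLs:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Extracts the href of <link rel="canonical"> from an HTML page."""
    def __init__(self):
        super().__init__()
        self.href = None

    def handle_starttag(self, tag, attrs):
        attr_map = dict(attrs)
        if tag == "link" and attr_map.get("rel") == "canonical":
            self.href = attr_map.get("href")

def canonical_is_self(page_html: str, page_url: str) -> bool:
    """True when the page declares itself as the canonical URL."""
    finder = CanonicalFinder()
    finder.feed(page_html)
    return finder.href == page_url
```

Running this across a crawl of your own site is a cheap way to confirm a site-wide canonical rollout didn't accidentally point pages at the wrong URL.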
Why does my site have a PageRank of 0?
My site (www.onemedical.com) has a PageRank of 0, and I can't figure out why. We did a major site update about a year ago and moved the site from .md to .com about 9 months ago. We are crawled by Google and rank on the first page for many of our top keywords. We have a MozRank of 4.59. I figured this was something that would just take time to work itself out of the system, but nothing seems to change while we patiently wait. One more thing to note: when a user comes to the homepage (a city selector) and selects their region, they will be cookied and directed to their relevant city site on subsequent visits. But even our city-specific pages (i.e., www.onemedical.com/sf) have PageRanks of 0. My management team keeps asking me about this, and I suspect there is something silly that we keep overlooking... but for the life of me, I can't figure it out. Any help would be appreciated.
Technical SEO | | OneMedical0 -
Site Architecture Trade Off
Hi All, I'm looking for some feedback regarding a site architecture issue I'm having with a client. They are about to enter a re-design, and as such we're restructuring the site URLs and amending/adding pages. At the moment they have ranked well off the back of original PPC landing pages that were added to the site, such as www.company.com/service1, www.company.com/service2, etc. The developer, from a development point of view, wishes to create a logical site architecture with multiple levels of directories. I've suggested this probably isn't the best way to go, especially as the site isn't that large (200-300 pages), that the key pages we're looking to rank should be as high up the architecture as we can make them, and that this amendment could hurt their current high rankings. It looks like the trade-off may be that the client is willing to let some pages be restructured, so, for example, www.company.com/category/sub-category/service would become www.company.com/service. However, although on a per-page basis this might be a solution, is there a drawback to having this in place for only a few pages rather than site-wide? I'm just wondering if these pages might stick out like a sore thumb to Google.
Technical SEO | | PerchDigital1