Page crawling is only seeing a portion of my pages. Any advice?
-
The last couple of page crawls have returned only 14 out of 35 pages. Does anyone have suggestions for what I can try?
-
Ahh, I think RankSurge was referring to Google crawling, and you were referring to SEOmoz crawling? For SEOmoz, send an email to help@seomoz.org and let them know which account/campaign you're talking about, and they'll make sure Roger crawls the full site.
Keri
-
Well, I left it two weeks to test. It jumped from 9 to 26 pages, but after today's crawl I am now down to 2 pages. I really have absolutely no idea what's going on here.
-
Hi RankSurge,
Thanks for the quick response. I jumped back into Webmaster Tools and completed a new fetch.
Then I went into Site Configuration > Sitemaps and resubmitted the sitemap. That picked up every page and has been resubmitted.
Did I miss something?
Cheers,
Andrew -
Do you have your sitemap submitted and verified in your Google Webmaster Tools account? That would be the first step to check.
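As a rough sanity check alongside the Webmaster Tools numbers, you can parse the sitemap yourself and count the URLs it actually declares, then compare that figure against the pages the crawler reports. A minimal stdlib sketch, using a hypothetical three-page sitemap for illustration:

```python
import xml.etree.ElementTree as ET

# Standard sitemap namespace per the sitemaps.org protocol
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Return the list of <loc> URLs declared in a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.findall("sm:url/sm:loc", NS)]

# Hypothetical sitemap content; in practice, fetch your real sitemap.xml
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/</loc></url>
  <url><loc>http://www.example.com/about</loc></url>
  <url><loc>http://www.example.com/contact</loc></url>
</urlset>"""

urls = sitemap_urls(sample)
print(len(urls))  # the number of pages the sitemap asks crawlers to fetch
```

If this count already disagrees with what you expect (e.g. 35), the gap is in the sitemap itself rather than the crawler.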
Related Questions
-
Why is Google dropping pages from the SERPs?
Our website for my Olansi company in London, China has hundreds of pages dedicated to every service we provide to local areas in China; the total number of pages is approximately 100. Google caters pretty well to long-tail searches when it indexes all these pages, so we usually get a fair amount of traffic when that happens. However, Google occasionally drops most of our indexed pages from search results for a few days or weeks at a time - for example, Google is currently indexing 60 pages, while last week it was back at 100. Can you tell me why this happens? When these pages don't display, we lose a lot of organic traffic. What are we doing wrong? Site URL: https://www.olanside.com
Technical SEO | | sesahoda0 -
Page Speed or Size?
Hi everyone. I have a client who really wants to add a 1-minute HTML5 video to the background of their homepage. I have managed to reduce the size of the video to 20 MB, and I have tested the page in Pingdom. The results: 1.85 s to load, weighing in at 21.2 MB. My question is: does Google factor page load speed or page size into its rankings? I am also mindful of the negative effect this could have on bounce rate. Thanks.
Technical SEO | | WillWatrous0 -
3,511 Pages Indexed and 3,331 Pages Blocked by Robots
Morning. I checked our site's index status in WMT, and I'm being told that Google is indexing 3,511 pages and the robots are blocking 3,331. This seems slightly odd, as we're only disallowing 24 pages in the robots.txt file. In light of this, I have the following queries:
1. Do these figures mean that Google is indexing 3,511 pages and blocking 3,331 other pages? Or does it mean that it's blocking 3,331 pages of the 3,511 indexed?
2. As only 24 URLs are disallowed in robots.txt, why are 3,331 pages being blocked? Will these be variations of the URLs we've submitted?
3. Currently, we don't have a sitemap. I know, I know, it's pretty unforgivable, but the old one didn't really work and the developers are working on the new one. Once submitted, will this help?
4. I think I know the answer to this, but is there any way to ascertain which pages are being blocked?
Thanks in advance! Lewis
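One likely explanation for "24 rules, thousands blocked" is that a Disallow rule covers a directory, and every URL underneath it counts as blocked. You can test specific URLs against your rules with Python's stdlib robots.txt parser; this is a sketch with hypothetical rules, not the site's real file:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: one Disallow line, but it blocks every URL
# under /private/ - which is how a handful of rules can block thousands
# of pages.
rules = """
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("Googlebot", "http://www.example.com/private/page-1017"))  # False
print(parser.can_fetch("Googlebot", "http://www.example.com/public/page"))        # True
```

Running your suspect URL patterns through `can_fetch` is a quick way to ascertain which pages your 24 rules actually block.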
Technical SEO | | PeaSoupDigital0 -
Index page
To the SEO experts, this may well seem a silly question, so I apologize in advance; I try not to ask questions I probably already know the answer to, but clarity is my goal.
I have numerous sites. As standard practice, through the .htaccess I will always redirect non-www to www, and redirect the index page to www.mysite.com. All straightforward - I have never questioned this practice and have always been advised it's the best practice to avoid duplicate content.
Today I was looking at a CMS service for a customer. The website is already built and is static, so CMS integration was going to mean a full rewrite of the site. Speaking to a friend on another forum, he told me about a service called Simple CMS; I had a look, and it looks perfect for the customer. But when I went to set it up on the client's site, here is the problem: for the CMS software to work, it MUST access the index page. Because my index page is redirected to www.mysite.com, it won't work, as it can't find the index page (obviously).
I questioned this with the software company, and they informed me that it must access the index page. I explained that it won't be able to, and why (because I have my index page redirected to avoid duplicate content). To my astonishment, the person there told me that duplicate content is a huge no-no with Google (that's not the astonishing part) but that it's not relevant to the index and non-index versions of a page. This goes against everything I thought I knew. The person also reassured me that they have worked in SEO for 10 years.
As I am a subscriber to SEOmoz and no one here has anything to gain but offering advice: is this true? Will it not be a duplicate-content issue to serve both the index page and the root URL? Will search engines not view this as duplicate content? Or is this SEO expert talking bull, which I suspect, but cannot be sure?
Any advice would be greatly appreciated. It would make my life a lot easier for the customer to use this CMS software, but I would do it at the risk of tarnishing the work they and I have done on their ranking status. Many thanks in advance, John
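The canonicalization being debated (index page and non-www variants 301-redirected to one homepage URL) can be sketched as a URL-normalization rule. This is an illustration of the logic the .htaccess implements, with a hypothetical helper name, not actual Apache configuration:

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_home(url):
    """Normalize index-page and non-www variants to one canonical URL.

    Mirrors the usual .htaccess rules described above: non-www -> www,
    and /index.html (or index.htm, index.php) -> /. Hypothetical helper
    for illustration only.
    """
    scheme, host, path, query, frag = urlsplit(url)
    if not host.startswith("www."):
        host = "www." + host
    last = path.rsplit("/", 1)[-1]
    if last.lower() in ("index.html", "index.htm", "index.php"):
        path = path[: -len(last)]
    return urlunsplit((scheme, host, path, query, frag))

# Both variants collapse to one URL, so engines see one page, not two
print(canonical_home("http://mysite.com/index.html"))  # http://www.mysite.com/
print(canonical_home("http://www.mysite.com/"))        # http://www.mysite.com/
```

The point of the practice is exactly this collapse: without it, /index.html and / are two crawlable addresses serving identical content.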
Technical SEO | | Johnny4B0 -
Page rank and ranking down
Hi, I blog at Technostarry. Some 3 months back, during a PageRank update, my PageRank went down from 3 to 2. I don't know the reason for this, and now my traffic and rankings are also down. I am not involved in any bad SEO practices; I don't copy and paste, and I write original content. I am very confused about why this has happened to my site. If someone could analyze my blog and look at my weak points, that would be great. I would appreciate any suggestions for getting my rankings and my PageRank back. Thanks.
Technical SEO | | technotech0 -
Renaming of pages
About 2 months ago, one of our clients renamed a section of his website. The worst part is that the URLs of the pages also changed. New page: http://www.meresverige.dk/rejser/malmo Old page: http://www.meresverige.dk/rejser/malmoe The problem now is that the new page gets absolutely no PageRank transferred from the old page. It also gets no mozRank at all, and if I try to find it in Open Site Explorer it cannot be found (the old page can, but not the new one). We have updated the sitemap.xml and have had proper 301 redirects in place for about 2 months. Any ideas? This was a very important page in terms of traffic, so I am very thankful for any input. Have a great day, Fredrik
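A quick thing to verify is that the old URL really answers with a single permanent 301 hop (not a 302, which is treated as temporary) ending at the new URL. A hedged sketch of that check, fed with hypothetical response data rather than a live request (in practice you would collect the hops with `curl -I` or an HTTP client that does not follow redirects):

```python
def check_301(old_url, responses):
    """Given the (status, location) hops observed for old_url, verify a
    clean permanent redirect: every hop a 301, ending at the new page.
    `responses` here is hypothetical sample data for illustration.
    """
    problems = []
    for status, location in responses:
        if status != 301:
            problems.append(f"{old_url}: got {status}, expected 301")
        old_url = location
    return problems, old_url  # issues found, final destination

# The redirect described in the question, as a single observed hop
hops = [(301, "http://www.meresverige.dk/rejser/malmo")]
problems, final = check_301("http://www.meresverige.dk/rejser/malmoe", hops)
print(problems)  # [] means the chain is clean
print(final)
```

If the chain is clean and two months old, the remaining lag is usually on the link-index side (recrawl and metric refresh) rather than the redirect itself.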
Technical SEO | | Resultify0 -
Consolidate page strength
Hi, Our site has a fair amount of related/similar content that has historically been placed on separate pages. Unfortunately this spreads our page strength across multiple pages. We are looking to combine this content onto one page so that our page strength is focused in one location (optimized for search). The content is extensive, so placing it all on one page isn't ideal from a user-experience standpoint (better to separate it out). We are looking into different approaches:
1. One main "tabbed" page with query-string parameters for the separate sections. We'll use an AJAX-driven design, but for non-JS browsers we'll gracefully degrade to separate pages with query-string parameters: www.xxx.com/content/?pg=1, www.xxx.com/content/?pg=2, www.xxx.com/content/?pg=3. We'd then rel canonical all three pages to www.xxx.com/content/.
2. The same concept, but using the AJAX-crawlable hashbang (#!) design. Everything loads onto one page, but the page could get quite large, so latency will increase.
I don't think there is much difference between options 1 and 2 from an SEO perspective; we'll mostly be relying on Google honoring the rel canonical tag. Have others dealt with this issue where you have lots of similar content? From a UX perspective you want to separate/classify it, but from an SEO perspective you want to consolidate. It really is very similar content, so using rel canonical makes sense. What have others done? Thoughts?
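The rel canonical approach in option 1 can be sketched as a tiny helper that maps every ?pg=N variant back to the query-string-free main page (hypothetical function name, illustrating the pattern rather than any specific CMS):

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_link_tag(url):
    """Emit the <link rel="canonical"> tag that points a ?pg=N variant
    at the main page, by dropping the query string entirely."""
    scheme, host, path, _query, _frag = urlsplit(url)
    canonical = urlunsplit((scheme, host, path, "", ""))
    return f'<link rel="canonical" href="{canonical}" />'

# All three tabbed variants declare the same canonical page, which is
# what consolidates the page strength in one URL
for n in (1, 2, 3):
    print(canonical_link_tag(f"http://www.example.com/content/?pg={n}"))
```

Each variant then emits `<link rel="canonical" href="http://www.example.com/content/" />` in its head, so engines treat the tabs as one document.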
Technical SEO | | NicB10 -
Does page speed affect what pages are in the index?
We have around 1.3M total pages. Google currently crawls 87k a day on average, and our average page load is 1.7 seconds. Of those 1.3M pages (1.2M being "spun up"), Google has only indexed around 368k, and our SEO person is telling us that if we speed up the pages, Google will crawl them more and thus index more of them. I personally don't believe this. At 87k pages a day, Google has crawled our entire site in about 2 weeks, so they should have all of our pages in their DB by now. I think the pages are not indexed because they are poorly generated, and it has nothing to do with page speed. Am I correct? Would speeding up the pages make Google crawl them faster and thus get more pages indexed?
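The arithmetic in the question checks out and is worth making explicit (all numbers taken from the post above):

```python
total_pages = 1_300_000   # pages on the site
crawled_per_day = 87_000  # Google's average daily crawl rate
indexed = 368_000         # pages actually in Google's index

full_crawl_days = total_pages / crawled_per_day   # length of one full crawl cycle
index_coverage = indexed / total_pages            # fraction of pages indexed

print(f"full crawl cycle: {full_crawl_days:.1f} days")
print(f"index coverage:   {index_coverage:.0%}")
```

At that rate one full pass takes roughly 15 days, i.e. about the two weeks the poster estimates, with only about 28% of pages indexed - consistent with the view that the bottleneck is page quality rather than crawl speed.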
Technical SEO | | upper2bits0