SEOmoz only crawling 5 pages of my website
-
Hello,
I've added a new website to my SEOmoz campaign tool, but it only crawls 5 pages of the site. I know the site has way more pages than this and also has a blog.
Google shows at least 1000 results indexed.
Am I doing something wrong? Could it be that the site is preventing a proper crawl?
Thanks
Bill
-
You should have set up a subdomain campaign (which is what you are very likely to have done anyway), but this linking issue is the real sticking point for you at the moment.
It's difficult to give you concrete advice without knowing your friend's business model, marketing strategy and content. However, let's just say for neatness he wants to keep his main squeeze page as it is - at www.kingofcopy.com - you could separate all of the squeeze pages from the 'subscribers' content by creating a sub-folder called 'members-area', for example - so www.kingofcopy.com contains the squeeze page where it is now (and additional squeeze pages reside at www.kingofcopy.com/maxoutsideusa.html etc.)
and all of the opt-in content is moved to www.kingofcopy.com/members-area/, ensuring all of the good info that shouldn't be visible is noindexed accordingly.
Of course, this advice is based on the assumption that you only want to rank the squeeze pages.
If I were undertaking this project I would do things a little differently, as I believe squeeze pages have now lost some of their kick - perhaps due to the huge number of them I have seen... So instead I would have a lot of teaser articles and videos containing good keyword-targeted content, all SEO'd to the max, making sure there are some good nuggets of info in them - so that the reader thinks: Wow! If the stuff he gives away for free is this good, then I can't wait to find out how much better the paid-for stuff is!
In terms of on-page SEO and campaign management: separate the content you want highly visible from the members-only content - store noindexed members' pages within a sub-folder - and link all of the content you want visible and indexed in some way.
-
Well, I am just doing this for a friend of mine. His site is not ranking as well as he would like it to. I know he has some issues, but first I wanted to see what major errors I could find and then fix them. And of course I am only getting 5 pages of content.
The rest of his site is indexed in google. You can find lots of his pages. I was just trying to figure out why the tool is only crawling 5 pages.
I don't recall which campaign type I set it up as originally. Which one do you recommend?
-
Hey Bill,
Can you tell us what campaign type you initially set up: subdomain, root domain or sub-folder?
I believe you are going to struggle setting up your campaign to monitor all of these pages due to the current configuration - based on the link architecture/navigation.
Would it be fair to say that you are actually only concerned about monitoring the performance of the visible squeeze pages in the SERPs? Because if every other page should only be visible when you opt in, it stands to reason that you would be better off hiding all of that content using noindex - to preserve the value of the content within those pages and give potential customers every reason to opt in.
If we had a better idea of what your end goal was it might help us better assist you.
-
I think what he has here is a squeeze page set as his home page. You cannot access the rest of the site unless you opt in. Of course, some of the other subpages are indexed in Google, so you can bypass the home page.
Because he is using a squeeze page with no navigation, is this why there is no link to the rest of the site's content?
Sorry - just trying to follow along.
-
http://www.kingofcopy.com/sitemap.xml references only 3 files (with the index page and the sitemap.xml link making it up to 5).
However, the other sections of the site are installed into sub-folders or are disconnected from the content referenced from your root, www.kingofcopy.com.
Take a look at this sitemap further into the site, from one of your sub-folders - http://www.kingofcopy.com/products/sitemap.xml - and you will see what looks to be the 1000+ pages you refer to.
However, there is no connection between the root directory and these other pages and sub folders.
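If it helps to see exactly what a given sitemap references, a quick sketch like the following lists the URLs it contains. The sitemap body here is a made-up two-entry stand-in, not the real file at /products/sitemap.xml; for the live file you would fetch the XML first.

```python
import xml.etree.ElementTree as ET

# Made-up two-URL sitemap standing in for the real
# http://www.kingofcopy.com/products/sitemap.xml
SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.kingofcopy.com/products/a.html</loc></url>
  <url><loc>http://www.kingofcopy.com/products/b.html</loc></url>
</urlset>"""

def sitemap_urls(xml_text):
    """Return every <loc> URL listed in a sitemap document."""
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    return [loc.text for loc in ET.fromstring(xml_text).findall("sm:url/sm:loc", ns)]

urls = sitemap_urls(SITEMAP_XML)
print(len(urls))  # 2
```

Comparing the URL count from each sitemap against what the campaign tool reports makes the disconnect between the root and the sub-folders easy to quantify.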
It appears that your main page is http://www.kingofcopy.com/main.html
Ordinarily you would want to bring them into one common, connected framework, with all accessible pages linked to in a structured and logical way. If you have other exclusive squeeze pages/landing pages that you do not want to show up in search results - and just direct users to them using mail shots etc. - then you can prevent them getting indexed. For example, you may want to prevent a pure squeeze page like http://www.kingofcopy.com/max/maxoutsideusa.html from appearing in the SERPs.
To prevent all robots from indexing a page on your site, place the following meta tag into the <head> section of your page: <meta name="robots" content="noindex">
Personally, I would consider a restructure to bring this content into the root directory - noindexing the squeeze pages as required - but this would need to be carefully planned and well executed with 301 redirects in place where content has moved from one directory to another
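A restructure like that is mostly careful bookkeeping: an old-path to new-path map, plus a check that no 301 target is itself redirected (chains like that dilute value and slow crawlers). A minimal sketch with hypothetical paths - the real ones would come from the actual move plan:

```python
# Hypothetical old -> new URL mapping for the restructure.
redirect_map = {
    "/products/sample-product.html": "/sample-product.html",
    "/max/maxoutsideusa.html": "/maxoutsideusa.html",
}

def redirect_chains(redirect_map):
    """Return sources whose 301 target is itself redirected (a chain)."""
    return [src for src, dst in redirect_map.items() if dst in redirect_map]

print(redirect_chains(redirect_map))  # []
```

Running a check like this before deploying the server-side 301 rules catches chains while they are still cheap to fix.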
However, you could always shuffle around the first few pages - renaming main.html to index.html and having the copy you currently have at www.kingofcopy.com in a lightbox/popup or similar over the top of the main page?
I think the fact that main.html is not served as your default root/home page, together with the lack of connections between certain pages, is the cause of your campaign crawling so few of the pages.
Incidentally, if you did restructure, consider using WordPress, as it would be a great fit with what you have produced already (and there are plenty of WordPress squeeze page/product promotion themes available).
-
I feel like I've checked just about everything. I do not have access to his GWT.
Ryan, thanks for helping me with this.
-
Can you share the URL?
There are several things to check, starting with the robots.txt file and your site's navigation.
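For the robots.txt part of that check, Python's standard library can tell you whether a given crawler is allowed to fetch a URL. This sketch parses hypothetical rules from a string; for the live site you would use rp.set_url(...) and rp.read() instead.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules, parsed from a string rather than
# fetched from the live site.
rules = [
    "User-agent: *",
    "Disallow: /members-area/",
]

rp = RobotFileParser()
rp.parse(rules)

# rogerbot is SEOmoz's crawler; a Disallow that matches it (or *)
# would explain a crawl stopping after only a handful of pages.
print(rp.can_fetch("rogerbot", "http://example.com/main.html"))        # True
print(rp.can_fetch("rogerbot", "http://example.com/members-area/x"))   # False
```

If the robots.txt comes back clean, the navigation (links the crawler can actually follow from the home page) is the next thing to rule out.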