SEOmoz only crawling 5 pages of my website
-
Hello,
I've added a new website to my SEOmoz campaign tool, but it only crawls 5 pages of the site. I know the site has far more pages than this, and it also has a blog.
Google shows at least 1000 results indexed.
Am I doing something wrong? Could it be that the site is preventing a proper crawl?
Thanks
Bill
-
You should have set up a subdomain campaign (which is very likely what you did anyway), but this linking issue is a real sticking point for you at the moment.
It's difficult to give you concrete advice without knowing your friend's business model, marketing strategy and content. However, let's say for neatness he wants to keep his main squeeze page as it is, at www.kingofcopy.com. You could separate the squeeze pages from the subscriber content by creating a sub-folder called 'members-area', for example, so that www.kingofcopy.com contains the squeeze page where it is now (and additional squeeze pages reside at www.kingofcopy.com/maxoutsideusa.html etc.),
and all of the opt-in content is moved to www.kingofcopy.com/members-area/, ensuring all of the good info that shouldn't be visible is noindexed accordingly.
Of course, this advice is based on the assumption that you only want to rank the squeeze pages.
If I were undertaking this project I would do things a little differently, as I believe squeeze pages have now lost some of their kick, perhaps due to the huge numbers of them I have seen. Instead, I would publish a lot of teaser articles and videos containing good keyword-targeted content, SEO'd to the max, with some genuine nuggets of info in them, so that the reader thinks: "Wow! If the stuff he gives away for free is this good, I can't wait to find out how much better the paid stuff is!"
In terms of on-page SEO and campaign management: separate the content you want highly visible from the members-only content, store the noindexed members pages within a sub-folder, and link all of the content you want visible and indexed in some way.
-
Well, I am just doing this for a friend of mine. His site is not ranking as well as he would like. I know he has some issues, but first I wanted to see what major errors I could find and then fix. And of course I am only getting 5 pages of content.
The rest of his site is indexed in Google. You can find lots of his pages. I was just trying to figure out why the tool is only crawling 5 pages.
I don't recall which campaign type I originally set it up with. Which one do you recommend?
-
Hey Bill,
Can you tell us what campaign type you initially set up: subdomain, root domain or sub-folder?
I believe you are going to struggle setting up your campaign to monitor all of these pages due to the current configuration - based on the link architecture/navigation.
Would it be fair to say that you are actually only concerned about monitoring the performance of the visible squeeze pages in the SERPs? If every other page should only be visible after opting in, it stands to reason that you would be better off hiding all of that content with noindex, to preserve the value of the content within those pages and give potential customers every reason to opt in.
If we had a better idea of what your end goal was it might help us better assist you.
-
I think what he has here is a squeeze page set as his home page. You cannot access the rest of the site unless you opt in. Of course, some of the other subpages are indexed in Google, so you can bypass the home page.
Because he is using a squeeze page with no navigation, is that why there are no links to the rest of the site's content?
Sorry - trying to follow along.
-
http://www.kingofcopy.com/sitemap.xml references only 3 files (with the index and sitemap.xml link making it up to 5).
However, the other sections of the site are installed in sub-folders or are disconnected from the content referenced from your root, www.kingofcopy.com.
Take a look at this sitemap from one of your sub-folders, http://www.kingofcopy.com/products/sitemap.xml, and you will see what looks to be the 1000+ pages you refer to.
However, there is no connection between the root directory and these other pages and sub-folders.
It appears that your main page is http://www.kingofcopy.com/main.html
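For comparison, a root sitemap that connected these sections would list them all in one place. Here is a minimal sketch (the URLs are just examples drawn from the pages mentioned in this thread; the real list would be much longer):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- home/squeeze page -->
  <url><loc>http://www.kingofcopy.com/</loc></url>
  <!-- current "main" page -->
  <url><loc>http://www.kingofcopy.com/main.html</loc></url>
  <!-- product content currently only referenced from /products/sitemap.xml -->
  <url><loc>http://www.kingofcopy.com/products/</loc></url>
</urlset>
```

A sitemap alone won't fix the crawl, though: the pages also need internal links connecting them to the root.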
Ordinarily you would want to bring them into one common, connected framework, with all accessible pages linked in a structured and logical way. If you have other exclusive squeeze/landing pages that you do not want to show up in search results, and you just direct users to them via mail shots etc., then you can prevent them from being indexed. For example, you may want to keep a pure squeeze page like http://www.kingofcopy.com/max/maxoutsideusa.html from appearing in the SERPs.
To prevent all robots from indexing a page on your site, place a robots noindex meta tag into the head section of that page.
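The standard form of that tag, placed inside the page's `<head>`, is:

```html
<meta name="robots" content="noindex">
```

If you still want crawlers to follow the links on the page while keeping the page itself out of the index, you can use `content="noindex, follow"` instead.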
Personally, I would consider a restructure to bring this content into the root directory, noindexing the squeeze pages as required. But this would need to be carefully planned and well executed, with 301 redirects in place wherever content has moved from one directory to another.
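For example, if the site runs on Apache (an assumption on my part; other servers have equivalents), a moved page could be permanently redirected from .htaccess like this. The paths here are hypothetical:

```apache
# 301 (permanent) redirect for a page moved from a sub-folder into the root
Redirect 301 /products/old-page.html http://www.kingofcopy.com/old-page.html
```

The 301 status tells search engines the move is permanent, so the old URL's ranking signals are passed to the new one.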
However, you could always shuffle around the first few pages: rename main.html to index.html and have the copy you currently have at www.kingofcopy.com in a lightbox/popup or similar over the top of the main page?
I think the problem of main.html not being found as your default root/home page, plus the lack of connections between certain pages, is the cause of a lot of the issues with your campaign crawling so few pages.
Incidentally, if you did restructure, consider using WordPress, as it would be a great fit with what you have produced already (and there are plenty of WordPress squeeze page/product promotion themes available).
-
I feel like I've checked just about everything. I do not have access to his GWT.
Ryan, thanks for helping me with this.
-
Can you share the URL?
There are several things to check, starting with the robots.txt file and your site's navigation.
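As a hypothetical illustration of what to look for, a robots.txt like the following would block all compliant crawlers (including Moz's rogerbot) from the entire site, which would cripple a crawl:

```text
# robots.txt - blocks every compliant crawler from every page
User-agent: *
Disallow: /
```

Narrower Disallow rules can have the same effect on just the sections they cover, so it's worth checking each rule against the paths that aren't being crawled.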
Related Questions
-
Unsolved: Moz can't crawl my site
Moz is being blocked from crawling the following site - https://www.cleanchain.com. Looking at robots.txt, the following is disallowing access, but I don't know whether this is preventing Moz from crawling too?
User-agent: *
Disallow: /adeci/
Disallow: /core/
Disallow: /connectors/
Disallow: /assets/components/
Could something else be preventing the crawl?
Moz Pro | danhart2020
-
Pages with Duplicate Page Content
Moz is showing many of my URLs as duplicates. I put a canonical tag on all the pages, but it is still showing them all as duplicate pages. These are the URLs: https://www.crystalizeonline.com/brands/ravenscroft-crystal/material/non-lead/page/2.html https://www.crystalizeonline.com/brands/ravenscroft-crystal/material/non-lead/page/2/sort-by/price/sort-direction/desc.html https://www.crystalizeonline.com/brands/ravenscroft-crystal/material/non-lead/page/2/sort-by/price/sort-direction/asc.html There are a lot of pages like this. How can I get rid of all these issues?
Moz Pro | crystalize
-
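For reference, a canonical tag for sort/pagination variants like the crystalizeonline URLs above would point each variant at one preferred URL. A sketch, assuming the unsorted page-2 URL is the intended canonical:

```html
<!-- In the <head> of each sort/order variant of page 2 -->
<link rel="canonical"
      href="https://www.crystalizeonline.com/brands/ravenscroft-crystal/material/non-lead/page/2.html">
```

Note that Moz's crawler reports duplicates it finds regardless of canonicals in some report types, so flagged pages are not necessarily a problem for Google.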
Moz is treating my pages as duplicate content, but the pages have different content in reality
Attached is a screenshot of links flagged as duplicate content. Here are some of the links from the screenshot: http://federalland.ph/construction_updates/paseo-de-roces-as-of-october-2015 http://federalland.ph/construction_updates/sixsenses-residences-tower-2-as-of-october-2015/ http://federalland.ph/construction_updates/sixsenses-residences-tower-3-as-of-october-2015 The links I have listed here have different content, so I don't know why they are treated as duplicates.
Moz Pro | clestcruz
-
Functionality of SEOmoz crawl page reports
I am trying to find a way to ask SEOmoz staff this question, because I think it is a functionality question, so I checked the SEOmoz Pro resources. I have also had no responses to it in the forum, so here it is again. Thanks much for your consideration! Is it possible to configure the SEOmoz Rogerbot error-finding bot (which makes the crawl diagnostic reports) to obey the instructions in individual page headers and the http://client.com/robots.txt file? For example, there is a page at http://truthbook.com/quotes/index.cfm?month=5&day=14&year=2007 that has, in the header, a meta robots noindex tag. This themed Quote of the Day page is intentionally duplicated at http://truthbook.com/quotes/index.cfm?month=5&day=14&year=2004 and also at http://truthbook.com/quotes/index.cfm?month=5&day=14&year=2010, but they all have the noindex tag in them, so Google should not see them as duplicates, and in Webmaster Tools it does not. So the page should not be counted 3 times, but it seems to be. How do we generate a report of the actual pages shown as duplicates so we can check? We do not believe Google sees them as duplicate pages, but Roger appears to. Similarly, for http://truthbook.com/contemplative_prayer/ , the http://truthbook.com/robots.txt file tells Google to stay clear. Yet we are showing thousands of duplicate page content errors, while Google Webmaster Tools, configured as described, has shown only a few hundred. Anyone? Jim
Moz Pro | jimmyzig
-
How do I find the page containing a link that returns a 404 error in my crawl diagnostics?
Hi, newbie here. I am trying to understand what to do, step by step, after getting my initial reports back from SEOmoz. The first question is about the 404 errors shown as high priority to fix in crawl diagnostics. I reviewed the support info on the crawl diagnostics page referring to 404 errors, but still did not understand exactly what I am supposed to do; same with the Q&A section when I searched for how to fix 404 errors. It seems I would want to find the page with the bad link that sent a visitor to a page not found, and then correct the problem by removing the link, or by correcting and re-uploading the page being linked to. Some suggestions seemed to indicate that SEOmoz itself will not let me find the page where the bad link is, and that I would need an external program to do this. I would think that if SEOmoz found the bad page, it would also tell me what page the link(s) to it exist on. A number of suggestions were to use a 301 redirect as the solution, but it was not clear when to do this versus just removing the bad link or repairing the page the link points to. My question, therefore, is: how do I find the links that lead to 404 errors, and fix the problem? Thanks, Galen
Moz Pro | Tetruss
-
Too many on-page links
I received a warning in my most recent report for too many on-page links on the following page: http://www.fateyes.com/blog/. I can't figure out why this would be: I count between 60 and 70, including all pull-downs, "read more" links, archive, category and a few additional misc. links. Any ideas or suggestions, or what I might do to rectify it? Perhaps it's just an SEOmoz report blip. We currently don't have the post list rolling over to additional pages, so it's passively set up to be endless, but that's in the works.
Moz Pro | gfiedel
-
SEOmoz crawler problems
I have had SEOmoz for about a month. It has crawled about 1,000 pages, but I have about 10,000 pages total on the site. Why are the others a problem? I have contacted support, but the guy isn't any help; we have just been going back and forth for the last two weeks. Any suggestions?
Moz Pro | EcommerceSite