SEOMoz only crawling 5 pages of my website
-
Hello,
I've added a new website to my SEOmoz campaign tool, but it only crawls 5 pages of the site. I know the site has far more pages than this, and it also has a blog.
Google shows at least 1000 results indexed.
Am I doing something wrong? Could it be that the site is preventing a proper crawl?
Thanks
Bill
-
You should have set it up as a subdomain (which is very likely what you have done anyway), but this linking issue is a real sticking point for you at the moment.
It's difficult to give you concrete advice without knowing your friend's business model, marketing strategy and content. However, let's say for neatness he wants to keep his main squeeze page where it is, at www.kingofcopy.com. You could separate all of the squeeze pages from the subscribers' content by creating a sub-folder called 'members-area', for example, so that www.kingofcopy.com contains the squeeze page where it is now (and additional squeeze pages reside at www.kingofcopy.com/maxoutsideusa.html etc.),
and all of the opt-in content is moved to www.kingofcopy.com/members-area/, ensuring that all of the good info that shouldn't be visible is noindexed accordingly.
Of course, this advice is based on the assumption that you only want to rank the squeeze pages.
If I were undertaking this project I would do things a little differently, as I believe that squeeze pages have now lost some of their kick, perhaps due to the huge numbers of them I have seen... So instead I would have a lot of teaser articles and videos containing plenty of good keyword-targeted content, SEO'd to the max, making sure that there are some good nuggets of info in them, so that the reader thinks: Wow! If the stuff he gives away for free is this good, then I can't wait to find out how much better the paid-for stuff is!
In terms of on-page SEO and campaign management: separate the content you want highly visible from the members-only content, store the noindexed members' pages within a sub-folder, and make sure all of the content you want visible and indexed is linked to in some way.
-
Well, I am just doing this for a friend of mine. His site is not ranking as well as he would like. I know he has some issues, but first I wanted to see what major errors I could find and then fix them. The problem is that I am only getting 5 pages of content.
The rest of his site is indexed in Google; you can find lots of his pages. I was just trying to figure out why the tool is only crawling 5 pages.
I don't recall which campaign type I set it up with originally. Which one do you recommend?
-
Hey Bill,
Can you tell us what campaign type you initially set up: subdomain, root domain or sub-folder?
I believe you are going to struggle setting up your campaign to monitor all of these pages due to the current configuration - based on the link architecture/navigation.
Would it be fair to say that you are actually only concerned about monitoring the performance of the visible squeeze pages in the SERPs? Because if every other page should only be visible after opting in, it stands to reason that you would be better off hiding all of that content with noindex, to preserve the value of the content within those pages and give potential customers every reason to opt in.
If we had a better idea of what your end goal was it might help us better assist you.
-
I think what he has here is a squeeze page set as his home page. You cannot access the rest of the site unless you opt in. Of course, some of the other subpages are indexed in Google, so you can bypass the home page.
Because he is using a squeeze page with no navigation, is this why there are no links to the rest of the site's content?
Sorry, just trying to follow along.
-
http://www.kingofcopy.com/sitemap.xml references only 3 files (the index and the sitemap.xml link bring it up to 5).
However, the other sections of the site are installed in sub-folders or are otherwise disconnected from the content referenced from your root, www.kingofcopy.com.
Take a look at this sitemap further into the site, in one of your sub-folders: http://www.kingofcopy.com/products/sitemap.xml. You will see what looks to be the 1000+ pages you refer to.
However, there is no connection between the root directory and these other pages and sub-folders.
It appears that your main page is http://www.kingofcopy.com/main.html
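If you want to verify this for yourself, counting the URL entries in each sitemap makes the gap obvious. Here's a rough sketch using only Python's standard library (the sample sitemap below is a made-up stand-in with the same shape as the root sitemap, not the site's actual file):

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def count_sitemap_urls(xml_text):
    """Return the number of <loc> entries in a sitemap document."""
    root = ET.fromstring(xml_text)
    return len(root.findall(f"{SITEMAP_NS}url/{SITEMAP_NS}loc"))

# A toy sitemap with the same shape as /sitemap.xml (URLs are placeholders):
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/</loc></url>
  <url><loc>http://www.example.com/main.html</loc></url>
  <url><loc>http://www.example.com/contact.html</loc></url>
</urlset>"""

print(count_sitemap_urls(sample))  # 3
```

Run the same count against /sitemap.xml and /products/sitemap.xml and you'll see one tiny file and one huge one, with nothing linking the two.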
Ordinarily you would want to bring them into one common, connected framework, with all accessible pages linked to in a structured and logical way. If you have other exclusive squeeze/landing pages that you do not want to show up in search results, and you direct users to them via mailshots etc., you can prevent them from getting indexed. For example, you may want to stop a pure squeeze page like http://www.kingofcopy.com/max/maxoutsideusa.html from appearing in the SERPs.
To prevent all robots from indexing a page on your site, place the following meta tag into the <head> section of your page: <meta name="robots" content="noindex">
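If you later want to sanity-check which pages actually carry that tag, a small script can do it. A minimal sketch with Python's standard-library HTML parser (the sample page below is invented for illustration):

```python
from html.parser import HTMLParser

class NoindexDetector(HTMLParser):
    """Flags pages that carry a <meta name="robots" content="noindex"> tag."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            name = (d.get("name") or "").lower()
            content = (d.get("content") or "").lower()
            if name == "robots" and "noindex" in content:
                self.noindex = True

def has_noindex(html):
    parser = NoindexDetector()
    parser.feed(html)
    return parser.noindex

page = '<html><head><meta name="robots" content="noindex"></head><body>Members only</body></html>'
print(has_noindex(page))  # True
```

Feed it the HTML of each members-only page and you can confirm the whole protected area is excluded before the next crawl.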
Personally, I would consider a restructure to bring this content into the root directory, noindexing the squeeze pages as required. But this would need to be carefully planned and well executed, with 301 redirects in place wherever content has moved from one directory to another.
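A restructure like that usually starts with a simple old-URL to new-URL map, which then becomes your 301 rules. A rough sketch (the paths below are hypothetical examples, not the site's real file names, and the output assumes Apache's mod_alias Redirect syntax):

```python
# Hypothetical mapping from the current scattered structure to a flat one.
redirects = {
    "/products/great-headlines.html": "/great-headlines.html",
    "/products/sitemap.xml": "/sitemap.xml",
    "/max/maxoutsideusa.html": "/max/maxoutsideusa.html",  # unchanged, stays noindexed
}

def apache_redirect_lines(mapping):
    """Render the map as Apache 'Redirect 301' lines for an .htaccess file."""
    return [
        f"Redirect 301 {old} {new}"
        for old, new in sorted(mapping.items())
        if old != new  # no rule needed when the URL does not move
    ]

for line in apache_redirect_lines(redirects):
    print(line)
```

Keeping the map in one place makes it easy to review every move before anything goes live, and to spot pages that would otherwise be left behind.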
However, you could always shuffle around the first few pages: rename main.html to index.html, and put the copy you currently have at www.kingofcopy.com in a lightbox/popup or similar over the top of the main page.
I think the problem of main.html not being found as your default root/home page, together with the lack of connections between certain pages, is the cause of your campaign crawling so few pages.
Incidentally, if you do restructure, consider using WordPress, as it would be a great fit with what you have produced already (and there are plenty of WordPress squeeze page/product promotion themes available).
-
I feel like I've checked just about everything. I do not have access to his GWT.
Ryan, thanks for helping me with this.
-
Can you share the URL?
There are several things to check, starting with the robots.txt file and your site's navigation.
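For the robots.txt check, you can test the crawler's user agent against the file directly with Python's standard-library parser (the rules below are hypothetical, substitute the site's real robots.txt; rogerbot is SEOmoz's crawler):

```python
from urllib.robotparser import RobotFileParser

# A robots.txt that would block a crawler this way (contents are made up
# for illustration, not taken from the actual site):
robots_txt = """User-agent: *
Disallow: /products/
Disallow: /members-area/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# /products/page.html is a placeholder path, not a real page on the site.
print(rp.can_fetch("rogerbot", "http://www.kingofcopy.com/main.html"))           # True
print(rp.can_fetch("rogerbot", "http://www.kingofcopy.com/products/page.html"))  # False
```

If the blocked list lines up with the sections that never show in your campaign, robots.txt is your culprit; if everything comes back True, the problem is more likely the navigation.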