When I try creating a sitemap, it doesn't crawl my entire site.
-
We just launched a new Ruby app at http://www.thesquarefoot.com/ (it used to be a WordPress blog).
We have not had time to create an auto-generated sitemap, so I went to a few different websites with free sitemap generation tools. Most of them index up to 100 or 500 URLs. Our site has over 1,000 individual listings and 3 landing pages, so when I put our URL into a sitemap creator, it should be finding all of these pages. However, that is not happening; only 4 pages seem to be seen by the crawlers:
TheSquareFoot
http://www.thesquarefoot.com/
http://www.thesquarefoot.com/users/sign_in
http://www.thesquarefoot.com/search
http://www.thesquarefoot.com/renters/sign_up
This worries me that when Google comes to crawl our site, these are the only pages it will see as well. Our robots.txt is blank, so there should be nothing stopping the crawlers from going through the entire site. Here is an example of one of the 1,000s of pages not being crawled: http://www.thesquarefoot.com/listings/Houston/TX/77098/Central_Houston/3910_Kirby_Dr/Suite_204
Any help would be much appreciated!
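(For reference, a blank robots.txt does allow full crawling, but an explicit version, which can also point crawlers at a sitemap once one exists, would look roughly like this sketch; the /sitemap.xml location matches the sitemap mentioned in the follow-up below:)

```
# http://www.thesquarefoot.com/robots.txt
# Allow every crawler to fetch every URL
User-agent: *
Disallow:

# Point crawlers at the sitemap once it exists
Sitemap: http://www.thesquarefoot.com/sitemap.xml
```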
-
Thanks for your help. Can I ask one more question?
We just submitted a new sitemap to Google for our new Rails app:
http://www.thesquarefoot.com/sitemap.xml
It has over 1,300 pages; however, Google is only seeing 114. About 1,025 are in the listings folder, 250 are blog posts, and 15 are landing pages.
Any help would be appreciated!
Aron
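(For reference, one way a sitemap of this size can be kept auto-generated from the Rails models is the sitemap_generator gem. This is only a sketch: the gem choice, the Listing model name, and the listing_path route helper are all assumptions about the app.)

```ruby
# config/sitemap.rb -- regenerate with `rake sitemap:refresh`
SitemapGenerator::Sitemap.default_host = "http://www.thesquarefoot.com"

SitemapGenerator::Sitemap.create do
  # Landing pages
  add "/search", changefreq: "daily"
  add "/renters/sign_up", changefreq: "monthly"

  # One entry per listing so every detail page is declared explicitly
  Listing.find_each do |listing|
    add listing_path(listing), lastmod: listing.updated_at
  end
end
```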
-
I'd worry less about the sitemaps and more about your internal linking structure. The problem you are having with crawlers is a symptom of the linking problem.
Most of your content seems to be on the other side of a search form. When crawlers, including those from search engines, explore your site, they are looking for href links to follow; they will not submit forms.
If you want the other content to be indexed, then you need to provide a crawl path to it. Could you add links to each neighbourhood somewhere on the page so that there is a path to follow? That might lead on to further questions about your URL structure and use of Ajax too.
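(A rough sketch of what that crawl path could look like as a Rails partial; the Neighbourhood model and its route helper are assumed names, not anything confirmed about the app:)

```erb
<%# app/views/shared/_neighbourhood_links.html.erb %>
<%# Plain href links that a crawler can follow, unlike the search form %>
<ul class="neighbourhood-index">
  <% Neighbourhood.order(:name).each do |hood| %>
    <li><%= link_to hood.name, neighbourhood_path(hood) %></li>
  <% end %>
</ul>
```

Each neighbourhood page would then link on to its individual listings, completing a crawlable path from the homepage to every listing URL.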
The general principle is that you should link to content you want to rank. Many will argue that a sitemap removes that necessity, but links provide more information than a list of URLs, and I certainly wouldn't rely on sitemaps alone to get content indexed, let alone ranked.
Related Questions
-
Is it worth creating an Image Sitemap?
We've just installed the server-side script 'XML Sitemaps' on our eCommerce site. The script gives us the option of (easily) creating an image sitemap, but I'm debating whether there is any reason for us to do so. We sell printer cartridges, so all the images will be pretty dry (a brand-name printer cartridge in front of a box being a favourite). I can't see potential customers searching for an image as a route into the site, and Google appears to be picking up our images of its own accord, so I wonder if we'd just be crawling the site and submitting this information for no real reason. From a quality perspective, would Google give us any kind of kudos for providing an image sitemap? Would it potentially increase their crawl frequency or, indeed, reduce the load on our servers, as they wouldn't have to crawl for all the images themselves?
I can't stress how little of a hardship it will be to create one of these automatically daily but am wondering if, like Meta Keywords, there is any benefit to doing so?
Intermediate & Advanced SEO | ChrisHolgate
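(For reference, an image sitemap is just the standard sitemap protocol with Google's image extension namespace added; a minimal hand-written example, with placeholder URLs:)

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>http://www.example.com/cartridges/brand-x-black</loc>
    <!-- One image:image block per image on the page -->
    <image:image>
      <image:loc>http://www.example.com/images/brand-x-black.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```
-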
Will merging sites create a duplicate content penalty?
I have two sites that would be better suited to being merged to create a more authoritative site. Basically, I'd like to merge site A into site B. If I add new pages from site A to site B and create 301 redirects from those pages on site A to the new pages on site B, is that the best way to go about it? As the pages are already indexed, would this create any duplicate content issue, or would the redirect solve this?
Intermediate & Advanced SEO | boballanjones
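(If site A happens to run on Apache, which is an assumption here, the page-for-page 301s could be expressed in .htaccess along these lines; the paths and domain are placeholders:)

```apache
# .htaccess on site A (assumes Apache with mod_alias)
# Page-for-page 301 to the merged site
Redirect 301 /old-page http://www.site-b.example.com/new-page

# Or map a whole section by pattern
RedirectMatch 301 ^/blog/(.*)$ http://www.site-b.example.com/blog/$1
```
-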
Site re-design, full-site domain A/B test: will we drop in rankings while leaking traffic?
We are re-launching a client site that does very well in Google. The new site is on a www2 domain, to which we are going to send a controlled amount of traffic: 10%, 25%, 50%, 75%, then 100% over a five-week period. This will lead to a reduction in traffic to the original domain. As I don't want to launch a competing domain, the www2 site will not be indexed until 100% is reached. If Google sees the traffic numbers reducing over this period, will we drop? This is the only part I am unsure of, as the URLs and site structure are the same apart from some new lower-level pages which we will introduce in a controlled manner later. Any thoughts or experience of this type of re-launch would be much appreciated. Thanks, Pete
Intermediate & Advanced SEO | leshonk
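(One common way to keep a parallel domain like www2 out of the index during a test, sketched here only as an illustration, is a robots meta tag on every www2 page, removed once you want the new site indexed:)

```html
<!-- In the <head> of every page served from www2 during the test -->
<meta name="robots" content="noindex, nofollow">
```
-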
Should I redirect my XML sitemap?
Hi Mozzers, We have recently rebranded with a new company name, and of course this necessitated relaunching our entire website on a new domain. I watched the Moz video on how they changed domains, copying what they did pretty much to the letter. (Thank you, Moz, for sharing this with the community!) It has gone incredibly smoothly. I told all my bosses that we might see a 40% reduction in traffic/conversions in the short term. In the event (and it's still very early days) we have in fact seen a 15% increase in traffic, and our new website is converting better than before, so an all-round success! I was just wondering if you thought I should redirect my XML sitemap as well? So far I haven't, but despite us doing the change-of-address process in Webmaster Tools, I can see Google processed the old sitemap XML after we did the change of address. What do you think? I know we've been very lucky with the outcome of this rebrand, but I don't want to rest on my laurels or get tripped up later down the line. Thanks everyone! Amelia
Intermediate & Advanced SEO | CommT
-
301s from previous site
Hi! I've got quite a tricky problem regarding a client, http://www.muchbetteradventures.com/, and their previous site, http://v1.muchbetteradventures.com/ Here's the background: we have approx 1,500 'listing' pages like this: http://v1.muchbetteradventures.com/listing/view/1925/the-barre-des-ecrins-or-the-dome-des-ecrins-mountaineering-trip They bring in a minimum of 2k hits/month and, I suspect, also add to the overall site authority. They will eventually all have a home on the main domain. When they do, they will also each have been rewritten to be unique, so their value will increase (many are currently not unique). We also have landing pages like this: http://v1.muchbetteradventures.com/view/559/volunteering-holidays- which, despite being hideous, rank fairly well (page 1 for key terms). We cannot currently fulfil all of these on the main domain, but do not want to shut them down and lose positioning. The choices as I see them: 1) Make a landing page, e.g. muchbetteradventures.com/volunteering, and (a) redirect from the old landing page, and (b) redirect all related 'listings' to this page. This may help preserve rankings of the main landing page (the most important), but not of any listings. 2) Import all listings to have a home on the main domain (probably as children of a landing page, but not rewritten to be unique just yet), make them not accessible from the homepage, and change their functionality so that new visitors from Google are told we cannot currently help them with this trip. This is more work to complete, so it will take longer to do and is a distraction from our core focus, so it needs good justification! 3) Stay running largely as we are, slowly redirecting one page at a time as we carry over more and more options to the main domain. This will take over 12 months minimum.
Intermediate & Advanced SEO | neooptic
-
How to remove an entire site from Google?
Hi people, I have a site with around 2,000 URLs indexed in Google, and 10 subdomains indexed too, which I want to remove entirely in order to set up a new website. What is the best way to do it? Regards!
Intermediate & Advanced SEO | SeoExpertos
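(One common approach, sketched here on the assumption of an Apache server with mod_headers: serve a noindex header site-wide on each domain and subdomain you want removed, and then use the URL removal tool in Webmaster Tools to speed things up. The URLs need to stay crawlable, i.e. not blocked in robots.txt, so that Google can actually see the header.)

```apache
# .htaccess at the root of each (sub)domain to be removed
# Tells crawlers to drop every URL this host serves from the index
Header set X-Robots-Tag "noindex, nofollow"
```
-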
My sitelinks have gone from mega sitelinks to several small links under my SERP results in Google. Any ideas why?
A site I have currently had the mega sitelinks in the SERP results. Recently they have been updated from the mega links to the smaller four inline links under my SERP result. Any idea what happened or how I can correct this?
Intermediate & Advanced SEO | POSSIBLE
-
Optimize a Classifieds Site
Hi, I have a classifieds website and would like to optimize it. The issues/questions I have: A classifieds site has, say, 500 cities. Is it better to create a separate subdomain for each city (http://city_name.site.com) or a subdirectory (http://site.com/city_name)? In each city there will be, say, 50 categories, and these 50 categories are common across all the cities. Hence, the layout and content will be the same, the only differences being the latest ads from each city, the name of the city, and the URLs pointing to each category in the relevant city. The site architecture of a classifieds site is highly prone to having a lot of content that looks like duplicate content without really being duplicate content. What is the best way to deal with this situation? I was hit by Panda in April 2011, with traffic going down 50%. However, the traffic since then has stayed around the same level. How do I best handle the duplicate content penalty in the case of a site like a classifieds site? Cheers!
Intermediate & Advanced SEO | ketan90