Why is our sitemap not being indexed on Webmaster Tools?
-
Hi there,
We have been having a problem with one of our websites. It appears that someone has stolen our template and, in the process, also copied our analytics information. We are fixing the analytics issue ourselves. The problem now is that when we submitted a sitemap to Google Webmaster Tools, the URLs were submitted but have yet to be indexed. We have tried pinging them, but there has been no change. This is not a problem for our other websites, which are very similar.
What could be the problem here? For reference, the URL is http://www.dentistinlittlerock.com
Thank you for your responses in advance!
-
I just noticed that some of your pages are indexed under www.dentistinlittlerock.com and some without the www at dentistinlittlerock.com. This could be why you are not seeing them all indexed in Webmaster Tools, which treats these as two different sites.
You should set up a 301 redirect so that all requests go to one domain (either www.dentistinlittlerock.com or dentistinlittlerock.com). This should fix it.
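If the site runs on Apache with mod_rewrite enabled, a redirect of this kind might look something like the following in the root .htaccess (a sketch only, assuming the www version is the one you want to keep):

```apache
# Permanently (301) redirect all non-www requests to the www hostname,
# preserving the requested path.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^dentistinlittlerock\.com$ [NC]
RewriteRule ^(.*)$ http://www.dentistinlittlerock.com/$1 [R=301,L]
```

It is also worth setting the preferred domain under Site Configuration in Webmaster Tools so Google knows which version you consider canonical.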
-
What makes you think that the pages are not being indexed?
Looking at your sitemap at http://www.dentistinlittlerock.com/sitemap.xml, I see 10 pages listed. Two of these are duplicate listings of the home page, so you really have 9 separate pages there.
Looking at the indexed pages on Google, I see all 9 listed: http://www.google.com/search?ie=UTF-8&q=site%3Adentistinlittlerock.com
So it looks like all of the pages have been indexed. If you have a different sitemap please list it so we can take a look at it.
Thanks
-
A few suggestions:
-
Check your sitemap location in Google Webmaster Tools and be sure it is correct: GWMT > Site Configuration > Sitemaps.
-
Check that your sitemap is accessible. Try using another PC and entering the sitemap URL directly.
-
Your sitemap should be located in your root directory. Alternatively, you can declare its location explicitly, for example in your robots.txt file.
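As a sketch of that last point (assuming the sitemap sits at the default path on this site), a robots.txt file at the site root can point crawlers at the sitemap explicitly:

```
User-agent: *
Allow: /

Sitemap: http://www.dentistinlittlerock.com/sitemap.xml
```

The Sitemap directive is the standard way to advertise a sitemap that is not at the default location, and it does no harm when the sitemap is in the root anyway.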
-
Related Questions
-
Https pages indexed but all web pages are http - please can you offer some help?
Dear Moz Community, please could you see what you think and offer some definite steps or advice?

I contacted the host provider, and his initial thought was that WordPress was causing the https problem: e.g., when an https version of a page is called, things like videos and media don't always show up. An SSL certificate attached to a website allows pages to load over https. The host said there is no active configured SSL; it's just waiting as part of the hosting package, just in case. However, I found that the SSL certificate is still showing up during a crawl.

It's important to eliminate the https problem before external backlinks link to any of the unwanted https pages that are currently indexed. Luckily I haven't started any intense backlinking work yet, and any links I have posted in search land have all been the http version.

I checked a few more URLs to see if it's necessary to create a permanent redirect from https to http. For example, I tried requesting domain.co.uk using the https:// prefix, and the https:// page loaded instead of redirecting automatically to the http version. I know that if I am automatically redirected to the http:// version of the page, then that is the way it should be: search engines and visitors stay on the http version of the site and don't get lost anywhere in https. This also helps to eliminate duplicate content and to preserve link juice. What are your thoughts on that?

As I understand it, most server configurations should redirect by default when https isn't configured, and from my experience I've seen cases where pages requested via https return the default server page, a 404 error, or duplicate content. So I'm confused as to where to take this. One suggestion would be to disable all https, since there is no need to have any traces of SSL when the site is crawled.
I don't want to enable https in the .htaccess only to then create an https-to-http rewrite rule; https shouldn't even be a crawlable part of the site at all. So far I have:

RewriteEngine On
RewriteCond %{HTTPS} off

Or should I disable SSL completely for now, until it becomes a necessity for the website? I would really welcome your thoughts, as I'm really stuck as to what to do for the best, short term and long term. Kind regards
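For what it's worth, the two-line snippet in the question has a condition but no rule, so it does nothing by itself; note also that redirecting https requests means matching when HTTPS is on, not off. A complete rule might look like this (a sketch only, assuming Apache with mod_rewrite and that every https request should be 301-redirected to its http equivalent):

```apache
RewriteEngine On
# Match only requests that arrived over HTTPS...
RewriteCond %{HTTPS} on
# ...and permanently redirect them to the same host and path over plain HTTP.
RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1 [R=301,L]
```

Whether this is preferable to disabling SSL at the hosting level depends on the setup; the redirect at least guarantees crawlers and visitors end up on a single http version.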
-
Site structure and Visual Sitemaps
Aside from mind-mapping software, are there any recommended tools to build a visual sitemap of the internal linking structure of a URL? I've been trying to "show" clients the structure of a website as it pertains to internal and external links. Here is one I've tried that's close: http://site-visualizer.com/. I've been using its Excel export function, importing into MindMeister, and building the map from there, but it's a teeny bit time-consuming for large websites. Site structure, I feel, is a valuable part of SEO, and a down-and-dirty visual explanation would be great. Don't get me wrong, it offers other benefits as well; it's just that I'd like to free up the time it takes. Thank you in advance. Screenshots are available on the organization's website.
-
Google text-only vs rendered (index and ranking)
Hello, can someone please help answer a question about elements missing from Google's text-only cached version?
When using JavaScript to display an element that is initially styled with display:none, does Google index (and, most importantly, properly rank) the element's contents?

Using Google's "cache:" prefix followed by our page's URL, we can see the rendered cached page. The contents of the element in question are viewable, and you can read the information inside. However, if you click the "Text-only version" link at the top right of Google's cached page, the element is missing and cannot be seen. The reason is that the element is initially styled with display:none, and JavaScript is then used to display the text once some logic is applied.

Doing a long-tail Google search for a few sentences from inside the element does find the page in the results, but I am not certain that it is being cached and ranked optimally. Would updating the logic so that the contents are not made visible by JavaScript improve our ranking, or can we assume that, since Google does return the page in its results, everything is proper? Thank you!
-
What is the best tool to view your page as Googlebot?
Our site was built with ASP.NET and a lot of scripting. I want to see what Google can see and what it can't. What is the best tool that duplicates Googlebot? I have found several, but they seem old or inaccurate.
-
So apparently SEOmoz will get us de-indexed, according to an SEO company!
Each and every day I get called up by an SEO company promising to get me top spots in the Google rankings if I quickly sign up for the special offer they have today. Normally I would say "no thanks" and put the phone down, but I had a bit of spare time, so I indulged the guy and we got talking.

After the introductions and the spiel about his company, he showed me what his company does and how they go about getting me top ranks. (They don't actually get me ranks; they create a website they own, which then passes leads to me. Kinda clever, since they could then start charging me per lead, or charge my competitors.)

We continued to talk, and I mentioned that I use SEOmoz to check my rankings, backlinks, etc. He told me that Google is cracking down and that anyone using these types of software/websites will get their websites de-indexed. This struck me as BS, but I wanted to get your thoughts on the matter. I personally don't believe Google would ever do such a thing, since it would make it so easy to take down your competitors' websites (i.e., negative SEO), but it's certainly a talking point.
-
Usual time to index and rank a new site
Hi, just wondering if anyone knows how long it usually takes for a brand-new site to get indexed and ranked. I launched a new site about 5 weeks ago. So far I have had 96,000 pages indexed, but the majority haven't ranked particularly well or appeared at all. The ones that have ranked aren't ranking high, even though they have better content than competitors' sites and my old domain. Do I just need to hang tight and wait till my domain authority improves? Is there anything I can do to speed up this process? Cheers
-
Map Search Tools to integrate on Accommodation Website
Hello all, can anyone recommend a map search tool that I can integrate into my accommodation website? Ideally I want to be able to pin my clients on the map, with links back to their listing pages on my website, and provide an alternative search facility for clients looking for accommodation. I am covering the South Africa region specifically. I assume I could go down the Google Maps route, but I would really like to know what alternatives are on offer. Also, what SEO considerations do I need to think about when adding this to my website? Thanks in advance for any help.
-
Recommended Website Monitoring Tools
Hi, I was wondering what people would recommend for website monitoring (i.e., is my website working as it should?). I need something that will:
1/. Allow monitoring of multiple pages, not just the homepage
2/. Do header status checking
3/. Do page content checking (i.e., if the page changes massively or includes the word "error", then we have an issue!)
4/. Offer multiple alert options

We currently use www.websitepulse.com, and it is a good service that does all of the above. However, it just seems so overly complex that it's hard to understand what is going on, and its complex functionality and features are really a negative in our case. Thanks