Trouble Indexing one of our sitemaps
-
Hi everyone, thanks in advance for your help. Any feedback is appreciated. We have three separate sitemaps:
blog/sitemap.xml
events.xml
sitemap.xml
Unfortunately, we keep trying to get our events sitemap picked up, and it just isn't happening for us. Any input on what could be going on?
-
There also seem to be URLs that are duplicated:
/new-york-city-tickets/elektra-theatre-tickets/50-shades-the-musical-mar-21-2015-1283412.html
/new-york-city-tickets/elektra-theatre-tickets/50-shades-the-musical-mar-25-2015-1283241.html
/new-york-city-tickets/elektra-theatre-tickets/50-shades-the-musical-mar-27-2015-1283246.html
=> 3 different URLs, but the content seems to be identical on these pages.
You could try to do a full crawl with Screaming Frog and check the semi-duplicates on your site (identical H1, meta description, and so on).
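The semi-duplicate check above can be rough-sketched in a few lines of Python. This is an illustration only: the column names ("url", "title", "h1") are assumptions, so map them to whatever your crawler's CSV export actually calls them.

```python
from collections import defaultdict

def find_semi_duplicates(rows):
    """Group crawled pages by (title, H1); any group with two or more
    URLs is a semi-duplicate candidate worth reviewing by hand."""
    groups = defaultdict(list)
    for row in rows:
        # Normalize whitespace and case so near-identical headings match.
        key = (row["title"].strip().lower(), row["h1"].strip().lower())
        groups[key].append(row["url"])
    return {key: urls for key, urls in groups.items() if len(urls) > 1}
```

Feed it the rows of a crawl export (e.g. via `csv.DictReader`) and review each flagged group manually before deciding anything is a true duplicate.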
-
If I do a site:yoursite.com/minneapolis-tickets search in Google I get results - so these pages seem to be in the index, even if this is not shown at the sitemap level in WMT.
I notice you use noindex on a substantial number of pages (for expired events) - maybe it would be better to use the unavailable_after meta tag. See also: http://searchenginewatch.com/sew/news/2334932/ecommerce-seo-tips-for-unavailable-products-from-googles-matt-cutts
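For reference, unavailable_after is just another value of the robots meta tag. The date below is a made-up example (Google's original guidance used the RFC 850 date format shown here):

```html
<!-- Tells Google to drop this page from the index after the event date. -->
<meta name="robots" content="unavailable_after: 21-Mar-2015 23:59:59 EST">
```

Unlike noindex, this lets the page stay indexed and earn traffic right up until the event has passed.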
-
Update - if your site is identical to your username, the cause is almost certainly related to the lack of indexable content on these pages. The event pages, while very user-friendly and valuable for end users, are too light for Google in terms of content. Apart from the title, most of these pages are nearly identical (the maps, dates & prices are different) if you look at the source code.
-
Hi Dirk,
Thanks for your response. We have used Fetch as Google to test a couple of the URLs, and it worked on 1 out of 3. All the pages do have light content. I checked the fetched pages that weren't indexed, and we don't have any noindex or nofollow tags on them. It is frustrating, as we can see our competitors' event pages getting indexed with no content. So any help is appreciated.
-
There could be many reasons why this sitemap is not indexed.
Are there any duplicates between the different sitemaps? (If there are duplicates, they are not listed as indexed in the second sitemap.)
It could also be that the pages are too light in terms of content to get indexed. For example, if you only list the event name, date, and place without additional content, a page will probably not get indexed.
Are you sure that all the URLs in this sitemap can be indexed (not blocked by robots.txt or a noindex tag)? You could try a few URLs from the sitemap in Fetch as Google and see if they are fetched properly.
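The third point above can be spot-checked in bulk without waiting on WMT. Here is a minimal Python sketch (the robots.txt text and URLs you feed it are your own; the regex assumes the `name` attribute comes before `content`, so a proper HTML parser is safer for production use):

```python
import re
from urllib import robotparser

# Matches a robots meta tag and captures its content attribute.
META_ROBOTS = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']*)["\']',
    re.IGNORECASE,
)

def is_blocked(robots_txt, url, user_agent="Googlebot"):
    """True if the given robots.txt text disallows the URL for the user agent."""
    parser = robotparser.RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return not parser.can_fetch(user_agent, url)

def is_noindexed(html):
    """True if the page HTML carries a robots meta tag containing 'noindex'."""
    match = META_ROBOTS.search(html)
    return bool(match) and "noindex" in match.group(1).lower()
```

Run every URL from the events sitemap through both checks; any URL that trips either one will never show as indexed for that sitemap.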
rgds
Dirk
Related Questions
-
Pointing additional domains at your main one
I have two questions: I have bought a domain that is a misspelled version of my domain. I have created an A record with my DNS provider to point to my main domain's IP, and on my main site I modified the .htaccess file to do a 301 redirect if the referrer is that misspelled domain. I have also bought an expired domain with some relevant backlinks. I intend to create a simple page for that domain and add a link to my main site. Which of these two approaches is best from an SEO point of view? Thanks
Intermediate & Advanced SEO | usabiliTEST_ux1 -
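On the first question: a referrer-based rule won't fire for direct visits, because the browser sends the misspelled name as the Host header, not as a referrer. The usual pattern is to match on the requested host instead. A sketch, assuming Apache with mod_rewrite enabled and placeholder domain names:

```apache
# Send any request arriving on the misspelled domain to the main one,
# preserving the requested path. Matching on HTTP_HOST (not the referrer)
# also covers direct type-ins and bookmarks.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?misspelled-example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```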
Should m-dot sites be indexed at all?
I have a client with a site with an m-dot mobile version. They will move it to a responsive site sometime next year, but in the meantime I have a massive doubt. This m-dot site has some 30k indexed pages in Google. Each of these pages is bidirectionally linked to the www. version (rel="alternate" on the www, rel="canonical" on the m-dot). There is no noindex on the m-dot site, so I understand that Google might decide to index the m-dot pages regardless of the canonical to the www site. But my doubt stays: is it a bad thing that both versions are indexed? Is this having a negative impact on the crawl budget? Or risking some other bad consequence? And how is mobile-first going to impact this? Thanks
Intermediate & Advanced SEO | newbiebird0 -
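For reference, the bidirectional annotation described above normally looks like this (example.com and the page paths are placeholders):

```html
<!-- On the desktop (www.) page -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://m.example.com/page-name">

<!-- On the m-dot page -->
<link rel="canonical" href="http://www.example.com/page-name">
```

When both tags are in place, Google's guidance is that it treats the two URLs as one document, which is why m-dot pages appearing in the index alongside the www. versions is generally consolidated rather than counted as duplication.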
International XML Sitemaps - Standalone, or Integrate into Existing XML Sitemap?
Hi there, We understand that hreflang tagging can be incorporated into an existing XML sitemap. That said, is there any inherent issue with having two sitemaps: your regular XML sitemap plus an international XML sitemap that lists many of the same URLs as your original XML sitemap? For example, one of our clients has an XML sitemap file they don't want to have to edit, but we want to implement international hreflang XML sitemaps for them. Can we add an "English" XML sitemap with the proper hreflang tagging even though this new sitemap contains many of the same URLs as the existing XML sitemap file? Thank you!
Intermediate & Advanced SEO | FPD_NYC0 -
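For reference, a minimal hreflang sitemap entry looks like this (the URLs are placeholders). Each URL in the set lists itself plus every alternate:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://example.com/page-name</loc>
    <xhtml:link rel="alternate" hreflang="en"
                href="https://example.com/page-name"/>
    <xhtml:link rel="alternate" hreflang="fr"
                href="https://example.com/fr/page-name"/>
  </url>
</urlset>
```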
How to Index Faster?
Hello, I have a new website and update fresh content regularly. My indexing status is very slow. When I searched for how to improve my indexing rate on Google, I found that most members of the Moz community replied there is no certain technique to improve indexing; apart from that, you should keep posting fresh content and wait for Google to index it. Some of them suggested submitting a sitemap and sharing posts on Twitter, Facebook and Google Plus. However, those comments are from 2012. I'm curious to know: are there any new techniques or methods to improve the indexing rate? Need your suggestions! Thanks.
Intermediate & Advanced SEO | TopLeagueTechnologies0 -
Infinite Scrolling: how to index all pictures
I have a page where I want to upload 20 pictures in a slideshow. The idea is that the pictures will only load as users scroll down the page (otherwise the page load is too heavy). I see documentation on how to make this work and ensure search engines index all content. However, I do not see any documentation on how to make this work for 20 pictures in a slideshow. It seems impossible to get search engines to index all such pictures when they show only as users scroll down the page. This is the documentation I am already familiar with, which does not address my issue:
http://googlewebmastercentral.blogspot.com/2014/02/infinite-scroll-search-friendly.html
http://www.appelsiini.net/projects/lazyload
http://luis-almeida.github.io/unveil/
Thank you
Intermediate & Advanced SEO | khi5 -
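One widely used pattern for this problem is to keep a plain `<img>` fallback inside `<noscript>`, so the lazy-load script serves scrolling users while crawlers without JavaScript still see every image. A sketch with placeholder paths and class names (the exact attributes depend on which lazy-load library you use):

```html
<!-- Lazy-loaded slide: the script swaps data-src into src on scroll. -->
<img class="lazy" src="/images/placeholder.gif"
     data-src="/images/slide-01.jpg" alt="Slide 1 of 20">
<!-- Crawlable fallback for clients without JavaScript. -->
<noscript>
  <img src="/images/slide-01.jpg" alt="Slide 1 of 20">
</noscript>
```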
Video XML Sitemap
I've recently been informed by our dev team that we are legally not allowed to make our raw video files available in a video XML sitemap... and this is one of the required tags. Has anyone run into a similar situation and figured out a way around it? Any ideas would be greatly appreciated. Thanks! Margarita
Intermediate & Advanced SEO | MargaritaS0 -
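One possible way around the legal constraint: Google's video sitemap format requires either `video:content_loc` (the raw file) or `video:player_loc` (an embedded player URL), not necessarily both. Pointing `player_loc` at your player page may satisfy the spec without exposing the raw files. A sketch with placeholder URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>https://example.com/videos/some-video-page</loc>
    <video:video>
      <video:thumbnail_loc>https://example.com/thumbs/some-video.jpg</video:thumbnail_loc>
      <video:title>Example video title</video:title>
      <video:description>Placeholder description of the video.</video:description>
      <!-- player_loc instead of content_loc: no raw file URL exposed. -->
      <video:player_loc>https://example.com/player?video=123</video:player_loc>
    </video:video>
  </url>
</urlset>
```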
How to deal with old, indexed hashbang URLs?
I inherited a site that used to be in Flash and used hashbang URLs (i.e. www.example.com/#!page-name-here). We're now off Flash and have a "normal" URL structure that looks something like this: www.example.com/page-name-here. Here's the problem: Google still has thousands of the old hashbang (#!) URLs in its index. These URLs still work because the web server doesn't actually read anything that comes after the hash. So, when the web server sees www.example.com/#!page-name-here, it basically renders www.example.com/# while keeping the full URL structure intact (www.example.com/#!page-name-here). Hopefully that makes sense. So, in Google you'll see the URL www.example.com/#!page-name-here indexed, but if you click it you are essentially taken to our homepage content (even though the URL isn't exactly the canonical homepage URL, which should be www.example.com/). My big fear here is a duplicate-content penalty for our homepage. Essentially, I'm afraid that Google is seeing thousands of versions of our homepage. Even though the hashbang URLs are different, the content (i.e. title, meta description, page content) is exactly the same for all of them. Obviously, this is a typical SEO no-no. And I've recently seen the homepage drop like a rock for a search of our brand name, which has ranked #1 for months. Now, admittedly, we've made a bunch of changes during this whole site migration, but this #! URL problem just bothers me. I think it could be a major cause of our homepage tanking for brand queries. So, why not just 301 redirect all of the #! URLs? Well, the server won't accept traditional 301s for the #! URLs because the # seems to screw everything up (the server doesn't acknowledge what comes after the #). I "think" our only option here is to try to add some 301-style redirects via JavaScript. Yeah, I know that spiders have a love/hate (well, mostly hate) relationship with JavaScript, but I think that's our only resort... unless someone here has a better way?
If you've dealt with hashbang URLs before, I'd LOVE to hear your advice on how to deal w/ this issue. Best, -G
Intermediate & Advanced SEO | Celts180 -
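Since the fragment never reaches the server, the redirect does have to run in the browser, as the question suggests. A minimal sketch (the one-to-one mapping from `#!page-name-here` to `/page-name-here` is an assumption; adjust `hashbangToPath` if your new URLs differ):

```javascript
// Map a legacy hashbang fragment to its clean URL path.
// "#!page-name-here" -> "/page-name-here"; anything else -> null.
function hashbangToPath(hash) {
  return hash.indexOf("#!") === 0 ? "/" + hash.slice(2) : null;
}

// In the browser, redirect as early as possible (e.g. in the <head>).
if (typeof window !== "undefined") {
  var target = hashbangToPath(window.location.hash);
  if (target) {
    // replace() keeps the hashbang URL out of the browser history.
    window.location.replace(target);
  }
}
```

Note that a JavaScript redirect is not a true 301, so this complements rather than replaces getting canonical tags right on the destination pages.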
Static index page or not?
Are there any advantages or disadvantages to running a static homepage as opposed to a blog-style homepage? I have been running a static page on my site, with the latest posts displayed as links after the homepage content. I would like to remove the static page and move to a more visually appealing homepage that includes graphics for each post, with the posts dropping down the page like normal blogs do. How will it affect my site if I move from a static page to a more dynamic blog-style page layout? Could I still hold the spot I currently rank for with the optimized index content if I switch to a more traditional blog format? cheers,
Intermediate & Advanced SEO | NoCoGuru0