Trouble Indexing one of our sitemaps
-
Hi everyone, thanks for your help. Any feedback is appreciated. We have three separate sitemaps:
blog/sitemap.xml
events.xml
sitemap.xml
Unfortunately, we keep trying to get our events sitemap picked up, and it just isn't happening for us. Any input on what could be going on?
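In the meantime, it may be worth confirming the events sitemap is declared everywhere Google looks. Besides submitting each file in Webmaster Tools, all three can be listed in robots.txt via Sitemap directives - a sketch, assuming the files live at the paths above (example.com stands in for your domain):

```
# robots.txt at https://www.example.com/robots.txt
Sitemap: https://www.example.com/sitemap.xml
Sitemap: https://www.example.com/events.xml
Sitemap: https://www.example.com/blog/sitemap.xml
```

The Sitemap directive takes an absolute URL and can appear multiple times, so all three files can be advertised from one place.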
-
There also seem to be URLs that are duplicated:
/new-york-city-tickets/elektra-theatre-tickets/50-shades-the-musical-mar-21-2015-1283412.html
/new-york-city-tickets/elektra-theatre-tickets/50-shades-the-musical-mar-25-2015-1283241.html
/new-york-city-tickets/elektra-theatre-tickets/50-shades-the-musical-mar-27-2015-1283246.html
Three different URLs, but the content seems to be identical on these pages.
You could try a full crawl with Screaming Frog and check the semi-duplicates on your site (identical H1s, meta descriptions, and so on).
-
If I do a site:yoursite.com/minneapolis-tickets search in Google, I get results, so these pages do seem to be in the index, even if this isn't shown at the sitemap level in WMT.
I notice you use noindex on a substantial number of pages (for expired events); it might be better to use the unavailable_after meta tag. See also: http://searchenginewatch.com/sew/news/2334932/ecommerce-seo-tips-for-unavailable-products-from-googles-matt-cutts
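For reference, unavailable_after is just another robots meta tag; Google drops the page from results after the given date instead of you having to noindex it up front. A sketch for a hypothetical expired-event page (the date is a placeholder, in the RFC 850 style Google's announcement used):

```html
<!-- in the <head> of the event page -->
<meta name="robots" content="unavailable_after: 21-Mar-2015 20:00:00 EST">
```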
-
Update: if your site is the same as your username, the cause is almost certainly related to the lack of indexable content on these pages. The event pages, while very user-friendly and valuable for end users, are too light for Google in terms of content. Apart from the title, most of these pages are nearly identical if you look at the source code (only the maps, dates, and prices differ).
-
Hi Dirk,
Thanks for your response. We used Fetch as Google to test a couple of the URLs, and it worked on one out of three. All the pages do have light content, and I checked the fetched pages that weren't indexed; we don't have any noindex or nofollow tags on them. It's frustrating because we can see our competitors' event pages getting indexed with hardly any content. Any help is appreciated.
-
There could be many reasons why this sitemap is not indexed.
Are there any duplicates between the different sitemaps? If there are duplicates, they are not listed as indexed in the second sitemap.
It could also be that the pages are too light in terms of content to get indexed. For example, if you only list the event name, date, and place without additional content, the page will probably not get indexed.
Are you sure that all the URLs in these sitemaps can be indexed (not blocked by robots.txt or a noindex tag)? You could try a few URLs from the sitemap in Fetch as Google and see whether they are fetched properly.
rgds
Dirk
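The noindex check suggested above can also be scripted in bulk. A minimal sketch: it assumes the sitemap uses the standard sitemaps.org urlset/url/loc schema, and it only looks for a robots meta tag in the page source (it does not evaluate robots.txt).

```python
# Sketch: check whether URLs listed in a sitemap carry a noindex meta tag.
# Assumptions: standard sitemaps.org <urlset>/<url>/<loc> schema; the
# robots *meta tag* only is inspected, not robots.txt or HTTP headers.
import re
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(sitemap_xml: str) -> list:
    """Extract all <loc> entries from a standard sitemap file."""
    root = ET.fromstring(sitemap_xml)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

def has_noindex(html: str) -> bool:
    """True if the page source has a robots meta tag containing 'noindex'."""
    for tag in re.findall(r"<meta[^>]*>", html, re.IGNORECASE):
        if re.search(r'name\s*=\s*["\']robots["\']', tag, re.IGNORECASE) \
                and re.search(r"noindex", tag, re.IGNORECASE):
            return True
    return False

# Example usage against a live sitemap (requires network access):
#   import urllib.request
#   xml_data = urllib.request.urlopen("https://example.com/events.xml").read()
#   for url in sitemap_urls(xml_data.decode("utf-8")):
#       page = urllib.request.urlopen(url).read().decode("utf-8", "replace")
#       print(url, "NOINDEX" if has_noindex(page) else "ok")
```

Running this over the events sitemap would quickly show whether any of the submitted URLs are quietly excluding themselves.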
Related Questions
-
Getting Google to index our sitemap
Hi, we have a sitemap on AWS that is retrievable via a URL like this: http://sitemap.shipindex.org/sitemap.xml. We have notified Google it exists, and it found our 700k URLs (we are a database of ship citations with unique URLs). However, it will not index them. It has been weeks and nothing. The weird part is that it did index some of them before - it reported about 26k - then it said 0. Now that I have redone the sitemap, I can't get Google to look at it, and I have no idea why. This is really important to us: we want not just general keywords to find our front page, but also specific ship names to show links to us in results. Does anyone have any clues as to how to get Google's attention and index our sitemap? Or even just crawl more of our site? It has crawled 35k pages, but stopped.
Intermediate & Advanced SEO | shipindex
-
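One structural thing worth checking on a 700k-URL sitemap: the sitemaps.org protocol caps a single sitemap file at 50,000 URLs (plus a file-size limit), so a URL set that size has to be split into chunks referenced from a sitemap index. A minimal sketch, using the poster's hostname with placeholder chunk filenames:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>http://sitemap.shipindex.org/sitemap-1.xml</loc></sitemap>
  <sitemap><loc>http://sitemap.shipindex.org/sitemap-2.xml</loc></sitemap>
  <!-- ...one entry per 50,000-URL chunk... -->
</sitemapindex>
```

If the 700k URLs are currently in one oversized file, Google may simply be rejecting it.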
Facets Being Indexed - What's the Impact?
Hi, our facets are, from what I can see, crawled by search engines; I think they use JavaScript - see here: http://www.key.co.uk/en/key/lockers. I want to get this fixed for SEO with an AJAX solution. I'm not sure how big this job is for the developers, but they will want to know the positive impact it could have and whether it's worth doing. Does anyone have any opinions on this? I haven't encountered this before, so any help is welcome 🙂
Intermediate & Advanced SEO | BeckyKey
-
Dev Subdomain Pages Indexed - How to Remove
I own a website (domain.com) and used the subdomain "dev.domain.com" while adding a new section to the site (as a development link). I forgot to block dev.domain.com in my robots file, and Google indexed all of the dev pages (around 100 of them). I blocked the site (dev.domain.com) in robots, and then proceeded to delete the entire subdomain altogether. It's been about a week now and I still see the subdomain pages indexed on Google. How do I get these pages removed from Google? Are they causing duplicate content/title issues, or does Google know that it's a development subdomain and it's just taking time for it to recognize that I already deleted it?
Intermediate & Advanced SEO | WebServiceConsulting.com
-
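On the dev-subdomain question above, one point is counterintuitive: a robots.txt Disallow stops Googlebot from recrawling those URLs, so it can never see that the pages are gone, and they tend to linger in the index. If the subdomain can still be pointed at a server, serving 410 Gone for everything (with the robots block removed) lets Google drop the pages, and the Remove URLs tool in Webmaster Tools speeds things up further. A sketch, assuming Apache and the hypothetical names from the question:

```
# Apache vhost sketch for the deleted dev subdomain (hypothetical names).
# Remove the robots.txt Disallow first, or Googlebot won't fetch the
# pages and will never see the 410 responses below.
<VirtualHost *:80>
    ServerName dev.domain.com
    Redirect gone /
</VirtualHost>
```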
Crawl efficiency - Page indexed after one minute!
Hey guys, a site has 5+ million pages indexed and 300 new pages a day. I hear a lot that at this level it's all about efficient crawlability. The pages of this site get indexed one minute after they go online. 1) Does this mean that the site is already crawled efficiently and there is not much else to do about it? 2) By increasing crawl efficiency, should I expect Google to crawl my site less (less bandwidth taken from my site for the same amount of crawling) or to crawl my site more often? Thanks
Intermediate & Advanced SEO | Mr.bfz
-
301 from one site to another
I have two e-commerce websites, and I'm going to remove some products from one website, as requested by a supplier, and sell them only on the other site. Is it a good idea to 301 redirect the pages from site 1 to site 2? Thanks for your help
Intermediate & Advanced SEO | Aikijeff
-
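On the 301 question above: if the products genuinely move, a per-URL 301 from each old product page to its counterpart on the other site is the usual approach, since it passes most link equity and sends visitors to the right place. A sketch for Apache's .htaccess - the domain and product paths are hypothetical placeholders:

```
# In site 1's .htaccess - map each removed product to its page on site 2
Redirect 301 /products/widget-a.html https://www.site-two.example/products/widget-a.html
Redirect 301 /products/widget-b.html https://www.site-two.example/products/widget-b.html
```

Redirecting everything to site 2's homepage instead would be treated as a soft 404, so per-URL mapping is worth the effort.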
Sitemap.xml
Hi guys, I read the SEOmoz article about sitemap.xml dated 2008. Just wanted to check views on: Is it worthwhile using 'priority'? What if everything is set to 100%? Any tips for using priority? Many thanks in advance! Richard
Intermediate & Advanced SEO | Richard555
-
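On the priority question above: priority is a relative hint between 0.0 and 1.0 across your own URLs, so if everything is set to 1.0 (100%) the values carry no information and search engines will effectively ignore them. A sketch with differentiated values (the URLs are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/</loc><priority>1.0</priority></url>
  <url><loc>http://www.example.com/category/tickets</loc><priority>0.7</priority></url>
  <url><loc>http://www.example.com/archive/old-event</loc><priority>0.3</priority></url>
</urlset>
```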
XML Sitemap Index Percentage (Large Sites)
Hi all, I want to hear from those with experience dealing with large sites (tens or hundreds of millions of pages): what's a typical (or highest) percentage of indexed vs. submitted pages you've seen? This information can be found in Webmaster Tools, where Google shows the pages submitted and indexed for each of your sitemaps. I'm trying to figure out: the average index percentage out there; whether there is a ceiling (i.e. it will never reach 100%); and whether it's possible to improve the indexing percentage further. Just to give some background, sitemap index files (per sitemaps.org) have been implemented to improve crawl efficiency, and I want to find other ways to improve this further. I've been thinking about excluding URL parameters, as there are hundreds (e-commerce site), to help Google improve crawl efficiency and use the daily crawl quota more effectively to discover pages that have not been discovered yet. However, I'm not sure whether this is the best path to take, or whether I'm flogging a dead horse if there is such a ceiling or if I'm already in the average ballpark for large sites. Any suggestions/insights would be appreciated. Thanks.
Intermediate & Advanced SEO | danng
-
Should I Combine 30 websites into one?
I have a private health care company that I have just begun consulting for. In addition to the main website serving the whole group, there are currently 30 individual sites, one for each of the hospitals in the group, each with its own domain. Each site has practically identical content, something that will be addressed in my initial audits. But should I suggest that they combine all the sites into one domain, providing individual category pages for each hospital, or am I really going to suggest that each of the 30 sites creates unique content of its own? That means thirty pages of content on "hip replacements", thirty different versions of "our treatment", and so on, and bearing in mind they all run off the same CMS, even with different body text the pages are going to be practically identical. It's a big call either way! The reason they started with all these sites is that each hospital is its own cost centre, while the web development team is a centralized resource. Each hospital has its own site to try to rank individually for local searches, since each naturally tends to get customers from its own local area. Not every hospital provides the full range of treatments.
Intermediate & Advanced SEO | Ultramod