Can anyone help me diagnose an indexing/sitemap issue on a large e-commerce site?
-
Hey guys. Wondering if someone can help diagnose a problem for me.
Here's our site: https://www.flagandbanner.com/
We have a fairly large e-commerce site -- roughly 23,000 URLs according to crawls with both Moz and Screaming Frog. I created an XML sitemap (using Screaming Frog) and uploaded it to Webmaster Tools. WMT reports only about 2,500 of the sitemap URLs as indexed. Further, WMT's index status shows Google indexing only about half of the site (approx. 11,000 URLs). Finally, to add even more confusion, a site: search on Google returns only about 5,400 results. The numbers are all over the place!
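For anyone who wants to reproduce the comparison, here is a rough sketch of how the sitemap can be checked against a crawl export. It assumes the sitemap path from the robots.txt below and a crawl exported to CSV with an "Address" column (the filename here is hypothetical); if the sitemap is actually an index file pointing at child sitemaps, each child would need to be fetched as well:
import csv
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.flagandbanner.com/images/sitemap.xml"
CRAWL_CSV = "internal_html.csv"  # hypothetical filename for the crawl export

def sitemap_urls(url):
    # Standard sitemaps.org namespace for <loc> elements.
    loc_tag = "{http://www.sitemaps.org/schemas/sitemap/0.9}loc"
    with urllib.request.urlopen(url) as resp:
        tree = ET.parse(resp)
    return {loc.text.strip().lower() for loc in tree.iter(loc_tag)}

def crawl_urls(path):
    with open(path, newline="", encoding="utf-8") as f:
        return {row["Address"].strip().lower() for row in csv.DictReader(f)}

in_sitemap = sitemap_urls(SITEMAP_URL)
in_crawl = crawl_urls(CRAWL_CSV)

print("URLs in sitemap:", len(in_sitemap))
print("URLs in crawl:", len(in_crawl))
print("Crawled but missing from sitemap:", len(in_crawl - in_sitemap))
print("In sitemap but not found in crawl:", len(in_sitemap - in_crawl))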
Here's the robots.txt file:
User-agent: *
Allow: /
Disallow: /aspnet_client/
Disallow: /httperrors/
Disallow: /HTTPErrors/
Disallow: /temp/
Disallow: /test/
Disallow: /i_i_email_friend_request
Disallow: /i_i_narrow_your_search
Disallow: /shopping_cart
Disallow: /add_product_to_favorites
Disallow: /email_friend_request
Disallow: /searchformaction
Disallow: /search_keyword
Disallow: /page=
Disallow: /hid=
Disallow: /fab/*
Sitemap: https://www.flagandbanner.com/images/sitemap.xml
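As a quick sanity check on those rules, here is a minimal sketch using Python's built-in urllib.robotparser to test a few representative URLs. Note that robotparser does not implement Google's wildcard extensions, so treat results for patterns like /fab/* as approximate; the test URLs are just examples, swap in paths from your own crawl:
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://www.flagandbanner.com/robots.txt")
rp.read()

# Sample URLs to test -- replace with paths from your own crawl export.
test_urls = [
    "https://www.flagandbanner.com/products/citizenship-gifts.asp",
    "https://www.flagandbanner.com/products/chrome-air-force-lt-general-flag-kit.asp",
    "https://www.flagandbanner.com/images/sitemap.xml",
]

for url in test_urls:
    allowed = rp.can_fetch("Googlebot", url)
    print(("ALLOWED" if allowed else "BLOCKED") + "  " + url)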
Anyone have any thoughts as to what our problems are??
Mike
-
A site running ASP should be perfectly fine. I bet you will see substantial increases in a lot of positive metrics just by paring down that navigation.
-
Thanks so much for your response, Russ.
You're confirming one of the many issues we have identified (too many internal links) but I had not connected it to indexing or site speed. When I use the Google Page Speed Tool, many of our pages are not even registering. It seems like it's taking too long to load them so it times out. Could the crazy amount of links have to do with this, too?
Moreover, our mobile speed is especially poor. This could be an even bigger problem in mobile, no?
Are you familiar with .asp sites, in particular, having indexing issues...or is that a false assumption?
Mike
-
Thanks for the question!
First, it is very common to get inconsistent answers from GSC, site:, sitemap and crawl results. Don't worry too much about that.
Your goal is to get as many of your pages indexed as possible, and that is largely a function of the links pointing to your site and your internal link structure. While it is an imperfect analogy, we often refer to this as "crawl budget". There are essentially two solutions to this...
1. Get more/better backlinks to a diversity of pages on your site.
2. Improve your internal link architecture so that Googlebot finds your pages more quickly.
I think the problem in your case is that the site inundates bots with generic navigational links. For example, this page...
http://www.flagandbanner.com/products/chrome-air-force-lt-general-flag-kit.asp
has 1400 internal links! That is crazy!
This page has 1500!
https://www.flagandbanner.com/products/citizenship-gifts.asp
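If you want to verify those counts yourself, here is a rough sketch that tallies the anchor tags on a single page. It assumes the third-party packages requests and beautifulsoup4 are installed (pip install requests beautifulsoup4); the page URL is just the example above:
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

PAGE = "https://www.flagandbanner.com/products/citizenship-gifts.asp"
SITE_HOSTS = {"www.flagandbanner.com", "flagandbanner.com"}

html = requests.get(PAGE, timeout=30).text
soup = BeautifulSoup(html, "html.parser")

internal_anchors = 0
unique_targets = set()
for a in soup.find_all("a", href=True):
    absolute = urljoin(PAGE, a["href"])
    parsed = urlparse(absolute)
    if parsed.netloc.lower() in SITE_HOSTS:
        internal_anchors += 1
        # Collapse casing so /Products/ and /products/ count as one target.
        target = parsed.path.lower()
        if parsed.query:
            target += "?" + parsed.query
        unique_targets.add(target)

print(internal_anchors, "internal <a> tags,", len(unique_targets), "unique link targets")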
You need to reel this back in dramatically. Your navigation should link to top-level categories, or maybe a handful of subcategories. Once a visitor is in a category, you can reveal the deeper subcategories. This will increase the likelihood that the related and "also bought" links you find on product pages get found and followed by Googlebot.
Finally, on a different note, you need to standardize the casing of your URLs (i.e. pick either /Products/ or /products/ and stick with it). I noticed links, both internal and external, that do not take this into account, which causes unnecessary duplicate content.
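A rough way to surface those casing duplicates from a crawl export (again assuming a CSV with an "Address" column, as Screaming Frog produces; the filename is hypothetical):
import csv
from collections import defaultdict

groups = defaultdict(set)
with open("internal_html.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        url = row["Address"].strip()
        groups[url.lower()].add(url)

for variants in groups.values():
    if len(variants) > 1:
        print("Casing duplicates:", " | ".join(sorted(variants)))
On an IIS/ASP.NET site the usual fix is a rewrite rule that lowercases incoming URLs and 301-redirects the old casing, along with cleaning up the internal links themselves.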