Does Google index more than three levels down if the XML sitemap is submitted via Google Webmaster Tools?
-
We are building a very big ecommerce site. The site has 1,000 products and many categories/levels. The site is still under construction, so you cannot see it online. My objective is to get Google to rank the products (level 5).
Here is an example:
Level 1 - Homepage - http://vulcano.moldear.com.ar/
Level 2 - http://vulcano.moldear.com.ar/piscinas/
Level 3 - http://vulcano.moldear.com.ar/piscinas/electrobombas-para-piscinas/
Level 4 - http://vulcano.moldear.com.ar/piscinas/electrobombas-para-piscinas/autocebantes.html/
Level 5 - Product is on this level - http://vulcano.moldear.com.ar/piscinas/electrobombas-para-piscinas/autocebantes/autocebante-recomendada-para-filtros-vc-10.html
Thanks
-
Excellent! Thanks Takeshi. I need to improve my internal linking then, because all my internal linking comes from the site's main menu.
-
Yup, Google should have no trouble crawling all the links in a sitemap, even if they are in sub-sub-sub-sub folders. With sitemaps, it's more about the number of links in the sitemap than the structure of your URLs.
As for actual indexation, make sure that even your deep pages are linked to from within your site. You want to structure the site so that link value flows to all of your pages, especially your most valuable ones. Sitemaps can help Google crawl your site and discover new pages, but when it comes to rankings and indexation, links (external and internal) are king.
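For illustration, a deep product URL is listed in a sitemap exactly like a shallow one; folder depth doesn't change the entry's shape. A minimal sketch using the level-5 URL from the question (the `<changefreq>` value is an arbitrary example, not a recommendation):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- A level-5 product URL is listed the same way a level-1 URL would be;
       crawl priority comes from internal/external links, not folder depth -->
  <url>
    <loc>http://vulcano.moldear.com.ar/piscinas/electrobombas-para-piscinas/autocebantes/autocebante-recomendada-para-filtros-vc-10.html</loc>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```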
Related Questions
-
Javascript content not being indexed by Google
I thought Google had gotten better at picking up unique content from JavaScript, but I'm not seeing it with our site. We rate beauty and skincare products using our algorithms. Here is an example of a product: https://www.skinsafeproducts.com/tide-free-gentle-he-liquid-laundry-detergent-100-fl-oz When you look at the cached page (text) from Google, none of the core ratings (badges like "fragrance free", "top free", and so forth) are being picked up for ranking. Any idea what we could do to have the ratings incorporated in the indexation?
-
Site not showing up in search - was hacked - huge comment spam - cannot connect Webmaster tools
Hi Moz Community A new client approached me yesterday for help with their site, which used to rank well for their designated keywords but now is not doing well. Actually, they are not on Google at all. It's like they were removed by Google. There is no reference to them when searching with "site:url". I investigated further and discovered the likely problem: 26,000 spam comments! All these comments have been removed now. I cleaned up this WordPress site pretty well. However, I now want to connect it to Google Webmaster Tools. I have admin access to the WP site, but not FTP. So I tried using Yoast to connect. Google failed to verify the site. So then I used a file-uploading console to upload the Google HTML code instead. I checked that the code is there. And Google still fails to verify the site. It is as if Google is so angry with this domain that they have wiped it completely from search and refuse to have any dealings with it at all. That said, I did run their "malware" / "dangerous content" check, which did not bring back any problems. I'm leaning towards the idea that this is a "cursed" domain in Google and that my client's best course of action is to build her business around another domain instead, and then point that old domain to the new domain, hopefully without attracting any bad karma in the process (advice on that step would be appreciated). Anyone have an idea as to what is going on here?
-
Completely redesigned website - set up new site in Google Webmaster Tools, or keep existing?
Hi - our company just completely redesigned our website and went from a static HTML site to a PHP-based site, so every single URL has changed (around 1,500 pages). I put the same verification code into the new site and re-verified, but now Google is listing tons and tons of 404s. Some of them are really old pages that haven't existed in a long time; it would literally be impossible to create all the redirects for the 404s it's pulling. Question: when completely changing a site like this, should I have created a whole new Search Console? Or did I do the right thing by using the existing one?
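As an aside, hand-writing 1,500 redirects is rarely necessary if the old and new URLs can be exported as a mapping; the rules can then be generated. A minimal sketch, assuming an Apache-style `Redirect 301` format and hypothetical example paths (the mapping itself would come from a crawl or CMS export):

```python
# Sketch: generate Apache mod_alias "Redirect 301" rules from an
# old-path -> new-path mapping. The paths below are hypothetical examples.
def build_redirects(mapping):
    """Return one 'Redirect 301 old new' rule per mapping entry."""
    return [f"Redirect 301 {old} {new}" for old, new in mapping.items()]

rules = build_redirects({
    "/products.html": "/products.php",
    "/about.html": "/about.php",
})
# Each element of `rules` can be appended to an .htaccess or vhost config.
```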
-
Google Indexing our site
We have 700 city pages on our site. We submitted them to Google via https://www.samhillbands.com/sitemaps/locations.xml, but they have only indexed 15 so far. Yes, the content is similar on all of the pages... thoughts on getting them to index the remaining pages?
-
Google Index Constantly Decreases Week over Week (for over 1 year now)
Hi, I recently started working with two products (one is community-driven content, the other is editorial content), and I've seen a strange pattern in both of them: the Google index constantly decreases week over week, and has for at least a year. Yes, the decrease increased 🙂 when the new mobile version of Google came out, but it was still declining before that. Has this ever happened to you? How did you find out what was wrong? How did you solve it? What I want to do is take the sitemap and look for the URLs in the index, to first determine which links are missing. The problem, though, is that the sitemap is huge (6M pages). Have you found a solution for dealing with such big index changes? Cheers, Andrei
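For a sitemap that size, the first practical hurdle is just reading it without loading 6M URLs into memory at once. A minimal sketch using streaming parsing (the filename is hypothetical; this assumes a standard `<urlset>` sitemap, not a sitemap index):

```python
# Sketch: stream-parse a very large sitemap one <url> element at a time,
# so a 6M-URL file never has to fit in memory all at once.
import xml.etree.ElementTree as ET

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def iter_sitemap_urls(path):
    """Yield each <loc> value, clearing parsed elements as we go."""
    for _, elem in ET.iterparse(path):
        if elem.tag == NS + "url":
            yield elem.find(NS + "loc").text.strip()
            elem.clear()  # release memory held by this <url> subtree
```

The resulting URLs could then be checked in batches against whatever indexation data you have (e.g. server logs of Googlebot hits), rather than one giant comparison.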
-
Https & http urls in Google Index
Hi everyone, this question is a two-parter. I am now working for a large website - over 500k monthly organic visits. The site currently has both http and https URLs in Google's index. The website has not formally converted to https; the https began with an error and has evolved unchecked over time. Both versions of the site (http & https) are registered in Webmaster Tools, so I can clearly see that as time passes, http indexation is decreasing and https is increasing. The ratio is at about 3:1 in favor of https at this time. Traffic over the last year has slowly dipped; however, over the last two months there has been a steady decline in overall visits registered through analytics. No single page appears to be the culprit - this decline is occurring across most pages of the website, pages which traditionally draw heavy traffic, including the home page. Considering that Google is giving priority to https pages, could the split be having a negative impact on traffic as rankings sway? Additionally, mobile activity for the site has steadily increased, both from a traffic and a conversion standpoint, yet that traffic has also dipped significantly over the last two months. Looking at Google's mobile usability errors page, I see a significant number of errors (over 1k). I know Google has been testing and changing mobile ranking factors - is it safe to posit that this could be having an impact on mobile traffic? The traffic declines are 9-10% MoM. Thank you. ~Geo
-
Google Webmaster Remove URL Tool
Hi All, to keep this example simple: you have a home page, and the home page links to 4 pages (P1, P2, P3, P4). You now use the Google Webmaster removal tool to remove the P4 webpage and its cache instance. 24 hours later you check and see P4 has completely disappeared. You then remove the link from the home page pointing to P4. My question: does Google now see only pages P1, P2 & P3, and therefore allocate link juice at a rate of 33.33% each? Regards, Mark
-
Why do pages with a 404 error drop out of Webmaster Tools only to reappear again?
I have noticed that a lot of pages which had been 404'ing have fallen out of the Webmaster Tools crawl error log, only to reappear again. Any suggestions as to why this might be the case? How can I make sure they don't reappear again?