Sitemap & noindex inconsistency?
-
Hey Moz Community!
On the CMS in question, the sitemap and robots.txt file are locked down and can't be edited or modified whatsoever.
If I noindex a page, but it is still on the XML sitemap... will it get indexed?
Thoughts, comments and experience greatly appreciated and welcome.
-
The sitemap is an indication to Google to crawl those pages. There are instances where people use a noindex, follow meta tag and still list those pages in their sitemaps, so that Google will crawl all the links on the page but not index the page itself.
The meta tags or headers on your page will be the signal to Googlebot on how to handle that page, regardless of your sitemap and what's on it.
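To make that concrete, here is a minimal stdlib Python sketch of checking both page-level signals, the robots meta tag and the X-Robots-Tag HTTP header. The function name and example markup are my own, not from the original question:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content values of <meta name="robots"> tags."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives += [d.strip().lower() for d in a.get("content", "").split(",")]

def is_noindexed(html, headers):
    """True if the page carries a noindex signal in its HTML or HTTP headers.
    Simplification: assumes the header key is exactly 'X-Robots-Tag'."""
    parser = RobotsMetaParser()
    parser.feed(html)
    header_directives = [d.strip().lower()
                         for d in headers.get("X-Robots-Tag", "").split(",")]
    return "noindex" in parser.directives or "noindex" in header_directives

page = '<html><head><meta name="robots" content="noindex, follow"></head><body></body></html>'
print(is_noindexed(page, {}))                                      # True (meta tag)
print(is_noindexed("<html></html>", {"X-Robots-Tag": "noindex"}))  # True (header)
print(is_noindexed("<html></html>", {}))                           # False
```

Either signal on its own is enough for Googlebot to drop the page from the index, whatever the sitemap says.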
-
It will get crawled, but it will not be indexed. Basically, the crawler fetches the page, sees the noindex directive, and skips indexing. It can still follow the links on the page if you use noindex, follow, or ignore them if you use noindex, nofollow.
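Since the sitemap can't be edited here, one practical step is to audit which sitemap URLs carry the mixed signal. A minimal sketch, assuming a standard sitemap and a pre-built set of noindexed URLs (the example URLs are placeholders):

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Extract the <loc> values from a sitemap XML document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(f"{SITEMAP_NS}loc")]

def sitemap_noindex_conflicts(xml_text, noindexed):
    """Return sitemap URLs that are also marked noindex: the 'mixed
    signal' case from the question. The noindex wins, but the sitemap
    entry still invites crawling of a page that can't be indexed."""
    return [url for url in sitemap_urls(xml_text) if url in noindexed]

sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/private</loc></url>
</urlset>"""

print(sitemap_noindex_conflicts(sitemap, {"https://example.com/private"}))
# ['https://example.com/private']
```

In a real audit the `noindexed` set would come from fetching each URL and applying the meta/header check, rather than being hard-coded.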
Related Questions
-
Indexed, not submitted in sitemap
I have this problem with the site's blog
Technical SEO | | seomozplan196
There is no problem when I check the Yoast plugin settings, but some of my blog content is not in the sitemap yet is indexed. Have you had such a problem? What is the cause? my website name is missomister1 -
Amp version of website
Hello & thanks for reading. It's maybe the Monday morning blues, but I have two versions of a website - www.gardeners.scot and www.gardeners.scot/AMP/. The pages on the AMP version have canonicals pointing to the "normal" website. Should the links on "www.example.com/AMP/" point to the AMP website or the normal website? What are your thoughts?
Technical SEO | | livingphilosophy0 -
Google Bot Noindex
If a site has the noindex tag, can it still be flagged for duplicate content?
Technical SEO | | MayflyInternet0 -
Why is robots.txt blocking URL's in sitemap?
Hi Folks, Any ideas why Google Webmaster Tools is indicating that my robots.txt is blocking URLs linked in my sitemap.xml, when in fact it isn't? I have checked the current robots.txt declarations and they are fine, and I've also tested it in the 'robots.txt Tester' tool, which shows that the URLs it claims are blocked in the sitemap in fact work fine. Is this a temporary issue that will be resolved over a few days, or should I be concerned? I recently removed the declaration from the robots.txt that would have been blocking them and then uploaded a new, updated sitemap.xml. I'm assuming this issue is due to some sort of crossover. Thanks Gaz
Technical SEO | | PurpleGriffon0 -
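The robots.txt check described in that question can be reproduced locally with Python's stdlib, which is a quick way to confirm the tester's verdict. The rules and URLs below are hypothetical stand-ins, not the asker's actual file:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt; substitute the real file's contents.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Check sitemap URLs the same way the robots.txt Tester does.
for url in ["http://www.example.com/page.html",
            "http://www.example.com/private/doc.html"]:
    print(url, "allowed" if rp.can_fetch("Googlebot", url) else "blocked")
```

If this agrees with the live tester but disagrees with the Webmaster Tools warning, the warning is most likely stale data from before the blocking declaration was removed, as the asker suspects.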
I need an XML sitemap expert for 5 minutes!
Hi all! I'm hoping that someone with a lot of experience with XML sitemaps can help me out here... When submitting my sitemap in Google Webmaster Tools, these are the results:
Technical SEO | | IcanAgency
2,414,714 Submitted
34,721 Indexed

And there are also tonnes of warnings. Would anyone be able to take a quick look at these sitemaps to advise me on what's going wrong there? They do not load without the www, not sure if this is an issue? http://www.eumom.ie/sitemap.xml
http://www.eumom.ie/sitemap.xml.gz Thanks everyone in advance!! Gavin0 -
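A sitemap that large usually needs to be split: the sitemap protocol caps each file at 50,000 URLs, with a sitemap index file pointing at the pieces. A minimal sketch of that split, with hypothetical file names and example URLs:

```python
import xml.etree.ElementTree as ET

MAX_URLS = 50_000  # per-file limit in the sitemap protocol

def build_sitemap_index(urls, base="http://www.example.com/sitemaps"):
    """Split a large URL list into <=50k chunks and build a sitemap
    index that points at the chunk files (file names are hypothetical)."""
    chunks = [urls[i:i + MAX_URLS] for i in range(0, len(urls), MAX_URLS)]
    index = ET.Element("sitemapindex",
                       xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for n, _chunk in enumerate(chunks, start=1):
        loc = ET.SubElement(ET.SubElement(index, "sitemap"), "loc")
        loc.text = f"{base}/sitemap-{n}.xml.gz"
    return chunks, ET.tostring(index, encoding="unicode")

urls = [f"http://www.example.com/page-{i}" for i in range(120_000)]
chunks, index_xml = build_sitemap_index(urls)
print(len(chunks))  # 3 files: 50k + 50k + 20k
```

Splitting also makes the submitted-vs-indexed gap easier to diagnose, since Webmaster Tools reports the counts per sitemap file.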
302 redirect used, submit old sitemap?
The website of a partner of mine was recently migrated to a new platform. Even though the content on the pages mostly stayed the same, both the HTML source (divs, meta data, headers, etc.) and URLs (removed index.php, removed capitalization, etc) changed heavily. Unfortunately, the URLs of ALL forum posts (150K+) were redirected using a 302 redirect, which was only recently discovered and swiftly changed to a 301 after the discovery. Several other important content pages (150+) weren't redirected at all at first, but most now have a 301 redirect as well. The 302 redirects and 404 content pages had been live for over 2 weeks at that point, and judging by the consistent day/day drop in organic traffic, I'm guessing Google didn't like the way this migration went. My best guess would be that Google is currently treating all these content pages as 'new' (after all, the source code changed 50%+, most of the meta data changed, the URL changed, and a 302 redirect was used). On top of that, the large number of 404's they've encountered (40K+) probably also fueled their belief of a now non-worthy-of-traffic website. Given that some of these pages had been online for almost a decade, I would love Google to see that these pages are actually new versions of the old page, and therefore pass on any link juice & authority. I had the idea of submitting a sitemap containing the most important URLs of the old website (as harvested from the Top Visited Pages from Google Analytics, because no old sitemap was ever generated...), thereby re-pointing Google to all these old pages, but presenting them with a nice 301 redirect this time instead, hopefully causing them to regain their rankings. To your best knowledge, would that help the problems I've outlined above? Could it hurt? Any other tips are welcome as well.
Technical SEO | | Theo-NL0 -
Which is best for narrow-by-search URLs? Canonical or NOINDEX
I have set a canonical on all narrow-by-search URLs. I think it's not working well. You can get more of an idea from the following URLs. http://www.vistastores.com/table-lamps?material_search=1328 http://www.vistastores.com/table-lamps?finish_search=146 These kinds of pages have a canonical tag pointing to the following one. http://www.vistastores.com/table-lamps Because that's the actual page which I want to rank. But all the narrow-by-search URLs have very different products compared to the base URL. So how can we call them duplicates? Which is the best solution: canonical, or NOINDEX via robots?
Technical SEO | | CommercePundit0 -
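The noindex option from that question can be sketched as a simple rule: parameterized filter pages get noindex, follow, while the base category page stays indexable. The facet parameter names below are taken from the question's example URLs; the function and logic are illustrative, not a recommendation of one option over the other:

```python
from urllib.parse import urlsplit, parse_qs

FACET_PARAMS = {"material_search", "finish_search"}  # facet params from the question

def robots_directive(url):
    """Emit a robots meta value: facet-filtered URLs are crawlable but
    not indexable; the base category page remains fully indexable."""
    params = parse_qs(urlsplit(url).query)
    if FACET_PARAMS & params.keys():
        return "noindex, follow"
    return "index, follow"

print(robots_directive("http://www.vistastores.com/table-lamps?material_search=1328"))
# noindex, follow
print(robots_directive("http://www.vistastores.com/table-lamps"))
# index, follow
```

The trade-off: canonical consolidates signals onto the base page but assumes the variants are near-duplicates, while noindex simply removes the variants from the index, which fits better when their product sets genuinely differ.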
Sitemap with References to Second Domain
I have just discovered a client site that is serving content from a single database into two separate domains, and has created XML sitemaps which contain references to both domains in an attempt to avoid being tagged for duplicate content. I always thought that a sitemap was intended to show the files inside a single domain, and the idea of multiple domains in one sitemap had never occurred to me... The sites are both very large storefronts, and one of them (the larger of the two) has recently seen a 50% drop in search traffic and the loss of some 600 search terms from top-50 positions in Google. My first instinct is that the sitemaps should be altered to only show files within each domain, but I am worried about causing further loss of traffic. Is it possible that the inclusion of URLs for the second domain in the sitemap may in fact be signalling duplicate content to search engines? Does anyone have a definitive view of whether these sitemaps are good, bad or irrelevant?
Technical SEO | | ShaMenz0
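The fix that question leans toward, one sitemap per domain, amounts to grouping the URL list by hostname. A minimal sketch with placeholder domains (the real storefront URLs are not given in the question):

```python
from collections import defaultdict
from urllib.parse import urlsplit

def split_by_host(urls):
    """Group URLs by hostname so each domain gets its own sitemap,
    rather than one sitemap referencing both storefronts."""
    by_host = defaultdict(list)
    for url in urls:
        by_host[urlsplit(url).netloc].append(url)
    return dict(by_host)

mixed = [
    "http://store-a.example.com/widget-1",
    "http://store-b.example.com/widget-1",
    "http://store-a.example.com/widget-2",
]
print(split_by_host(mixed))
```

Each resulting group would then be written out as its own sitemap and submitted under the matching domain's Webmaster Tools property.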