Sitemap blocking or not blocking, that is the question?
-
Hi from wet & overcast Wetherby, UK.
One's question is this...
"Are the sitemap's plus (+) boxes blocking bots, i.e. can they not get past this page: http://www.langleys.com/Site-Map.aspx?"
It's just the + boxes that concern me; I remember reading somewhere that JavaScript navigation can be toxic.
Is there a way to test JavaScript nav set-ups and see whether they block bots or not?
Thanks in advance
-
I use Screaming Frog SEO Spider (the free version) to check the internal link structure of a website; if a page is blocking all spiders, it will pick that up.
I would also check Google Webmaster Tools to see whether there are any crawl errors.
And the last thing I would add is to make sure that you have a non-JavaScript way to find all the pages on your website - through strong internal linking or a manual sitemap page that isn't generated through JS.
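If you want to test a JavaScript navigation yourself, one rough check is to fetch the raw HTML (what a crawler sees before any JavaScript runs) and list the links it contains. A minimal sketch in Python, using the requests and BeautifulSoup libraries, pointed at the sitemap page from the question:

```python
import requests
from bs4 import BeautifulSoup

# Fetch the raw, unrendered HTML - roughly what a crawler sees
# before any JavaScript executes.
url = "http://www.langleys.com/Site-Map.aspx"
html = requests.get(url, timeout=30).text

# Collect every anchor href present in the source.
soup = BeautifulSoup(html, "html.parser")
links = sorted({a["href"] for a in soup.find_all("a", href=True)})

print(f"{len(links)} links found in the raw HTML")
for link in links:
    print(link)
```

If the pages hidden behind the + boxes show up in that list, the links exist in the source and the boxes are only a visual collapse; if they only appear after the JavaScript runs, a non-rendering crawler won't follow them.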
Hope this helps
Related Questions
-
URL structuring / redirect question
Hi there, I have a URL structuring / redirect question. I have many pages on my site, but I set each page up to fall under one of two folders, as I serve two unique markets and want each side to be indexed properly. I have SIDE A: www.domain.com/FOLDER-A and SIDE B: www.domain.com/FOLDER-B. The problem is that I have a page for www.domain.com and for www.domain.com/FOLDER-A/page1, but I do NOT have a page for www.domain.com/FOLDER-A. The reason for this is that I've opted to make what would be www.domain.com/FOLDER-A be www.domain.com instead, acting as the primary landing page of the site. As a result, there is no page located at www.domain.com/FOLDER-A. My WordPress template (Divi by Elegant Themes) forced me to create a blank page to be able to build off the FOLDER-A framework. My question is: given that I am forced to have this blank page, do I leave it be, or create a 302 or 307 redirect to www.domain.com? I fear using a 301 redirect given that I may want to use this page for content at some point in the future. This isn't the easiest post to follow, so please let me know if I need to restate the question. Many thanks in advance!
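Not an answer to the 301-vs-302 choice itself, but if a temporary redirect is the route taken and the host runs Apache, a rule along these lines would send only the bare folder URL to the homepage without touching the pages underneath it (the folder and domain names here are the placeholders from the question, not real paths):

```apache
# Hypothetical .htaccess rule: 302 (temporary) redirect for the bare
# /folder-a/ URL only; /folder-a/page1 and other subpages are untouched.
RedirectMatch 302 ^/folder-a/?$ https://www.domain.com/
```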
Technical SEO | KurtWSEO
-
Bing rankings question
Hi, We just wrapped up a website redesign about a month ago. The content stayed largely the same. Once we launched the new site, all of our rankings in Google stayed the same, but we lost rank for all competitive keywords on Bing. I looked in Bing Webmaster Tools and it doesn't show any penalties, but it does show that we have too many H1 tags. I don't think the H1 tag issue is the cause, but maybe it is. Do you know what could be causing this?
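As a quick sanity check on the "too many H1 tags" warning, a short script can count the H1s on any page of the new design (a sketch in Python with requests and BeautifulSoup; the URL is a placeholder):

```python
import requests
from bs4 import BeautifulSoup

# Placeholder URL - point this at a page Bing Webmaster Tools flagged.
url = "https://www.example.com/some-page"
html = requests.get(url, timeout=30).text

# Count and print every <h1> in the rendered-free source.
h1s = BeautifulSoup(html, "html.parser").find_all("h1")
print(f"{len(h1s)} <h1> tags found")
for h1 in h1s:
    print(h1.get_text(strip=True))
```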
Technical SEO | BT2009
-
XML Sitemap Generators
I am looking to use a different sitemap generator that can do 5 thousand or more pages at once. Any recommendations? Thanks guys.
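If no off-the-shelf tool fits, rolling a basic generator is straightforward once you have a list of URLs (from a crawl export, a CMS database, etc.). A minimal sketch; the input file name is a placeholder:

```python
from xml.sax.saxutils import escape

# Placeholder input: one URL per line, e.g. exported from a crawler.
with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

# Build one <url> entry per line, XML-escaping each location.
entries = "\n".join(
    f"  <url>\n    <loc>{escape(u)}</loc>\n  </url>" for u in urls
)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>\n"
)

with open("sitemap.xml", "w") as f:
    f.write(sitemap)

print(f"Wrote {len(urls)} URLs to sitemap.xml")
```

The protocol caps a single sitemap file at 50,000 URLs, so 5,000+ pages fit comfortably in one file.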
Technical SEO | Chenzo
-
Using a single sitemap for multiple domains
We have a possible duplicate content issue based on the fact that we have a number of websites run from the same code base across .com / .co.uk / .nl / .fr / .de and so on. We want to update our sitemaps alongside using hreflang tags to ensure Google knows we've got different versions of essentially the same page to serve different markets. Google has written an article on tackling this: https://support.google.com/webmasters/answer/75712?hl=en but my question remains whether having a single sitemap accessible from all the international domains is the best approach here, or whether we should have individual sitemaps for each domain.
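Whichever way the files are split, the sitemap-based hreflang markup Google describes in that article looks roughly like this for each URL (the domains below are placeholders standing in for the .com / .co.uk / .nl variants):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.example.com/widgets</loc>
    <!-- Every language/country variant is listed as an alternate,
         including the page itself. -->
    <xhtml:link rel="alternate" hreflang="en" href="http://www.example.com/widgets"/>
    <xhtml:link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/widgets"/>
    <xhtml:link rel="alternate" hreflang="nl" href="http://www.example.nl/widgets"/>
  </url>
</urlset>
```

One practical wrinkle with per-domain sitemaps is that each file normally only lists <loc> URLs for its own host (unless the domains are cross-verified), so the same set of alternates ends up repeated across the files.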
Technical SEO | jon_marine
-
Block Domain in robots.txt
Hi. We had some URLs that were indexed in Google from a www1 subdomain. We have now disabled the URLs (returning a 404 - for other reasons we cannot do a redirect from www1 to www) and blocked them via robots.txt. But the number of indexed pages keeps increasing (for 2 weeks now). Unfortunately, I cannot install Webmaster Tools for this subdomain to tell Google to back off... Any ideas why this could be and whether it's normal? I can send you more domain info by personal message if you want to have a look at it.
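For reference, robots.txt is read per host, so a file served at the root of the www1 subdomain along these lines only affects that subdomain (the hostname here is a placeholder, not the actual domain):

```
# Hypothetical robots.txt served at http://www1.example.com/robots.txt
# robots.txt applies per host, so this blocks crawling of www1 only.
User-agent: *
Disallow: /
```

Worth remembering that robots.txt controls crawling, not indexing: once the 404s are blocked from being recrawled, already-indexed URLs can linger in the index for some time.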
Technical SEO | zeepartner
-
Blocked by robots
My client's GWT account has a number of notices for "blocked by meta-robots" - these are all either blog posts, categories, or tags. His former SEO told him this: "We've activated the following settings: Use noindex for Categories, Use noindex for Archives, Use noindex for Tag Archives, to reduce keyword stuffing & duplicate post tags. Disabling all 3 noindex settings above may remove the Google blocks but will also send too many similar tags, post archives/categories." Is this guy correct? What would be the problem with indexing these? Am I correct in thinking they should be indexed? Thanks
Technical SEO | Ezpro9
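For context, the plugin settings being described typically emit a tag like the one below on category, archive, and tag pages, which is exactly what the GWT "blocked by meta-robots" notices report (an illustrative snippet, not pulled from the client's site):

```html
<!-- Typical output of a "noindex categories/archives/tags" plugin setting -->
<meta name="robots" content="noindex,follow">
```
-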
Meta tags question - imagetoolbar
We inherited some sites from another vendor & they have these tags in the head of all pages. Are they of any value at all? Thanks for the help! Wick Smith
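The tags themselves were stripped out of the post, but going by the title they are presumably the old Internet Explorer image-toolbar directive, something like:

```html
<!-- Presumed from the question title: disables IE6/IE7's hover image toolbar.
     It has no effect on other browsers. -->
<meta http-equiv="imagetoolbar" content="no">
```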
Technical SEO | wcksmith
-
More than 1 XML Sitemap
I recently took over administration of my site and I have 2 XML sitemaps for my main site and 1 XML sitemap for my blog (which is a sub-page of the main site). Don't I only need 1 sitemap for my site and one for my blog? I don't know which one to delete - they both have the same page authority. Also, only 1 of them is accessible in a browser:
http://www.rmtracking.com/rmtracking-sitemap.xml - accessible in a browser
http://www.rmtracking.com/sitemap.xml - regularly updated in Google Webmaster Tools but not accessible in a browser
I don't have any error messages in Webmaster Tools.
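For reference, several sitemaps can also be tied together under a single sitemap index file, so only the index needs to be submitted. A minimal sketch using the one sitemap URL from the question plus a placeholder path for the blog sitemap:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.rmtracking.com/rmtracking-sitemap.xml</loc>
  </sitemap>
  <!-- Placeholder path - substitute the blog's actual sitemap URL. -->
  <sitemap>
    <loc>http://www.rmtracking.com/blog/sitemap.xml</loc>
  </sitemap>
</sitemapindex>
```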
Technical SEO | BradBorst