Submitting a page to Google Search Console or Bing Webmaster Tools with nofollow tags
-
Hello,
I was hoping someone could help me understand whether there is any point in submitting a domain or subdomain to Google Search Console (formerly Google Webmaster Tools) and Bing Webmaster Tools if the pages (on the subdomain, for example) all have nofollow/noindex tags, or are being blocked by the robots.txt file.
There are some pages from a data feed on a subdomain I manage that have the characteristics above, which I cannot change. I am wondering whether it is better to simply exclude those from GSC and BWT, thereby eliminating the errors and warnings they generate, or whether it is better to tell Google and Bing about them anyway, on the chance that those nofollow/noindex pages may be indexed or contextualised in some way that makes it worth the effort.
Many thanks!
Mark -
Hi Mark,
Sure, I would still submit these kinds of domains to Google Search Console, as verifying a domain doesn't say anything about the indexation of its URLs. It's a monitoring tool which you can also use to submit URLs for indexing, but that isn't required. We have certain subdomains and root domains in there for our CDN, for example, that don't provide us with any additional SEO benefit, but we still want to be able to monitor some metrics for them.
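For reference, the directives being discussed look like this (the values are illustrative). A noindex/nofollow directive lives in the page's HTML head:

```html
<!-- Tells crawlers not to index this page and not to follow its links -->
<meta name="robots" content="noindex, nofollow">
```

Blocking by robots.txt is a separate mechanism (e.g. a `Disallow: /feed/` rule under `User-agent: *`), and the two don't combine well: if a page is blocked in robots.txt, crawlers may never fetch it and so never see the noindex tag at all.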
Related Questions
-
Do home pages carry more SEO benefit than other pages?
Hi, I would like to include my keywords in the URL, and they are under 50 characters. Is there anything in the algorithm that tells engines to give more importance to the homepage?
White Hat / Black Hat SEO | alan-shultis -
Do Google and other search engines crawl meta tags if we set them using React.js?
We have a site which has only one URL; all other pages are components of it rather than separate pages. Whichever page we click is rendered with React.js, and the meta title and meta description change accordingly. Will using React.js this way be good or bad for SEO? Website: http://www.mantistechnologies.com/
White Hat / Black Hat SEO | RobinJA -
What can I put on a 404 page?
When it comes to SEO, what can I put on a 404 page? I want to add content that actually makes the page useful, so visitors will be more likely to stay on the website. Most pages just have a big image of "404" and a couple of sentences saying what happened. I am wondering if Google would like it if there were blog suggestions or navigation functions?
White Hat / Black Hat SEO | JoeyGedgaud -
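As one sketch of the kind of useful 404 page the question describes (the links and copy here are hypothetical examples, not anything prescribed by Google):

```html
<!-- Hypothetical 404 template: give visitors somewhere to go instead of a dead end -->
<main>
  <h1>Page not found (404)</h1>
  <p>The page you requested may have been moved or removed. Try one of these instead:</p>
  <form action="/search" method="get">
    <input type="search" name="q" placeholder="Search this site">
    <button type="submit">Search</button>
  </form>
  <ul>
    <li><a href="/blog/">Recent blog posts</a></li>
    <li><a href="/sitemap/">Site map</a></li>
    <li><a href="/contact/">Contact us</a></li>
  </ul>
</main>
```

Whatever the content, the page should be served with an HTTP 404 status code; returning 200 risks the URL being treated as a soft 404 or indexed.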
One page with multiple sections - unique URL for each section
Hi All, This is my first time posting to the Moz community, so forgive me if I make any silly mistakes. A little background: I run a website for a company that makes custom parts out of specialty materials. One of my strategies is to make high-quality content about all areas of these specialty materials to attract potential customers: pretty straightforward stuff.

I have always struggled with how to structure my content. From a usability point of view, I like having just one page for each material, with different subsections covering different topical areas. Example: for a special metal material I would have one page with subsections about the mechanical properties, thermal properties, available types, common applications, etc. Basically how Wikipedia organizes its content. I do not have a large amount of content for each section, but as a whole it makes one nice, cohesive page for each material. I do use H tags to mark the specific sections on the page, but I am wondering if it may be better to have one page dedicated to the specific material properties, one page dedicated to specific applications, and one page dedicated to available types.

What are the community's thoughts on this? As a user of the website, I would rather have all of the information on a single, well-organized page for each material. But what do SEO best practices have to say about this?

My last thought would be to create a hybrid website (I don't know the proper term). Have a look at these examples from Time and Quartz. When you are viewing an article, the URL is unique to that page. However, when you scroll to the bottom of the article, you can keep on scrolling into the next article, with a new unique URL, all without clicking through to another page. I could see this technique being ideal for a good web experience while still allowing me to optimize my content for more specific topics/keywords. If I used this technique with the canonical tag, would I then get the best of both worlds? Let me know your thoughts! Thank you for the help!
White Hat / Black Hat SEO | jaspercurry -
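To make the single-page option concrete, here is a minimal sketch (the domain, material name, and section ids are made up): one URL per material with a self-referencing canonical, and H2 sections that are individually linkable via fragment anchors:

```html
<!-- Hypothetical: https://example.com/materials/special-metal -->
<head>
  <title>Special Metal: Properties, Types, and Applications</title>
  <link rel="canonical" href="https://example.com/materials/special-metal">
</head>
<body>
  <h1>Special Metal</h1>
  <h2 id="mechanical-properties">Mechanical properties</h2>
  <h2 id="thermal-properties">Thermal properties</h2>
  <h2 id="available-types">Available types</h2>
  <h2 id="applications">Common applications</h2>
</body>
```

In the Time/Quartz-style hybrid, the next article's markup is loaded as you scroll and the address bar is updated with `history.pushState()`; each article keeps its own canonical URL, so the canonical tag reflects whichever article is in view rather than pointing everything at a single page.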
Why should I reach out to webmasters before disavowing links?
Almost all the blogs, and Google themselves, tell us to reach out to webmasters and request that the offending links be removed before using Google's Disavow tool. None of the blogs, nor Google, explain why you "must" do this; it's time-consuming, and many webmasters don't care and don't act. Why is this a "required" thing to do?
White Hat / Black Hat SEO | RealSelf -
Has anyone had experience with the Disavow Links tool? If so did you notice positive results from it?
I recently noticed a large number of backlinks from low-authority directories coming in for one of my clients. These links were either purchased by a competitor or placed by a directory service site that knows we might be willing to pay to have bad links removed. I've contacted the website admin, and they require a payment of $0.30 per link to have them removed from their directory. Has anyone had a similar experience? I'm also considering using the Disavow tool, but I've heard the outcome of using this tool is usually bad. I'd appreciate any feedback, thanks!
White Hat / Black Hat SEO | Leadhub -
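For context on the tool itself: the disavow file is just a plain-text list uploaded in Search Console, with one URL or `domain:` rule per line and `#` comments; the domains below are invented examples:

```text
# Paid directory links the webmaster would not remove for free
domain:low-quality-directory.example
http://another-directory.example/listing/1234
```

Disavowing at the `domain:` level is usually the safer choice for directories, since the same listing often exists at several URLs on the same site.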
How does Google rank a website's search query pages?
Hello, I can't seem to find an answer anywhere. I was wondering how a website's search-query keyword-string URL can rank above other page results that have stronger backlinks. The domain is usually strong, but that URL with the .php?search=keyword just seems like it doesn't fit in. How does Google index those search string pages? Is it based on traffic to that URL alone? Those URLs typically don't have backlinks, right? Has anyone ever tried to rank their website's search query URLs? I'm just a little curious about it. Thanks, everyone. Jesse
White Hat / Black Hat SEO | getrightmusic -
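A related aside: internal search result URLs like `.php?search=keyword` can get indexed once something links to them, and sites that don't want them crawled commonly block the parameter in robots.txt (the parameter name here is an assumed example matching the question):

```text
User-agent: *
# Block crawling of internal search result pages
Disallow: /*?search=
```

The `*` wildcard in the path is supported by Google and Bing, though it was not part of the original robots.txt standard.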
Single Domain With Different Pages Deep Linking To Different Pages On External Domain
I've been partaking in an extensive trial study and will be releasing the results soon. I do have quite a strong indication of the answer to this question, but I would like to see what everyone else thinks first, to see where the common industry mindset is at. Let's say SiteA.com/page1.html is PR5 and links out to SiteB.com/page1.html. This, of course, would count as a valuable backlink. Now, what would happen if SiteA.com/page2.html, which is also PR5, links out to SiteB.com/page2.html? The link from SiteA is coming from a different page and points to a different deep link on SiteB, but it comes from the same IP address. What would the benefit be of having multiple deep links in this way (as outlined above; please read it carefully before responding), as opposed to having just a single deep link from the domain? If a benefit does exist, does it start to become trivial? This has nothing to do with sitewide links. Serious answers only, please.
White Hat / Black Hat SEO | stevenheron