Placing a robots.txt in the root of the subdomain will do it:
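# Block all crawlers from every URL on this subdomain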
User-agent: *
Disallow: /
Here's some info on noopener; I only noticed this coming up very recently on a Lighthouse report. As I understand it, adding rel="noopener" to a target="_blank" link tells the browser it can run the linked page in a separate process rather than the existing one, and it stops the new page getting a window.opener reference back to yours, so a very heavy (or malicious) linked page can't hurt the performance or security of your own page.
https://developers.google.com/web/tools/lighthouse/audits/noopener
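For anyone wanting the fix, it's just an attribute on the outbound link, something like this (example.com is a placeholder):
<a href="https://example.com/" target="_blank" rel="noopener">External site</a>
If you need older browsers covered, rel="noopener noreferrer" works too, as some only understand noreferrer.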
You may be right about Google+, but that said, the Google News guidelines do suggest they may use Google+ to 'better surface content':
"Add or edit your Google+ Page URL. We may use publicly available information from your Google+ page to deliver a better news experience and to better surface your content. "
https://support.google.com/news/publisher/answer/4581428?hl=en-GB
Although I think it depends a little on how 'important' Google thinks your site is, it crawls sites with regularly updated content very frequently, so I'd suspect that as long as you're linking to this content from somewhere prominent on your site, you shouldn't have an issue.
You could also use a sitemap to tell Google about fresh content, and perhaps consider applying to Google News if you can; you could then use a news sitemap, which Google will certainly be very quick to check and spider new URLs from. The Google News advice also suggests posting content to Google+, so I'd post new content there too, as it's another way of telling Google you have something fresh for them to spider.
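For illustration, here's what a minimal sitemap looks like under the sitemaps.org protocol (the URL and values are just placeholders):
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/latest-article</loc>
    <lastmod>2011-10-20</lastmod>
    <changefreq>daily</changefreq>
  </url>
</urlset>
A news sitemap is the same idea with an extra news:news element per URL, and you can submit either through Webmaster Tools.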
Hi, as Google is displaying more and more weather forecasts within results, we'd like to explore whether there's any possibility of exposing our API to Google so they can use us as one of the providers for that data.
At the moment it appears they use weather.com, Weather Underground and maybe also AccuWeather (although I've not seen them mentioned there for a while), but I'm not sure whether this is some sort of commercial agreement, or whether those providers have simply given Google access to their APIs in return for the link in the weather panel in the search results.
Does anyone have any information about this sort of thing (I assume weather isn't unique in this respect), or know of any way to contact Google to find out, please?
Thanks
Paul
Thanks, although their blog post is even more worrying, as they'll be stripping all keyword info from searches where the user is signed in to Google. (Will try to post the link properly this time.)
http://analytics.blogspot.com/2011/10/making-search-more-secure-accessing.html
Seems like SEO is about to get that much harder:
http://analytics.blogspot.com/2011/10/making-search-more-secure-accessing.html
Any thoughts on this?
Although I think it depends a little on how 'important' it thinks your site is, I think google crawls sites with regularly updated content very frequently, so I'd suspect that as long as you're linking to this content from somewhere prominent on your site, you shouldn't have an issue.
You could also use a sitemap to tell google about fresh content, perhaps also consider applying to google news if you can, and then use a news sitemap which google will certainly be very quick to check and spider new urls from. In the advice for google news, they also suggest posting content to google+, so I'd also use that to post new content to, as it's another method of telling google you have something fresh for them to spider.