Is it possible to submit an XML sitemap to Google without using Google Search Console?
-
We have a client that will not grant us access to their Google Search Console (don't ask us why).
Is there any way to submit an XML sitemap to Google without using GSC?
Thanks
-
Rosemary, please tell us how well this method works. It's hard nowadays to rely on free platforms.
-
Perfect, just what I needed. I just hope these "old school" ping efforts still work. The client won't let us access their Google Search Console, and yet we need their website crawled ASAP.
Thanks!
-
Enter the full HTTP address of your sitemap at https://www.xml-sitemaps.com/validate-xml-sitemap.html, then press Validate.
On the next page you can press Notify.
-
Rosemary, Gaston is right: we generally list our sitemap URLs in the robots.txt file, which is typically enough for the search engine crawlers to find them. Keep in mind, though, that a sitemap file (or files) isn't strictly required at all if you have a really good site structure.
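For reference, the robots.txt approach mentioned above looks like this (the sitemap URL is a placeholder; use your own):

```text
User-agent: *
Allow: /

# The Sitemap directive can appear anywhere in robots.txt and
# must use the full, absolute URL of the sitemap file.
Sitemap: https://www.example.com/sitemap.xml
```

Crawlers that fetch robots.txt will discover the sitemap from this line on their own, with no manual submission needed.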
-
Thank you very much. Do you know where I can find more information about the HTTP ping? The Google articles don't really provide step-by-step instructions on how to do this.
-
Hello Rosemary,
Yep, it is possible to tell Google about your sitemap. In this article (official Webmaster Central), they offer three options:
- Via Search Console.
- Via the robots.txt file.
- Using an HTTP ping.
Hope I've helped.
Best of luck.
GR.
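To make the HTTP ping option concrete, here is a minimal Python sketch. The endpoint shown is the one Google documented for sitemap pings (Google has since deprecated it, so treat this as a legacy approach); the sitemap URL is a placeholder:

```python
from urllib.parse import quote

def build_ping_url(sitemap_url: str) -> str:
    # Google's documented sitemap ping endpoint: the sitemap URL is
    # passed as a single percent-encoded query parameter.
    return "https://www.google.com/ping?sitemap=" + quote(sitemap_url, safe="")

# Placeholder sitemap URL for illustration.
ping_url = build_ping_url("https://www.example.com/sitemap.xml")
print(ping_url)
# Sending a plain GET request to ping_url (e.g. with
# urllib.request.urlopen) is all the "ping" consists of; an
# HTTP 200 response means the request was received.
```

Bing supported the same `/ping?sitemap=` pattern on its own domain, so the same helper works there by swapping the host.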