Can submitting a sitemap to Google Webmaster Tools improve SEO?
-
Can creating a fresh sitemap and submitting it to Google Webmaster Tools improve SEO?
-
Thanks for sharing, Geoff Andrews!
-
Hi Kelly,
Also consider creating sitemaps if:
- you have alternate versions of the website, like a mobile version
- you are using video content on the website
With video content, a sitemap will help you get a thumbnail in the search results, which can attract more clicks - especially for your how-to articles. If you are using a service like Wistia.com, it's very easy to create a video sitemap.
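For reference, a minimal video sitemap entry follows Google's video sitemap extension; the URLs, title, and description below are placeholders, not values from this thread:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>https://example.com/how-to-article</loc>
    <video:video>
      <video:thumbnail_loc>https://example.com/thumbs/howto.jpg</video:thumbnail_loc>
      <video:title>How-to article video (placeholder title)</video:title>
      <video:description>Short description of the video.</video:description>
      <video:content_loc>https://example.com/videos/howto.mp4</video:content_loc>
    </video:video>
  </url>
</urlset>
```

Services like Wistia generate this for you, but it's the same structure either way: one `video:video` block per video, nested inside the page's `url` entry.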
-
Noted with thanks, Davinia.
Appreciate your advice! -
Refreshing your sitemap won't help SEO unless your existing sitemap was incorrect or you didn't have one at all. If your website adds new pages often, like an eCommerce site, you should use a dynamic XML sitemap (not a static one). This ensures that as your website changes, search engines can find the new pages.
It's also good practice, once you have uploaded your sitemap (via Google Webmaster Tools), to monitor it for a while to make sure all of your pages are being indexed and there are no issues with how your website has been constructed.
Good luck,
Davinia -
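Davinia's suggestion of a dynamic sitemap can be sketched in Python; the page list here is a placeholder for whatever your CMS or product database returns, and you'd regenerate the file on publish or on a schedule:

```python
from datetime import date
from xml.sax.saxutils import escape


def build_sitemap(urls):
    """Build an XML sitemap string from the site's current URL list."""
    entries = []
    for url in urls:
        entries.append(
            "  <url>\n"
            f"    <loc>{escape(url)}</loc>\n"
            f"    <lastmod>{date.today().isoformat()}</lastmod>\n"
            "  </url>"
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "\n".join(entries)
        + "\n</urlset>"
    )


# Because the list is pulled fresh each time, new pages appear automatically.
pages = ["https://example.com/", "https://example.com/new-product"]
print(build_sitemap(pages))
```

The point is that the sitemap is generated from live data rather than maintained by hand, so an eCommerce site adding products never has a stale sitemap.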
Thanks for sharing.
-
Having a sitemap can help, yes, because it tells Google about your site, but there is no reason to resubmit one if you've already got an up-to-date sitemap.
In short -
Up-to-date sitemap: yay
Resubmitting an already up-to-date sitemap: nay
Related Questions
-
If Fetch as Google can render the website, should it appear in the SERPs?
Hello everyone, and thank you in advance for helping me. I have a React.js application built with Create React App (zero configuration). It connects via Axios to an API built with CodeIgniter (PHP). Before using React.js, this website was at the top of Google's SERPs for specific keywords. After switching to React.js and making some URL changes with no redirects in .htaccess or anywhere else, I lost my search engine visibility! I suspect it was caused by a Google penalty. I tried "react-snap", "react-snapshot", and so forth for prerendering, but there are many problems with them. I also tried Prerender.io, but unfortunately my host provider wouldn't help me configure it on the shared host. Finally, I found a great article, and my website now displays in the Rendering box of Fetch as Google, although the dynamic content still doesn't display in the Fetching box. I can see my entire website in both "This is how Googlebot saw the page" and "This is how a visitor to your website would have seen the page" for all pages without any problem. If Fetch as Google can render the entire website, is it possible that my pages will be indexed after a while and appear in Google's SERPs?
Intermediate & Advanced SEO | | hamoz10 -
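Worth noting: the question mentions URL changes with no redirects in .htaccess. Those lost URLs can be pointed at their new locations with 301 redirects; a minimal sketch (the old/new paths are placeholders, not the asker's actual URLs):

```apache
# Permanently redirect old URL patterns to their new React-app equivalents.
RewriteEngine On
RewriteRule ^old-article/(.*)$ /new-article/$1 [R=301,L]
```

Restoring the old URLs' equity via 301s is usually the first step before worrying about prerendering.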
Can you create town-focused landing pages for a website without breaking Google guidelines?
I recently watched a webmaster video that said town-focused landing pages are seen as doorway pages if they only exist to capture search traffic. And then I read that just because you can sell your product/service in a certain area doesn't mean you can have a page for it on your website. Is it possible to create town-focused landing pages for a website without breaking Google guidelines?
Intermediate & Advanced SEO | | Silkstream1 -
Google Webmaster Tools - fixing over 20,000+ crawl errors
Hi, I'm trying to gather all the 404 crawl errors on my website after a recent hacking that I've been trying to rectify and clean up. Webmaster Tools states that I have over 20,000 crawl errors, but I can only download a sample of 1,000. Is there any way to get the full list instead of correcting 1,000 errors, marking them as fixed, and waiting for the next batch of 1,000 to be listed in Webmaster Tools? The current method is quite time-consuming, and I want to take care of all the errors in one shot instead of over the course of a month.
Intermediate & Advanced SEO | | FPK0 -
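One workaround (a sketch, not a Webmaster Tools feature) is to pull the full 404 list from your own server access logs rather than the 1,000-row sample; the regex below assumes the common Apache/Nginx combined log format:

```python
import re
from collections import Counter

# Matches the request path and status code in a combined-format log line.
LOG_RE = re.compile(r'"(?:GET|POST|HEAD) (\S+) HTTP/[\d.]+" (\d{3})')


def collect_404s(lines):
    """Count every URL that returned a 404 in the given log lines."""
    counts = Counter()
    for line in lines:
        m = LOG_RE.search(line)
        if m and m.group(2) == "404":
            counts[m.group(1)] += 1
    return counts


sample = [
    '1.2.3.4 - - [01/Jan/2016:00:00:01 +0000] "GET /old-page HTTP/1.1" 404 512',
    '1.2.3.4 - - [01/Jan/2016:00:00:02 +0000] "GET /home HTTP/1.1" 200 1024',
]
print(collect_404s(sample))  # Counter({'/old-page': 1})
```

Run it over the full log file (`open("access.log")` feeds lines directly) and you get every 404 Googlebot or anyone else hit, not just a sample.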
Google does not index image sitemap
Hi, we put an image sitemap in Search Console/Webmaster Tools: http://www.sillasdepaseo.es/sillasdepaseo/sitemap-images.xml It contains only the indexed products and all images on those pages. We also claimed the CDN in Search Console: http://media.sillasdepaseo.es/ It has been 2 weeks now; Google indexes the pages, but not the images. What can we do? Thanks in advance. Dieter Lang
Intermediate & Advanced SEO | | Storesco0 -
How can I make a list of all URLs indexed by Google?
I started working for this eCommerce site 2 months ago, and my SEO site audit revealed a massive spider trap. The site should have around 3,500 pages, but Google has over 30K pages in its index. I'm trying to find an effective way of making a list of all URLs indexed by Google. Anyone? (I basically want to build a sitemap with all the indexed spider-trap URLs, set up 301s on those, then ping Google with the "defective" sitemap so it can see what the site really looks like and remove those URLs, shrinking the site back to around 3,500 pages.)
Intermediate & Advanced SEO | | Bryggselv.no0 -
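Once you do have an export of the indexed URLs (however you obtain it) and your canonical 3,500-page list, isolating the trap URLs is a simple set difference; the URLs below are hypothetical examples, not from the asker's site:

```python
def find_trap_urls(indexed, canonical):
    """Return indexed URLs that are not part of the real site structure."""
    return sorted(set(indexed) - set(canonical))


indexed = [
    "https://example.com/product-1",
    "https://example.com/product-1?sort=price&page=9999",  # spider-trap variant
]
canonical = ["https://example.com/product-1"]
print(find_trap_urls(indexed, canonical))
# → ['https://example.com/product-1?sort=price&page=9999']
```

The resulting list is exactly what you'd feed into the "defective" sitemap (or a bulk 301 map) described in the question.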
Can 301 redirects that are inaccurate cause Google to suppress rankings?
In an interesting study by DeganSEO titled 'Negative Impact of 301 Redirects - A Case Study', a drop in rankings was observed when popular blog posts were redirected to product pages. One hypothesis is that the suppression is due to the topical difference between the redirected pages (blog posts) and the target page. The topical-difference issue is an interesting one when you consider it in the context of website migrations. We always recommend that 301 redirects are done at the page level, and that if an equivalent page doesn't exist, you 301 anyway to the most logical page. If you think about it, Google is likely to frown on this because a) it's not a good experience for the user - a 404 would be more accurate for them, and b) it's lazy - if you have good content that has gained authority/trust, then create the same content on the new site; don't try to pass it to an entirely different page. Thoughts? Experiences?
Intermediate & Advanced SEO | | QubaSEO0 -
Suggestions for improvement
I am working on the website and feel like we have a lot more original content than competitors, and all of it is above the fold; however, the website doesn't seem to rank higher compared to other sites targeting similar keywords. My URL is: http://www.cypressindustries.com/ Any suggestions for improvement, or which areas should I be focusing on?
Intermediate & Advanced SEO | | HasitR0 -
Random Google?
In 2008 we performed an experiment which showed some seemingly random behaviour by Google (indexation, caching, PageRank distribution). Today I put the results together, analysed the data we had, and got some strange results which hint at the possibility that Google purposely throws in a deviation from normal behaviour here and there. Do you think Google randomises its algorithm to prevent reverse engineering and enable chance discoveries, or is it all a big load-balancing act which produces quasi-random behaviour?
Intermediate & Advanced SEO | | Dan-Petrovic0