Do we need to manually submit a sitemap every time, or can we host it on our site at /sitemap so Google will see and crawl it?
-
I realized we don't have a sitemap in place, so we're going to get one built. Once we do, I'll submit it manually to Google via Webmaster Tools.
However, we have a very dynamic site with content constantly being added. Will I need to keep manually re-submitting the sitemap to Google? Or could we have the continually updated sitemap live on our site at /sitemap and have the crawlers just pick it up from there? I noticed this is what SEOmoz does at http://www.seomoz.org/sitemap.
-
Thanks, Ryan. I was confusing those.
To implement the sitemap index, would I just point the crawlers to the index? Do you have any links to overviews of how to set that up?
-
There are also scripts you can purchase for very little cost; the vendor will install one on your server and set up a cron job so your sitemap regenerates automatically each week and pings the search engines to find it.
One such service is xml-sitemaps.com - they can install the script for you and set up the cron job as well.
Just make sure you are on a good server that can handle the script if your website is large.
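As a rough sketch of what that setup ends up looking like (the script path, domain, and schedule below are placeholders, not from any particular vendor):

```sh
#!/bin/sh
# rebuild-sitemap.sh -- hypothetical wrapper; paths and domain are placeholders.
# Run it weekly from cron, e.g. with a crontab entry like:
#   0 3 * * 1 /usr/local/bin/rebuild-sitemap.sh

# 1) Run whatever sitemap generator script you installed.
php /var/www/example.com/generator/runcrawl.php

# 2) Ping the search engines so they know the sitemap has changed
#    (the long-standing Google/Bing ping endpoints; check current support).
curl -s "https://www.google.com/ping?sitemap=http%3A%2F%2Fwww.example.com%2Fsitemap.xml"
curl -s "https://www.bing.com/ping?sitemap=http%3A%2F%2Fwww.example.com%2Fsitemap.xml"
```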
-
I think you may be getting XML sitemaps confused with sitemap pages. Your XML sitemap should live at /sitemap.xml, as Alan pointed out. The /sitemap page on SEOmoz and other sites serves a different purpose: it's not your XML file, it's a "topical guide" to your website and all the major sections of your site.
Remember that you can also create an XML sitemap index if you need separate sitemaps (video, news, content); it houses all the individual XML sitemaps underneath it.
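For example, a sitemap index is just a small XML file that lists your other sitemaps, and you point the crawlers (and Webmaster Tools) at the index - the filenames and domain below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Sketch of a sitemap index; example.com and the file names are placeholders. -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-content.xml</loc>
    <lastmod>2012-01-15</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-video.xml</loc>
    <lastmod>2012-01-15</lastmod>
  </sitemap>
</sitemapindex>
```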
-
Typically I use /sitemap.xml
I don't think it matters what you call it, as long as it's submitted to Google Webmaster Tools.
Check out sitemaps.org for more info on how to create a quality sitemap.
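As a minimal illustration of the format sitemaps.org describes (the URL and values are placeholders), a basic sitemap.xml looks like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal sitemap sketch; the URL, date, and changefreq are placeholder values. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/some-page/</loc>
    <lastmod>2012-01-15</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```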
-
Great, thanks!
Should we also have it live at /sitemap, as SEOmoz does?
-
Yes, they will pick it up. You can put the sitemap address in your robots.txt file (see http://thatsit.com.au/robots.txt for an example); then you don't need to submit it, and all search engines will find it.
Keep it up to date, free from 404s and 301s - all URLs should return status code 200 - and keep it accurate. Bing, for one, will ignore it if it is not clean and accurate; they allow only a couple of percent error rate.
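In practice that's just one extra line in robots.txt (example.com stands in for your own domain):

```
# robots.txt sketch -- example.com is a placeholder for your own domain
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml
```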