Is submitting a sitemap advisable after 1.5 years?
-
Hello,
Is submitting a sitemap advisable after 1.5 years, especially when we find that almost all of our pages are already getting indexed?
We basically have two questions:
- Is submitting a sitemap good for SEO and crawling?
- Does it hinder the crawler by limiting it to indexing only the submitted pages? We add roughly 250-300 new pages to the site every month.
Based on the above, we seek expert opinion on:
- whether or not to submit a sitemap to Google
-
If yes, how often do we need to update the sitemap?
-
Which are good sitemap generator sites?
-
Depending on your site's hierarchy and structure, a sitemap may make the whole crawling process easier for Google rather than making the crawler dig through your menus and links to find pages within your site. So in that case, it won't limit Google.
Sitemaps can also include hints about how often a page changes, which help determine how often Google should revisit that page. If your pages that have already been crawled are fairly static and do not change often, set the change frequency to low so you don't have to worry about spending too much of your crawl budget on already-indexed pages.
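For reference, a minimal sitemap entry carrying those hints might look like this (the URL, date, and values are placeholders, not anything from your site):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/static-page/</loc>
    <lastmod>2012-01-15</lastmod>
    <!-- hint that this page rarely changes, so infrequent recrawls are fine -->
    <changefreq>monthly</changefreq>
    <priority>0.3</priority>
  </url>
</urlset>
```

Note that `changefreq` and `priority` are hints, not commands; search engines are free to ignore them.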
Here is a good blog post about setting your sitemap page priority levels:
http://www.seoboy.com/does-setting-priority-and-frequency-in-your-sitemap-help-increase-rankings/
-
Hi May,
Thanks!
Just a query: when a sitemap is submitted, and new pages and URLs are added later on, will it not limit Google to crawling only the pages listed in the sitemap and not the additional pages?
Please advise.
-
Hi
I would say yes. I'm of the belief that a sitemap is never a bad idea, and submitting one (even later on) won't hurt anything. It also sounds like you have a rather large site, so this will help get those new pages added each month crawled. Even if all of your URLs are indexed, you can still use sitemaps to provide Google with information about other content such as videos and mobile versions of your site (if any). It should actually help the crawler rather than hinder it.
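Besides submitting through Webmaster Tools, you can also point crawlers at your sitemap from robots.txt; the sitemaps.org protocol supports a `Sitemap:` directive (the URL below is a placeholder):

```
Sitemap: https://www.example.com/sitemap.xml
```

Crawlers that support the protocol will pick this up on their next visit, which also covers engines you haven't submitted to directly.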
As far as generator sites, I don't have a particular one to recommend, but doing a Google search for "sitemap generator" pulls up a lot of results. You may want to test some of those and see what works best for your needs, as many of them limit the number of pages crawled.
If you are using a CMS like WordPress or Joomla, there are sitemap plugins to help with this process.
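If you'd rather script it than rely on a third-party generator, here is a minimal sketch in Python using only the standard library. The domain and paths are hypothetical placeholders; in practice you would feed in your own URL list (e.g. from a crawl or your CMS database):

```python
# Minimal sitemap generator sketch: builds sitemap XML from a list of URL paths.
# The base URL and paths are placeholders; swap in your own site's URLs.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(base_url, paths, changefreq="monthly"):
    """Return a sitemap XML string for the given URL paths."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for path in paths:
        url = ET.SubElement(urlset, "url")
        # Join base URL and path without doubling the slash
        ET.SubElement(url, "loc").text = base_url.rstrip("/") + "/" + path.lstrip("/")
        ET.SubElement(url, "changefreq").text = changefreq
    return ET.tostring(urlset, encoding="unicode")

if __name__ == "__main__":
    print(build_sitemap("https://www.example.com", ["/", "/about/", "/products/"]))
```

You would then write the returned string to `sitemap.xml` at your site root and regenerate it whenever your monthly batch of new pages goes live.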