Is submitting a sitemap advisable after 1.5 years?
-
Hello,
Is submitting a sitemap advisable after 1.5 years, especially when we find that almost all of our pages are already getting indexed?
We basically have two questions:
- Is submitting a sitemap good for SEO and crawling?
- Does it restrict the crawler to indexing only the submitted pages? We add roughly 250-300 new pages to the site every month.
Based on the above, we would like expert opinion on:
- whether or not to submit a sitemap to Google;
- if yes, how often we need to update the sitemap;
- which sitemap generator sites are good.
-
Depending on your site's hierarchy and structure, a sitemap can make the whole crawling process easier for Google, rather than making the crawler dig through your menus and links to find pages within your site. So in that case, it won't limit Google.
Sitemaps can also include hints about how often a page changes, which helps determine how often Google should revisit that page. If pages that have already been crawled are fairly static and don't change often, set their change frequency to low so you don't spend too much of your crawl bandwidth on already-indexed pages.
Here is a good blog post about setting your sitemap page priority levels:
http://www.seoboy.com/does-setting-priority-and-frequency-in-your-sitemap-help-increase-rankings/
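For reference, here is what a single sitemap entry with the optional change-frequency and priority hints looks like under the sitemaps.org protocol (the URL and date below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- Placeholder URL; substitute your own page -->
    <loc>https://www.example.com/about</loc>
    <lastmod>2012-01-15</lastmod>
    <!-- Hints only; crawlers are free to ignore them -->
    <changefreq>monthly</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>
```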
-
Hi May,
Thanks!
Just a query: when a sitemap is submitted, and new pages and URLs are added later on, will it not limit Google to crawling only the pages in the sitemap and not the additional pages?
Please advise.
-
Hi
I would say yes. I'm of the belief that a sitemap is never a bad idea, and submitting one (even later on) won't hurt anything. It also sounds like you have a rather large site, so a sitemap will help get those new pages added each month crawled. Even if all of your URLs are indexed, you can still use sitemaps to give Google information about other content, such as videos and mobile versions of your site (if any). It should actually help the crawler rather than hinder it.
As far as generator sites go, I don't have a particular one to recommend, but a Google search for "sitemap generator" pulls up a lot of results. You may want to test some of them and see what works best for your needs, as many limit the number of pages crawled.
If you are using a CMS like WordPress or Joomla, there are sitemap plugins to help with this process.
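If none of the hosted generators fit, rolling your own is straightforward. Here is a minimal sketch in Python (the URLs are placeholders and the helper name is my own) that builds a sitemaps.org-compliant document from a list of page URLs:

```python
from xml.etree import ElementTree as ET

def build_sitemap(urls, changefreq="monthly"):
    """Build a sitemap XML string from an iterable of page URLs.

    `changefreq` is only a hint to crawlers; valid values such as
    "daily" or "monthly" are defined by the sitemaps.org protocol.
    """
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
        ET.SubElement(entry, "changefreq").text = changefreq
    return ET.tostring(urlset, encoding="unicode")

# Example with placeholder URLs:
xml = build_sitemap([
    "https://www.example.com/",
    "https://www.example.com/about",
])
```

You would write the returned string to a `sitemap.xml` file at your site root and submit its URL in Google Webmaster Tools.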