Do I submit a sitemap for a highly dynamic site or not? If so, what's the best way to go about doing it?
-
I do SEO for an online boutique marketplace. I've been here for about 4 weeks and no one's done their SEO (they've been around for about 5 years), so there's lots to do. A big concern is whether or not to submit a sitemap, and if I do submit one, what's the best way to go about it.
-
As Bernadette already mentioned, it's very important for a dynamic site to have a sitemap. It ensures that new pages are submitted to Google faster than waiting for Google to discover them on its own.
-
Welcome to Moz! It looks like the site currently has about 169,000 pages indexed in Google. So if that's roughly the number of pages on your site, Google is crawling and indexing it just fine.
Since you brought up the fact that you're dealing with dynamic pages, or dynamic URLs, it is important that you have a sitemap (probably multiple sitemaps) available so that Google can crawl quickly and get the proper URLs indexed.
You currently don't have a sitemap file at https://jane.com/sitemap.xml, which is where it should reside. I also recommend listing the sitemap file(s) in your robots.txt file at https://jane.com/robots.txt.
Your site's web development team will need to auto-generate the sitemap files, which isn't happening right now. Keep each file to no more than 50,000 URLs (the limit per file in the sitemap protocol), as files get unwieldy beyond that. If you're able to generate the files based on certain criteria (such as main pages on the site, categories, or something else), that would be helpful as well.
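To make this concrete, here is a minimal sketch of a sitemap index that could live at https://jane.com/sitemap.xml; the child sitemap filenames and the split by categories/products are hypothetical and would depend on how the development team generates the files:

```
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- hypothetical child sitemaps, each limited to 50,000 URLs -->
  <sitemap>
    <loc>https://jane.com/sitemap-categories.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://jane.com/sitemap-products-1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://jane.com/sitemap-products-2.xml</loc>
  </sitemap>
</sitemapindex>
```

The robots.txt file at https://jane.com/robots.txt would then only need a single extra line, Sitemap: https://jane.com/sitemap.xml, so crawlers can discover the index on their own.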
Related Questions
-
One company, 3 countries, 3 sites - best solution?
Hi all, I'm working with a company that has 3 websites, all on separate WordPress platforms. One is at .com, the others at .fr and .de - they are essentially very similar. I have suggested that it is worth exploring setting all of these websites up on the .com domain with country-specific directories, to combine their authority and help all 3 websites rank naturally thanks to the combined incoming links, authority, etc. Questions: To ensure each country has control of its site, would you maintain a separate install of WordPress at each directory (i.e. .com/fr/ and .com/de/), or would you put it all on the same WordPress install? Would you go down this route of combining all 3 sites onto one domain with country-specific directories? What are the pitfalls?
Technical SEO | Bee1590
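If the country-directory route described above were taken, a common pattern is for each page to declare its language alternates with hreflang; a minimal sketch, with example.com standing in for the real .com domain and the page path invented for illustration:

```
<!-- hypothetical URLs; one block like this on every page, listing all of its alternates -->
<link rel="alternate" hreflang="en" href="https://www.example.com/some-page/" />
<link rel="alternate" hreflang="fr" href="https://www.example.com/fr/some-page/" />
<link rel="alternate" hreflang="de" href="https://www.example.com/de/some-page/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/some-page/" />
```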
Conversion of URLs for Readability
Reading over Rand's latest post about URL structure, I had a quick question about the best way to convert URLs that don't have a perfect structure... Currently our e-commerce store has a structure that is not friendly, e.g. domain.com/product/zdcd-jobd3d-fdoh. What is the easiest way to convert these to readable URLs without causing any disruption in the SERPs? Are we talking about a mod_rewrite in the CMS?
Technical SEO | CMcMullen0
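For the question above, the usual concern for the SERPs is keeping a one-to-one 301 redirect from each old URL to its new readable version, whether that is handled by mod_rewrite, mod_alias, or routing inside the CMS. A minimal Apache sketch, where the readable slug is invented purely for illustration:

```
# Hypothetical mapping: the old machine-generated product URL (taken from the
# question) is 301-redirected to a readable slug invented for this example.
# In practice the CMS's product table would supply the real old-to-new pairs.
Redirect 301 /product/zdcd-jobd3d-fdoh /product/blue-ceramic-mug
```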
Are backlinks the reason for my site's much lower SERP ranking, despite similar content?
Hi all, I'm trying to determine why my site (surfaceoptics.com) ranks so much lower than my competitors' sites. I do not believe the site / page content explains this differential in ranking, and I've done on-site / on-page SEO work without much or any improvement. In fact I believe my site is very similar in quality to competitor sites that rank much higher for my target keyword: hyperspectral imaging. This leads me to believe there is a technical problem with the site that I'm not seeing, or that the answer lies in our backlink profile. The problem is that I've compared our site with 4 of our competitors in Open Site Explorer and I'm not seeing a strong trend when it comes to backlinks either. Some competitors have more links / better backlink profiles, but other sites have no external links to their pages and lower PA and DA, and still outrank us by 30+ positions. How should I go about determining whether the problem is backlinks or some technical issue with the site?
Technical SEO | erin_soc0
Robots.txt crawling URLs we don't want it to
Hello, we run a number of websites, and underneath them we have testing websites (sub-domains); on those sites we have robots.txt disallowing everything. When I logged into Moz this morning I could see the Moz spider had crawled our test sites even though we have said not to. Does anyone have any ideas how we can stop this happening?
Technical SEO | ShearingsGroup0
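On the scenario above: robots.txt is only advisory, and it has to be served from the root of each test subdomain rather than the main domain, since directives don't carry over between hosts. A sketch of a disallow-all file for a test subdomain, assuming Moz's documented crawler names rogerbot and dotbot:

```
# robots.txt at the root of the test subdomain, e.g. test.example.com/robots.txt
# Explicit blocks for Moz's crawlers (user-agent names assumed), plus a catch-all
User-agent: rogerbot
Disallow: /

User-agent: dotbot
Disallow: /

User-agent: *
Disallow: /
```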
What is the best way to optimize a page for a magazine?
Hi, I have a serious problem with a website I am building, http://www.cheapflightsgatwick.com/, with reference to letting the search engines know what the magazine is about. I am building a holiday magazine which will focus on holiday news, cheap deals and holiday reviews. I want the home page to rank for the following keywords: holiday news, holiday magazine, holiday ideas, best holiday deals. The problem is that I have tried putting an introduction on the home page but it looks out of place. So what is the best way for me to let Google know what the site is about and to get it ranking well in the search engines? Any help and advice would be great.
Technical SEO | ClaireH-1848860
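As a loose illustration of the question above (all copy invented for the example), the usual first step is to state the topic in the home page's title tag, meta description and main heading, and keep the on-page introduction short if a long one looks out of place:

```
<head>
  <!-- hypothetical copy, purely illustrative -->
  <title>Holiday Magazine | Holiday News, Ideas &amp; Best Holiday Deals</title>
  <meta name="description"
        content="A holiday magazine covering holiday news, cheap deals and holiday reviews.">
</head>
<body>
  <h1>Holiday news, ideas and the best holiday deals</h1>
  <!-- a short intro paragraph can sit here; longer copy can live further down the page -->
</body>
```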
Best way to handle different views of the same page?
Say I have a page: mydomain.com/page. But I also have different views: /?sort=alpha, /print-version, /?session_ID=2892, etc. All the same content, more or less. Should the subsequent pages have a robots meta tag with noindex? Should I use a canonical? Both? Thanks!
Technical SEO | ChatterBlock0
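A common pattern for the situation above is to point the sorted and session-ID views back at the main URL with a canonical, and reserve the noindex robots meta tag for views that shouldn't surface at all (such as a print version); a small sketch using the mydomain.com/page example from the question:

```
<!-- on /page?sort=alpha and /page?session_ID=2892 -->
<link rel="canonical" href="https://mydomain.com/page" />

<!-- on the print version, if it should stay out of the index entirely -->
<meta name="robots" content="noindex, follow" />
```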
How to Submit XML Site Map with more than 300 Subdomains?
Hi, I am creating sitemaps for a site which has more than 500 subdomains. Pages vary from 20 to 500 across the subdomains, and more will keep being added in the coming months. I have seen sites that create a separate sitemap.xml for each subdomain and reference it in that subdomain's own robots.txt file, e.g. http://windows7.iyogi.com/robots.txt, with the XML sitemap for that subdomain at http://windows7.iyogi.com/sitemap.xml.gz. Currently my website has only one robots.txt file for the main domain and subdomains. Please tell me, should I create a separate robots.txt and XML sitemap file for each subdomain, or one file? Creating a separate XML sitemap for each subdomain is not feasible, as we would have to verify each one in GWT separately. Is there any automatic way, and do I have to ping separately if I add new pages on a subdomain? Please advise me.
Technical SEO | vaibhav45
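As a rough sketch of the per-subdomain setup described in the question above (hostnames reused from the question, everything else illustrative): each subdomain serves its own robots.txt from its own root, pointing at its own sitemap:

```
# http://windows7.iyogi.com/robots.txt
User-agent: *
Allow: /

Sitemap: http://windows7.iyogi.com/sitemap.xml.gz
```

For pinging, a plain GET request such as http://www.google.com/ping?sitemap=http://windows7.iyogi.com/sitemap.xml.gz notifies Google of an updated sitemap, so the same job that regenerates a subdomain's sitemap can send the ping automatically rather than resubmitting by hand.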