Should I submit a sitemap for a site with dynamic pages?
-
I have a coupon website (http://couponeasy.com)
Being a coupon website, my content is always changing: new coupons are added and expired deals are removed automatically. I wish to create a sitemap, but I realised there is not much point in creating one for all pages, as they will be removed sooner or later and/or are canonicalised to other URLs.
I have about 8-9 pages which are static, and hence I can include them in the sitemap.
Now the question is....
If I create the sitemap for these 9 pages and submit it to Google Webmaster Tools, will the Google crawlers stop indexing the other pages?
NOTE: I need to create the sitemap to get expanded sitelinks.
-
Hi Anuj -
I think you are operating from a false assumption that is going to hurt your organic traffic (I suspect it already has).
The XML sitemap is one of the very best ways to tell the search engines about new content on your website. By not putting your new coupons in the sitemap, you are denying the search engines one of the strongest signals possible that new content is there.
Of course, you have to automate your sitemap and have it update as often as possible. Depending on the size of your site and therefore the processing time, you could regenerate it hourly, every 4 hours, something like that. If you need recommendations for automated sitemap tools, let me know. I should also point out that you should specify the frequency with which the URLs are updated (and you should keep static URLs, even for your coupons, if possible). This will be a big win for you.
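To make that concrete, here is a minimal sketch of what a generated sitemap entry might look like; the coupon URL and dates are hypothetical placeholders, not your actual site structure:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; regenerate this file on a schedule (e.g. hourly) -->
  <url>
    <loc>http://couponeasy.com/coupons/example-store</loc> <!-- hypothetical URL -->
    <lastmod>2015-01-15</lastmod>                          <!-- last time this page's deals changed -->
    <changefreq>hourly</changefreq>                        <!-- how often crawlers should expect updates -->
  </url>
</urlset>
```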
Finally, if you want to make sure your static pages are always indexed, or want to keep an eye on different types of coupons, you can create separate sitemaps referenced from your main sitemap.xml (a sitemap index) and segment them by type: static-pages-sitemap.xml, type-1-sitemap.xml, etc. This way you can monitor indexation by type.
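A sketch of the sitemap index that would tie those segmented files together (the file names follow the examples above; dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Each child sitemap can be monitored separately in Webmaster Tools -->
  <sitemap>
    <loc>http://couponeasy.com/static-pages-sitemap.xml</loc>
    <lastmod>2015-01-15</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://couponeasy.com/type-1-sitemap.xml</loc>
    <lastmod>2015-01-15</lastmod>
  </sitemap>
</sitemapindex>
```

Submitting each child sitemap individually is what makes the per-type indexation monitoring possible.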
Hope this helps! Let me know if you need an audit or something like that. Sounds like there are some easy wins!
John
-
Hello Anuj,
To answer your final question first:
Crawlers will not stop at the URLs in your sitemap; they keep following links until they encounter something they cannot read or are told not to continue beyond a certain point (via robots.txt, for example). A sitemap supplements crawling rather than restricting it, so the rest of your site will still be crawled and updated in the index on each pass.
I did some quick browsing and it sounds like an automated sitemap might be your best option. Check out this link on Moz Q&A:
https://moz.com/community/q/best-practices-for-adding-dynamic-url-s-to-xml-sitemap
There are tools out there that will help with the automation process, updating hourly or daily to help crawlers find your dynamic pages. The tool suggested in that thread can be found at:
http://www.xml-sitemaps.com/standalone-google-sitemap-generator.html
I have never used it, but it is worth looking into as a solution to your problem. Another good suggestion I saw was to move all removed deals to an archive page and make them unavailable for purchase/collection. That sounds like a solution that would minimise future issues surrounding 404s, etc.
Hope this helps!
Rob