Should I submit a sitemap for a site with dynamic pages?
-
I have a coupon website (http://couponeasy.com)
Being a coupon website, my content is constantly changing (new coupons are added and expired deals are removed automatically). I wish to create a sitemap, but I realised that there is not much point in creating one for all pages, as they will be removed sooner or later and/or are canonical.
I have about 8-9 pages which are static, and hence I can include them in the sitemap.
Now the question is: if I create a sitemap for these 9 pages and submit it to Google Webmaster Tools, will the Google crawlers stop indexing my other pages?
NOTE: I need to create the sitemap for getting expanded sitelinks.
-
Hi Anuj -
I think you are operating from a false assumption that is going to hurt your organic traffic (I suspect it already has).
The XML sitemap is one of the very best ways to tell the search engines about new content on your website. By not putting your new coupons in the sitemap, you are withholding one of the strongest signals you can send that new content is there.
Of course, you have to automate your sitemap and have it update as often as possible. Depending on the size of your site and therefore the processing time, you could regenerate it hourly, every 4 hours, something like that. If you need recommendations for automated sitemap tools, let me know. I should also point out that you should specify how frequently each URL is updated (via <changefreq>), and you should keep static URLs for even your coupons if possible. This will be a big win for you.
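If you'd rather roll your own than use a tool, the generation step itself is small. Here is a minimal sketch in Python (the coupon URLs and dates are made up for illustration) that renders <loc>, <lastmod>, and <changefreq> entries from whatever your database returns:

```python
from datetime import date
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Render a minimal XML sitemap from (loc, lastmod, changefreq) tuples."""
    entries = []
    for loc, lastmod, changefreq in urls:
        entries.append(
            "  <url>\n"
            f"    <loc>{escape(loc)}</loc>\n"
            f"    <lastmod>{lastmod.isoformat()}</lastmod>\n"
            f"    <changefreq>{changefreq}</changefreq>\n"
            "  </url>"
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "\n".join(entries)
        + "\n</urlset>"
    )

# Hypothetical URLs; a real site would pull these from its coupon database
# on a schedule (hourly, every 4 hours, etc.).
sitemap = build_sitemap([
    ("http://couponeasy.com/", date(2015, 8, 1), "hourly"),
    ("http://couponeasy.com/store/example-store", date(2015, 8, 1), "daily"),
])
print(sitemap)
```

Hooking this up to a cron job that writes the output to your web root gets you the automated, frequently updated sitemap described above.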
Finally, if you want to make sure your static pages are always indexed, or want to keep an eye on different types of coupons, you can create separate sitemaps under your main sitemap.xml and segment by type. So static-pages-sitemap.xml, type-1-sitemap.xml, etc. This way you can monitor indexation by type.
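As a rough illustration of that segmentation (file names and dates here are hypothetical), the parent sitemap.xml becomes a sitemap index that simply points at the per-type files:

```python
from datetime import date
from xml.sax.saxutils import escape

def build_sitemap_index(sitemaps):
    """Render a sitemap index pointing at per-segment sitemap files."""
    entries = "\n".join(
        "  <sitemap>\n"
        f"    <loc>{escape(loc)}</loc>\n"
        f"    <lastmod>{lastmod.isoformat()}</lastmod>\n"
        "  </sitemap>"
        for loc, lastmod in sitemaps
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + entries
        + "\n</sitemapindex>"
    )

# One child sitemap per page type, as suggested above.
index = build_sitemap_index([
    ("http://couponeasy.com/static-pages-sitemap.xml", date(2015, 8, 1)),
    ("http://couponeasy.com/coupons-sitemap.xml", date(2015, 8, 1)),
])
print(index)
```

Submitting the index file lets Webmaster Tools report indexation counts per child sitemap, which is what makes the per-type monitoring possible.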
Hope this helps! Let me know if you need an audit or something like that. Sounds like there are some easy wins!
John
-
Hello Anuj,
To answer your final question first:
Crawlers will not stop crawling just because a URL is missing from your sitemap; a sitemap is an additive hint, not an exclusion list. Crawlers keep following links until they encounter something they cannot read or are told not to continue beyond a certain point (e.g. via robots.txt). So your site will be updated in the index upon each crawl.
I did some quick browsing and it sounds like an automated sitemap might be your best option. Check out this link on Moz Q&A:
https://moz.com/community/q/best-practices-for-adding-dynamic-url-s-to-xml-sitemap
There are tools out there that will help with the automation process, which will update hourly/daily to help crawlers find your dynamic pages. The tool suggested on this particular blog can be found at:
http://www.xml-sitemaps.com/standalone-google-sitemap-generator.html
I have never used it, but it is worth looking into as a solution to your problem. Another good suggestion I saw was to move all removed deals to an archive page and make them unavailable for purchase/collection. This sounds like a solution that would minimise future issues surrounding 404s, etc.
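A minimal sketch of that archiving idea (the deal data and page names below are made up): expired deals keep their URLs and serve a 200 archive page with the coupon marked unavailable, so only slugs that never existed return a 404:

```python
# Hypothetical in-memory store; a real site would query its database.
DEALS = {
    "summer-sale": {"title": "Summer Sale", "expired": True},
    "new-user-10": {"title": "New User 10% Off", "expired": False},
}

def resolve_deal(slug):
    """Return (HTTP status, page kind) for a requested deal slug."""
    deal = DEALS.get(slug)
    if deal is None:
        return 404, "not-found"   # URL never existed
    if deal["expired"]:
        # Expired: keep the URL alive and show the archived, non-purchasable
        # version instead of letting the page start 404ing.
        return 200, "archive"
    return 200, "live"            # active deal page

print(resolve_deal("summer-sale"))  # expired deal -> archive page
print(resolve_deal("new-user-10"))  # active deal -> live page
print(resolve_deal("bogus"))        # unknown slug -> 404
```

However you wire this into your framework's router, the point is that previously indexed coupon URLs never turn into dead links.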
Hope this helps!
Rob