Should I submit a sitemap for a site with dynamic pages?
-
I have a coupon website (http://couponeasy.com)
Being a coupon website, my content is always changing (new coupons are added and expired deals are removed automatically). I want to create a sitemap, but I realised there is not much point in creating one for all pages, as they will be removed sooner or later and/or are canonical.
I have about 8-9 pages which are static, so I can include those in a sitemap.
Now the question is....
If I create a sitemap for just these 9 pages and submit it to Google Webmaster Tools, will Google's crawlers stop indexing my other pages?
NOTE: I need to create the sitemap for getting expanded sitelinks.
-
Hi Anuj -
I think you are operating from a false assumption that is going to hurt your organic traffic (I suspect it already has).
An XML sitemap is one of the very best ways to tell the search engines about new content on your website. By leaving your new coupons out of the sitemap, you are withholding one of the strongest signals possible that new content is there.
Of course, you have to automate your sitemap and have it update as often as possible. Depending on the size of your site, and therefore the processing time, you could regenerate it hourly or every 4 hours, something like that. If you need recommendations for automated sitemap tools, let me know. I should also point out that you should specify how frequently each URL is updated (and keep static URLs even for your coupons, if possible). This will be a big win for you.
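As a minimal sketch of what an automated generator like that does (the URLs and change frequencies here are hypothetical examples — in practice you would pull live coupon URLs from your database on a schedule):

```python
from datetime import date
from xml.etree.ElementTree import Element, SubElement, tostring

def build_sitemap(urls):
    """Build an XML sitemap string from (loc, changefreq) pairs."""
    urlset = Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    today = date.today().isoformat()
    for loc, changefreq in urls:
        url = SubElement(urlset, "url")
        SubElement(url, "loc").text = loc
        SubElement(url, "lastmod").text = today        # when the page last changed
        SubElement(url, "changefreq").text = changefreq  # hint for recrawl cadence
    return tostring(urlset, encoding="unicode")

# Hypothetical pages: static pages change rarely, coupon pages change often.
pages = [
    ("http://couponeasy.com/about", "monthly"),
    ("http://couponeasy.com/store/example-store", "hourly"),
]
xml = build_sitemap(pages)
print(xml)
```

A cron job would rerun this every few hours and write the result to sitemap.xml, so the search engines always see the current coupon set.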
Finally, if you want to make sure your static pages are always indexed, or want to keep an eye on different types of coupons, you can create separate sitemaps under your main sitemap.xml and segment by type. So static-pages-sitemap.xml, type-1-sitemap.xml, etc. This way you can monitor indexation by type.
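For reference, a segmented setup like that is just a sitemap index file pointing at the child sitemaps — something along these lines (the child filenames are the hypothetical examples above):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://couponeasy.com/static-pages-sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://couponeasy.com/type-1-sitemap.xml</loc>
  </sitemap>
</sitemapindex>
```

You submit the index once in Webmaster Tools, and each child sitemap then reports its own indexation numbers.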
Hope this helps! Let me know if you need an audit or something like that. Sounds like there are some easy wins!
John
-
Hello Anuj,
To answer your final question first:
Submitting a sitemap does not restrict crawling: crawlers will keep following links until they encounter something they cannot read or are told not to continue (for example, via robots.txt). So your other pages will still be crawled, and your site will be updated in the index upon each crawl.
I did some quick browsing and it sounds like an automated sitemap might be your best option. Check out this link on Moz Q&A:
https://moz.com/community/q/best-practices-for-adding-dynamic-url-s-to-xml-sitemap
There are tools out there that will help with the automation process, which will update hourly/daily to help crawlers find your dynamic pages. The tool suggested on this particular blog can be found at:
http://www.xml-sitemaps.com/standalone-google-sitemap-generator.html
I have never used it, but it is worth looking into as a solution to your problem. Another good suggestion I saw was to place all expired deals on an archive page and make them unavailable for purchase/collection. This sounds like a solution that would minimize future issues surrounding 404s, etc.
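To sketch that archiving idea (the route logic and coupon store below are hypothetical illustrations, not a specific framework's API): the key point is that an expired coupon's URL keeps returning HTTP 200 with an "expired" notice instead of becoming a 404.

```python
from datetime import date

# Hypothetical in-memory coupon store: slug -> expiry date.
COUPONS = {
    "10-off-shoes": date(2030, 1, 1),
    "old-winter-deal": date(2014, 12, 31),
}

def coupon_response(slug, today=None):
    """Return (status, body) for a coupon URL.

    Expired deals stay live as archived pages (200 with an 'expired'
    notice) rather than turning into 404s, so inbound links and any
    accumulated authority are preserved.
    """
    today = today or date.today()
    expiry = COUPONS.get(slug)
    if expiry is None:
        return 404, "Not found"
    if today > expiry:
        return 200, f"The deal '{slug}' has expired - browse current coupons instead."
    return 200, f"Active deal: {slug}"

print(coupon_response("old-winter-deal", today=date(2020, 6, 1)))
```

Whether to also mark archived pages noindex is a separate judgment call depending on how much long-tail traffic those expired-deal pages attract.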
Hope this helps!
Rob