Should I submit a sitemap for a site with dynamic pages?
-
I have a coupon website (http://couponeasy.com).
Being a coupon website, my content is always changing: new coupons are added and expired deals are removed automatically. I wish to create a sitemap, but I realised there is not much point in including all pages, as they will be removed sooner or later and/or are canonicalised.
I have about 8-9 static pages, so those I can include in a sitemap.
Now the question is....
If I create a sitemap for just these 9 pages and submit it to Google Webmaster Tools, will Google's crawlers stop indexing the other pages?
NOTE: I need to create the sitemap to get expanded sitelinks.
-
Hi Anuj -
I think you are operating from a false assumption that is going to hurt your organic traffic (I suspect it already has).
The XML sitemap is one of the very best ways to tell the search engines about new content on your website. By leaving your new coupons out of the sitemap, you are withholding one of the strongest signals you can give the search engines that new content is there.
Of course, you have to automate your sitemap and have it update as often as possible. Depending on the size of your site, and therefore the processing time, you could regenerate it hourly, every 4 hours, something like that. If you need recommendations for automated sitemap tools, let me know. I should also point out that you should specify how frequently each URL is updated (and keep static URLs even for your coupons, if possible). This will be a big win for you.
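To make that concrete, here is a minimal sketch of an automated generator using Python's standard library. The URLs and change frequencies below are hypothetical placeholders; a real version would pull the live coupon URLs from your database on each run.

    import xml.etree.ElementTree as ET
    from datetime import date

    # Hypothetical example pages; a real generator would query the coupon database.
    pages = [
        ("http://couponeasy.com/", "daily"),
        ("http://couponeasy.com/about/", "monthly"),
        ("http://couponeasy.com/coupons/example-store/", "hourly"),
    ]

    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc, changefreq in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = date.today().isoformat()  # when the page last changed
        ET.SubElement(url, "changefreq").text = changefreq  # hint at how often it updates

    ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)

Run something like that on a schedule (cron, hourly or so) and the "new content" signal largely takes care of itself.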
Finally, if you want to make sure your static pages are always indexed, or want to keep an eye on different types of coupons, you can create separate sitemaps referenced from your main sitemap.xml (a sitemap index) and segment by type: static-pages-sitemap.xml, type-1-sitemap.xml, etc. This way you can monitor indexation by type.
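As a rough sketch of that segmentation (reusing the hypothetical filenames above), the main sitemap.xml becomes a sitemap index that simply points at each segment, each of which is a normal urlset sitemap like the one generated earlier:

    import xml.etree.ElementTree as ET

    # Hypothetical segment files, one per page type.
    segments = [
        "http://couponeasy.com/static-pages-sitemap.xml",
        "http://couponeasy.com/type-1-sitemap.xml",
    ]

    index = ET.Element("sitemapindex", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc in segments:
        ET.SubElement(ET.SubElement(index, "sitemap"), "loc").text = loc

    ET.ElementTree(index).write("sitemap.xml", encoding="utf-8", xml_declaration=True)

Because Google Webmaster Tools reports submitted vs. indexed counts per sitemap file, this split is what makes the per-type monitoring possible.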
Hope this helps! Let me know if you need an audit or something like that. Sounds like there are some easy wins!
John
-
Hello Anuj,
To answer your final question first:
Crawlers will not stop indexing pages just because they are absent from your sitemap; they only stop when they encounter something they cannot read, or are told not to continue (via robots.txt, for example). So your site will be updated in the index upon each crawl.
I did some quick browsing and it sounds like an automated sitemap might be your best option. Check out this link on Moz Q&A:
https://moz.com/community/q/best-practices-for-adding-dynamic-url-s-to-xml-sitemap
There are tools out there that will help with the automation process, updating hourly or daily so crawlers can find your dynamic pages. The tool suggested in that thread can be found at:
http://www.xml-sitemaps.com/standalone-google-sitemap-generator.html
I have never used it, but it is worth looking into as a solution to your problem. Another good suggestion I saw was to place all removed deals on an archive page and make them unavailable for purchase/collection. That would minimize future issues surrounding 404s, etc.
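To sketch what that archiving could look like server-side (Flask is purely an assumption about the stack here, and the route names and EXPIRED lookup are hypothetical), expired coupon URLs get redirected to an archive page rather than left to 404:

    from flask import Flask, redirect

    app = Flask(__name__)

    # Hypothetical set of expired coupon slugs; in practice this would be a database lookup.
    EXPIRED = {"old-store-deal", "summer-sale-2014"}

    @app.route("/coupon/<slug>")
    def coupon(slug):
        if slug in EXPIRED:
            # Send visitors (and crawlers) to the archive instead of serving a 404.
            return redirect("/archive/" + slug, code=301)
        return "Live coupon page for: " + slug  # ...render the real coupon template here...

    @app.route("/archive/<slug>")
    def archive(slug):
        # The archived deal stays readable but is clearly unavailable to collect.
        return "This deal has expired: " + slug

Whether you 301 like this or keep the old URL live with an "expired" notice, the point is the same: the URL keeps returning something useful instead of piling up 404s.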
Hope this helps!
Rob