What is the best way to deal with pages whose content changes?
-
My site features businesses that offer activities for kids, and each business has its own page on my site. A business page contains a listing of the different activities that organization puts on (such as events, summer camps, and drop-in activities). Some businesses only offer seasonal activities (for example, over Christmas break or during the summer); the rest of the year the business has nothing scheduled and the page is empty.
This creates two problems. It's a poor user experience (which I can fix, no problem), but the empty pages are also thin content and are sometimes treated as duplicate content.
What's the best way to deal with pages whose content can be quite extensive at certain points of the year and shallow or empty at others? Should I add a meta robots noindex tag when there is no content, and change it back to index when there is content? Should I just ignore the problem? Should I remove the page completely and set up a redirect?
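To be concrete about that first option, here's a rough sketch of what I mean by toggling the tag, in Python (the `robots_meta` helper and the event structure are hypothetical, just to illustrate the idea; my actual stack differs):

```python
def robots_meta(upcoming_events):
    """Return a robots meta tag for a business listing page.

    Hypothetical sketch: noindex the page while it has no upcoming
    events (so it isn't treated as thin/duplicate content), and allow
    indexing again once activities are listed.
    """
    if upcoming_events:
        return '<meta name="robots" content="index, follow">'
    return '<meta name="robots" content="noindex, follow">'

# Empty listing -> noindex; listing with events -> indexable again.
print(robots_meta([]))
print(robots_meta([{"title": "Summer Camp", "date": "July 2"}]))
```

The page would still return a normal 200 response either way; only the meta tag would change with the listing data.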
Would love to know people's thoughts.
-
Excellent. That's what we'll probably end up doing. Egol and Alex, thanks for the responses.
-
You could include something about the success of past events. Maybe you could also encourage the businesses to write something themselves as good promotion.
How about some permanent text about the business and/or events they run on the page? If there are no events - perhaps a brief explanation that there are none yet, along with some suggestions of other local businesses running events and/or a call-to-action to sign up for notifications?
I wouldn't noindex the page when there is no content.
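Something like this, as a rough sketch (hypothetical names, and assuming the listings are rendered server-side from the database):

```python
def listing_page_body(business_name, upcoming_events):
    """Hypothetical sketch: keep permanent copy about the business on the
    page, and fall back to a short explanation plus a call-to-action when
    there are no events, rather than serving an empty page."""
    lines = [f"{business_name} runs activities for kids in your area."]
    if upcoming_events:
        lines.append("Upcoming activities:")
        lines += [f"- {e['title']} ({e['date']})" for e in upcoming_events]
    else:
        lines.append("There are no activities scheduled right now.")
        lines.append("Sign up for notifications to hear when new events are posted.")
    return "\n".join(lines)
```

That way the page always has some unique, useful text on it, whether or not any events are in the database.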
-
Hey, thanks for the reply. I should clarify that the listing page is just that -- a listing of upcoming events. A separate page features information about the business. The listings are database-driven. Here is an example:
http://www.chatterblock.com/facility/13/juan-de-fuca-recreation-centre-victoria-bc/events/
-
If you write the content for these pages, get busy writing more.
If the featured businesses write the content, you will need to find carrots, sticks, or pink slips.