Pages with Little Content
-
I have a website that lists events in Dublin, Ireland. I want to provide a comprehensive number of listings, but there are not enough hours in the day to write a detailed (or even short) unique description for every event. At the moment I have some pages with little detail other than the event title and venue. Should I try to prevent Google from crawling/indexing these pages, for fear of reducing the overall ranking of the site? At the moment I only link to these pages via the RSS feed.
I could remove the pages entirely from my feed, but that would mean removing information that might be useful to people following the events feed.
Here is an example page with very little content.
-
If you've already cut internal links and they're only in RSS, I would personally NOINDEX them. Otherwise you're telling Google they're low-value (for now, at least) while still letting them be indexed, potentially diluting the rest of your index (and your ranking power). With NOINDEX, people can still access them, but you're not risking Panda issues or other problems.
Over time, as you add content and boost your domain authority and link profile, you can start phasing in more pages.
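For reference, the standard mechanism here is the robots meta tag (or the equivalent X-Robots-Tag HTTP response header) on each thin page:

```html
<!-- In the <head> of each thin event page: keep it out of the
     index, but still allow crawlers to follow its links -->
<meta name="robots" content="noindex, follow">
```

Note that a page must remain crawlable for this to work: if you also block it in robots.txt, Google never fetches the page and so never sees the noindex directive.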
-
Hi, as Brent has said, you cannot prevent Google from crawling the site. Once the site is indexed and Google starts to visit it more frequently, it is likely at some point to find the pages. However, given the limited information and, more importantly, the lack of any real unique content, I would advise that you take whatever steps you can to limit the visibility of these pages. I do not believe you will lose rank, but you certainly will not gain any. Thanks
-
_First of all, Google would still find the pages since your site most likely links to them?_

Actually, there are currently only links to the pages with an actual description. I have disabled links to those with no description. The way the site is set up at the moment means that a page is created but not necessarily linked to (see [http://www.dublinbynumbers.com/events/live-music](http://www.dublinbynumbers.com/events/live-music)).
_After looking at your example page: what if you made a unique description for each artist / performer, and then had a constantly changing list of events under each one? That way you'd have some unique content that would be helpful._

I think I'd have the same problem: there would still be thousands of artists (never mind festivals, etc.). I do want to list as many events as possible, and I think my users would prefer a comprehensive number of listings without descriptions rather than a small number of event listings with descriptions. Some listings do have more substantial descriptions, e.g. [St Patrick's Day in Dublin](http://www.dublinbynumbers.com/events/arts-culture/festivals/st-patricks-festival-and-parade "St Patrick's Day").

My overall question is: should I prevent Google from indexing pages with little content (as in the original example), or just let it do its thang au naturel?
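If you do decide to gate indexing on how much content a page has, the template logic is simple to sketch. This is a hypothetical Python illustration only; the function name and the 200-character threshold are made up for the example, not part of any real CMS:

```python
# Hypothetical sketch: emit "noindex" for event pages that lack a
# real description, and "index" for those that have one. The
# 200-character threshold is an arbitrary assumption.
from typing import Optional


def robots_meta(description: Optional[str], min_length: int = 200) -> str:
    """Return a robots meta tag for an event listing page.

    Thin pages (no description, or a very short one) get
    "noindex, follow" so they stay out of the index while their
    links are still crawled; fuller pages are indexed normally.
    """
    if description and len(description.strip()) >= min_length:
        return '<meta name="robots" content="index, follow">'
    return '<meta name="robots" content="noindex, follow">'


# A listing with only a title and venue would be noindexed:
print(robots_meta(None))  # <meta name="robots" content="noindex, follow">
```

Using "noindex, follow" rather than plain "noindex" means the thin pages can still pass crawlers through to the pages you do want indexed.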
-
Hi Andy,
First of all, Google would still find the pages since your site most likely links to them? (And if not, how would users find them otherwise?)
After looking at your example page: what if you made a unique description for each artist / performer, and then had a constantly changing list of events under each one? That way you'd have some unique content that would be helpful.
-Brent