Lots of Listing Pages with Thin Content on a Real Estate Website: Best to Set Them to No-Index?
-
Greetings Moz Community:
As a commercial real estate broker in Manhattan, I run a website with over 600 pages. Basically, the pages are organized into the following categories:
1. Neighborhoods (example: http://www.nyc-officespace-leader.com/neighborhoods/midtown-manhattan): 25 pages, low bounce rate
2. Types of Space (example: http://www.nyc-officespace-leader.com/commercial-space/loft-space): 15 pages, low bounce rate
3. Blog (example: http://www.nyc-officespace-leader.com/blog/how-long-does-leasing-process-take): 30 pages, medium/high bounce rate
4. Services (example: http://www.nyc-officespace-leader.com/brokerage-services/relocate-to-new-office-space): 3 pages, high bounce rate
5. About Us (example: http://www.nyc-officespace-leader.com/about-us/what-we-do): 4 pages, high bounce rate
6. Listings (example: http://www.nyc-officespace-leader.com/listings/305-fifth-avenue-office-suite-1340sf): 300 pages, high bounce rate (65%), thin content
7. Buildings (example: http://www.nyc-officespace-leader.com/928-broadway): 300 pages, very high bounce rate (exceeding 75%)
Most of the listing pages do not have more than 100 words. My SEO firm is advising me to set them to "No-Index, Follow". They believe the thin content could be hurting me.
Is this an acceptable strategy? I am concerned that when Google detects 300 pages set to "No-Index", it could interpret this as the site seeking to hide something and penalize us.
Also, the building pages have a low click-through rate. Would it make sense to set them to "No-Index" as well?
Basically, would it increase authority in Google's eyes if we set pages that have thin content and/or low click-through rates to "No-Index"?
Any harm in doing this for about half the pages on the site?
I might add that while I don't suffer from any manual penalty, search volume has gone down substantially in the last month. We upgraded the site in early June, and somehow 175 pages were submitted to Google that should not have been indexed. A removal request has been made for those pages. Prior to that, we were hit by Panda in April 2012, with search volume dropping from about 7,000 per month to 3,000 per month. Volume had increased back to 4,500 by April this year, only to start tanking again; it was down to 3,600 in June. About 30 toxic links were removed in late April, and a disavow file was submitted to Google at the same time for removal of links from 80 toxic domains.
Thanks in advance for your responses!! Alan
-
Is there a risk to no-indexing the listing pages? Thanks, Alan
-
Hi Benjamin:
I think your suggestions are excellent. However, from a practical point of view, there are 350 listings, so it is a lot of work to beef them all up.
Once visitors are on the site and run listing searches, the click-through rate is pretty good. The problem is more with Google: many listings are not indexed, and they don't generate many clicks.
My SEO company suggests deindexing them because they don't generate much click-through, and the high bounce rate may be harming our overall indexing. They are of the opinion that it is best to focus on improving content in the categories of pages that have a high click-through rate, like neighborhoods and types of space, deindexing the listings (I don't know how 350 no-indexed pages would look to Google), and displaying the listings in a more appealing manner, such as lists and maps.
As for video, do you think that would attract more interest than photos?
Thanks, Alan
-
Hi Prestashop:
I am a commercial real estate broker, so no Zillow or anything comparable in my industry.
If I were to beef up the listing content, would 200-300 words be enough? Should I add some H2 tags and headings?
There are 350 listings, so it is a lot of work if it is going to be professionally written.
Thanks,
Alan
-
Adding the Zillow API would be a one-time effort that would add a lot of value and original content to the pages. I personally would look into that.
-
Also, try adding something unique to your listing. Take 5 minutes and write about the building, area, things to do in that neighborhood - stuff off the top of your head that would be useful to a searcher. That makes you the authority, will make your content more apt to be socially bookmarked and gives you some unique elements to the page. Also try loading a video of the listing if it is yours.
-
I wouldn't build out elaborate content for the property listing pages. I would, however, build out elaborate content in my website blog about Manhattan real estate, where I discuss the market, types of housing, moving tips, renters insurance, etc. - things helpful to the person looking to buy, rent or sell - and that you can keep on your site as evergreen content. Do that first before you start no-indexing listing pages. Keep the meta data on the listing pages unique to your site; Google knows catalog sites will have the same short descriptions. Your real traffic will come from the content you create in your blog that relates to the listings in your catalog. Then do internal linking from the blog posts to the listing pages. If you have an admin section or other part of the site you do not want people to find in organic search, then no-index those... but I wouldn't tell Google not to index my product inventory.
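For reference, a page-level "noindex, follow" directive is just a robots meta tag in each page's head; a minimal sketch (hypothetical markup, not your actual template) would be:

```html
<head>
  <!-- Keeps this page out of the search index, but still lets crawlers
       follow its links and pass equity through to the rest of the site -->
  <meta name="robots" content="noindex, follow">
</head>
```

The "follow" half is what distinguishes this from a true "nofollow": the page drops out of search results, but its internal links keep working for the site.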
-
Thanks Prestashop, Benjamin, Devanur! In principle, I understand it is better to beef up content; however, the listings get rented quickly. These pages take 30-60 minutes each to create, between the content, tags and photos.
They all get rented within a few weeks to a few months; there is major turnover. So it would be extremely labor-intensive to write elaborate content for each.
Furthermore, it makes it very difficult to add a lot of listings if I have to take ranking and the amount of content into account each time I write a listing.
Is there any risk that Google would penalize the site for setting these listings to "No-Index"? It would make things easier.
It may make more sense to add content to the building pages as they are permanent and there are only 150 of them.
Thoughts??
Thanks,
Alan
-
I agree. Do you have a blog on your site? If not, I would create one and load content there rather than on the category listing pages. There are tons of real estate sites (catalog sites) that share the same listing content.
Just make sure you have unique page meta data and H1s. Then beef up your site with high-quality content about your area, 800+ words each.
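As a sketch of what unique meta data and H1s might look like for one of your listing pages (the title, description and heading text here are hypothetical, based on your 305 Fifth Avenue listing URL):

```html
<head>
  <!-- Unique per-listing title and description, not a shared template string -->
  <title>1,340 SF Office Suite at 305 Fifth Avenue | NYC Office Space</title>
  <meta name="description" content="1,340 SF office suite for lease at 305 Fifth Avenue in Midtown Manhattan. Photos, floor plan and neighborhood details.">
</head>
...
<h1>305 Fifth Avenue: 1,340 SF Office Suite</h1>
```

Every listing should get its own version of these, even if the body copy stays short for now.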
But I would not no index / no follow the listings pages.
-
Lesley is absolutely correct. I would never want to remove my pages from Google, thereby reducing the number of indexed pages (as the website has only about 600 pages); instead, I would beef them up with unique and sizeable content of at least around 500 words each.
-
Do I think that Google will see anything wrong with the no-indexed pages? No; that is pretty much what the tag is asking for. Would I handle it that way? No, not really.
Listings and buildings seem to be the areas that need to be worked on, from what you listed above. This is what I would do: I would have someone write text for each listing. It might seem like a big cost up front, but in the end it evens out. Depending on the current amount of on-page non-duplicate content (by duplicate content I mean items that are global on the site, such as navigation, footer text, links in the footer, the sidebar, and other things that are on every page), I would put at least 500 words of original content on every page.
This will serve two purposes in my mind. First, real estate is expensive in NY, and I am not really going to check out a site that does not have enough information on it. The second is to help in the search engines. I do a lot of ecommerce work, and one thing I tell my clients is that their current revenues can be increased without doing any SEO at all: turn the bounces into buyers. Traffic does nothing for a site; conversions mean everything.
I am just shooting from the hip and I could be totally wrong, but I am guessing you are using WordPress, since it is so common. I would get someone to make a plugin so that you can "emulate" content. It sounds pretty shady, but at the same time it adds value.
Think of it this way: you can have a plugin developed (for WordPress, or whatever CMS you use) where, on each listing, you enter the address. Once that is entered, the plugin loads content from Zillow: sale dates for the location, school information, neighborhood info, etc. (you can see a complete list here: http://www.zillow.com/howto/api/APIBenefits.htm). That content will help thicken up your pages and enrich the site for your viewers. At the same time, I would also have someone rewrite and flesh out the 100-word descriptions on the pages.
The same goes, basically, for the buildings pages. If a building page is like a landing page on which you have linked all of the different suites or condos in the building, I would handle it differently. I would have building descriptions written and, if needed, spin them - not using a program, but by hand. Hire someone who writes to do it. You could even go as broad as per borough: write one description per borough, then hire someone (US native English speakers - college students work for cheap) to rewrite the same couple of paragraphs with different wording, adding to and taking away from them several dozen times.
That is what I would recommend. The upfront cost at this point might be high, but the maintenance cost in the end will be low; you might only be sending out 10 listings a month for around $50 to be rewritten.