Is It Better for SEO to Noindex Pages with High Bounce Rates?
-
Greetings, Moz Community:
I operate www.nyc-officespace-leader.com, a New York City commercial real estate website established in 2006. An SEO effort has been ongoing since September 2013, yet traffic has dropped about 30% in the last month.
The site has about 650 pages: 350 are listing pages and 150 are building pages. The listing and building pages have an average bounce rate of about 75%, while the other 150 pages have a bounce rate of about 35%. The building and listing pages are dragging down click-through rates for the entire site. My SEO firm believes there might be a benefit to applying "noindex, follow" to these high-bounce-rate URLs.
From an SEO perspective, would it be worthwhile to apply "noindex, follow" to most of the building and listing pages in order to reduce the bounce rate? Would Google view the site as a higher-quality site if these pages were de-indexed and the average bounce rate for the site dropped significantly? And if I noindexed these pages, would Google give better rankings to the pages that already perform well?
As a real estate broker, I will constantly be adding property listings that do not have much content, so it seems that "noindex, follow" would be a good fit for the listings, unless Google penalizes sites that have too many "noindex, follow" pages.
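For reference, the directive we are considering is the standard robots meta tag placed in the head of each listing and building page (the X-Robots-Tag HTTP response header would be the equivalent for non-HTML resources); a minimal example:

    <meta name="robots" content="noindex, follow">

As I understand it, this asks search engines not to include the page in their index while still crawling the links on it.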
Any thoughts???
Thanks,
Alan -
Hi Samuel:
Thanks so much for taking the time to respond to my post!!
You make an excellent point about the necessity of creating content that is useful for humans rather than search engines, a position my SEO firm has also taken.
My site has received no manual penalty from Google. Aside from the launch of an upgraded version with mostly cosmetic changes, not much has changed on the site since February.
But I should mention that in late April, link removal requests were sent to about 100 toxic domains. About 30 webmasters voluntarily removed their links. In mid-May we filed a disavow request with Google for the other 70 domains. Could the removal of these links and the disavow request have something to do with the fall in rankings and traffic? Please note the site only had about 280 domains linking to it in March, and now there are even fewer. The quality of the linking domains was pretty poor.
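For context, the file we submitted through the Disavow Links tool in Google Webmaster Tools followed the standard plain-text format, one domain or URL per line; a rough sketch (the domains below are placeholders, not the actual sites we disavowed):

    # Removal requested in late April; webmaster did not respond
    domain:spammy-directory-example.com
    domain:lowquality-links-example.net
    http://another-example.org/page-with-link-to-us.html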
Good suggestion regarding adding nofollow to the poorly performing building and listing pages. But we have a bit of a challenge with the listings: they get rented quickly, so if each one needs 300-400 words of content plus title and description tags, it becomes unfeasible to keep adding them to the site, yet the listings are absolutely essential. How would you suggest we manage listings if we should not "noindex" them?
Regarding our potentially spammy-looking domain: we have used it for the site since 2006. An alternative domain (www.metro-manhattan.com) exists and currently redirects to our primary domain (www.nyc-officespace-leader.com). Do you think it would be better to redirect the site to the www.metro-manhattan.com domain instead? It better matches the brand "Metro Manhattan Office Space". But I have heard domain changes can be dangerous nightmares.
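If we ever did make that move, my understanding is it would be handled with a site-wide 301 redirect plus a change-of-address notification in Google Webmaster Tools. A rough sketch of the redirect, assuming an Apache server and an .htaccess file (which may not match our actual hosting setup):

    RewriteEngine On
    # Send every URL on the old domain to the same path on the new one
    RewriteCond %{HTTP_HOST} ^(www\.)?nyc-officespace-leader\.com$ [NC]
    RewriteRule ^(.*)$ http://www.metro-manhattan.com/$1 [R=301,L]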
You point out a potential issue with the dashes in our domain. Do you think the single dash in Metro-Manhattan.com would also appear spammy? Incidentally, I don't think the content on our site looks spammy at all; there may be some thin content, but nothing spammy.
Thanks for your assistance!!! Alan
-
First, I'd address the "human" part of the question: WHY are the bounce rates for those two sets of pages high? Are you targeting the wrong audience via organic and/or paid search? Are the calls-to-action unattractive? Is the design not user-friendly? Does the form have too many fields? Is the text poorly written? There are countless variables to explore and then test via conversion optimization.
The key to online-marketing success is to think about the human visitors first and Google second because Google is trying to become an algorithm that thinks like a human.
Second, Google officially states that bounce rate does not affect rankings and that Google Analytics data is not used in its algorithm. This makes sense because a high bounce rate is sometimes a good thing. Google wants to provide people with the answers they need as quickly as possible, so if someone searches Google, comes to your website, finds the answer, and leaves within a few seconds, Google would consider that a job well done.
Third, there could be a lot of other problems. Your domain, for example, contains keywords and has two hyphens, which looks sketchy to visitors and probably to search engines as well; that's a common characteristic of spam sites. I haven't audited your site, so I can't say for certain. A lot also depends on what your SEO firm has done and has been doing. Too many firms don't really know what they're doing and end up doing harmful rather than helpful work. Maybe some bad links were built recently, or maybe a lot of keyword stuffing was done. Check your Google Webmaster Tools account to see if you've received any penalty messages from Google.
Also, this is a busy time of year for properties and rentals (the summer), so maybe your competitors are going big on PPC ads and there are fewer organic clicks to go around.
One final note: I'd never noindex large portions of a website. I'd think that would be a big red flag to Google that the site is trying to hide something.
Hope that helps!