Better for SEO to No-Index Pages with High Bounce Rates
-
Greetings, Moz Community:
I operate www.nyc-officespace-leader.com, a New York City commercial real estate website established in 2006. An SEO effort has been ongoing since September 2013, yet traffic has dropped about 30% in the last month.
The site has about 650 pages: 350 are listing pages and 150 are building pages. The listing and building pages have an average bounce rate of about 75%, while the other 150 pages have a bounce rate of about 35%. The building and listing pages are dragging down click-through rates for the entire site. My SEO firm believes there might be a benefit to applying "no-index, follow" to these high-bounce-rate URLs.
From an SEO perspective, would it be worthwhile to "no-index, follow" most of the building and listing pages in order to reduce the bounce rate? Would Google view the site as a higher-quality site if these pages were de-indexed and the site-wide average bounce rate dropped significantly? If I no-indexed these pages, would Google give better rankings to the pages that already perform well?
As a real estate broker, I will constantly be adding property listings that do not have much content, so "no-index, follow" seems like a good fit for the listings, unless Google penalizes sites that have too many "no-index, follow" pages.
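For reference, the directive in question is the robots meta tag, e.g. <meta name="robots" content="noindex, follow">, which keeps a page out of the index while still letting its links be crawled. Below is a rough Python sketch of how one might audit which listing and building pages currently carry such a tag before deciding anything; the URLs are placeholders, it assumes the requests library is installed, and the regex is only a quick-and-dirty check:

```python
import re
import requests

# Placeholder URLs -- swap in the real listing and building pages.
urls = [
    "https://www.nyc-officespace-leader.com/listings/example-listing",
    "https://www.nyc-officespace-leader.com/buildings/example-building",
]

# Rough regex check for a robots meta tag; good enough for a quick audit,
# though an HTML parser would be more robust.
robots_meta = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)["\']',
    re.IGNORECASE,
)

for url in urls:
    try:
        html = requests.get(url, timeout=10).text
    except requests.RequestException as exc:
        print(f"{url}: request failed ({exc})")
        continue
    match = robots_meta.search(html)
    directive = match.group(1) if match else "no robots meta tag (indexable by default)"
    print(f"{url}: {directive}")
```

Running something like this across the full set of listing and building URLs would at least show how many pages a "no-index, follow" rollout would actually touch.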
Any thoughts???
Thanks,
Alan -
Hi Samuel:
Thanks so much for taking the time to respond to my post!!
You make an excellent point about the need to create content that is useful for humans rather than for search engines, a position my SEO firm has also taken.
My site has received no manual penalty from Google. Besides launching an upgraded version that made mostly cosmetic changes, not much has changed on the site since February.
But I should mention that in late April link removal requests were sent to about 100 toxic domains. About 30 webmasters voluntarily removed their links. In mid-May we filed a disavow request with Google for the other 70 domains. Could the removal of these links and the disavow request have something to do with the fall in rankings and traffic? Please note the site only had about 280 domains linking to it in March, and now there are even fewer. The quality of the linking domains was pretty poor.
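For anyone following along, the disavow file Google accepts is just a plain-text list: lines starting with "#" are comments and "domain:" entries disavow an entire domain. A minimal Python sketch of assembling such a file, using placeholder domain names rather than the ones we actually disavowed:

```python
# Minimal sketch of writing a disavow file in the format Google expects:
# "#" lines are comments, "domain:" entries disavow an entire domain.
# The domains below are placeholders, not the ones we actually disavowed.
toxic_domains = [
    "spammy-directory-example.com",
    "low-quality-links-example.net",
]

lines = ["# Disavow request covering domains that did not respond to removal outreach"]
lines += [f"domain:{d}" for d in toxic_domains]

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(lines) + "\n")

print("\n".join(lines))
```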
Good suggestion regarding adding no-follows to the poorly performing building and listing pages. But we have a bit of a challenge with the listing pages. They are absolutely essential, yet the spaces get rented quickly, so it becomes unfeasible to add them to the site if each one needs 300-400 words of content plus title and description tags. So how would you suggest we manage listings if we should not "no-index" them?
Regarding our potentially spammy domain, we have used it for the site since 2006. An alternative domain (www.metro-manhattan.com) exists that redirects to our primary domain (www.nyc-officespace-leader.com). Do you think it would be better to make www.metro-manhattan.com the primary domain and redirect the current site to it? It better matches the brand "Metro Manhattan Office Space." But I have heard domain changes can be a dangerous nightmare.
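In case it helps the discussion, here is a rough sketch (again Python, again assuming the requests library, with placeholder paths) for spot-checking the first redirect hop from the secondary domain; any domain move would depend on clean, single-hop 301s pointing at whichever domain ends up primary:

```python
import requests

# Placeholder paths to spot-check; a real audit would cover many more URLs.
paths = ["/", "/some-example-page"]

for path in paths:
    url = "http://www.metro-manhattan.com" + path
    # allow_redirects=False so we see the first hop (status code + Location header)
    # rather than the final destination after the whole redirect chain.
    resp = requests.get(url, allow_redirects=False, timeout=10)
    print(url, "->", resp.status_code, resp.headers.get("Location"))
```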
You point out a potential issue with the dashes in our domain. Do you think the single dash in Metro-Manhattan.com would also appear spammy? Incidentally, I don't think the content on our site looks spammy at all; there may be some thin content, but nothing spammy.
Thanks for your assistance!!! Alan
-
First, I'd address the "human" part of the question: WHY are the bounce rates for those two sets of pages high? Are you targeting the wrong audience via organic and/or paid search? Are the calls-to-action unattractive? Is the design not user-friendly? Does the form have too many fields? Is the text poorly written? There are countless variables to explore and then test via conversion optimization.
The key to online-marketing success is to think about the human visitors first and Google second because Google is trying to become an algorithm that thinks like a human.
Second, Google officially states that bounce rates do not affect rankings and that Google Analytics data is not used. This makes sense because sometimes a high bounce rate is good. Google wants to provide people with the answers they need as soon as possible. If they search Google, come to your website, find the answer, and go elsewhere within a few seconds, then Google would think it was a job well done.
Third, there could be a lot of problems. Your domain, for example, contains keywords and has two hyphens -- and that looks very sketchy to visitors and probably search engines as well. That's a common characteristic of spam sites. I haven't looked at your site, so I don't know. A lot depends on what your "SEO firm" did and has been doing. Too many firms don't really know what they're doing and can do harmful rather than helpful work. Maybe some bad links were built recently. Maybe a lot of keyword stuffing was done lately. Check your Google Webmaster Tools to see if you've got any penalty messages from Google.
Also, this is a busy time of year for properties and rentals (the summer) so maybe your competitors are going big on PPC ads so that there are fewer organic clicks in general.
One final note: I'd never no-index large portions of a website. I'd think that would be a big red flag to Google that the site is trying to hide something.
Hope that helps!