Better for SEO to No-Index Pages with High Bounce Rates
-
Greetings, Moz Community:
I operate www.nyc-officespace-leader.com, a New York City commercial real estate website established in 2006. An SEO effort has been ongoing since September 2013, yet traffic has dropped about 30% in the last month.
The site has about 650 pages: 350 are listing pages, 150 are building pages, and the remaining 150 are other pages. The listing and building pages have an average bounce rate of about 75%, while the other 150 pages average about 35%. The building and listing pages are dragging down click-through rates for the entire site. My SEO firm believes there might be a benefit to applying "noindex, follow" to these high-bounce-rate URLs.
From an SEO perspective, would it be worthwhile to apply "noindex, follow" to most of the building and listing pages in order to reduce the bounce rate? Would Google view the site as higher quality if these pages were de-indexed and the site-wide average bounce rate dropped significantly? And if I no-indexed these pages, would Google give better rankings to the pages that already perform well?
As a real estate broker, I will constantly be adding property listings that do not have much content, so "noindex, follow" seems like a good fit for the listings, unless Google penalizes sites that have too many "noindex, follow" pages.
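To be concrete about what we are weighing: the directive would be a robots meta tag of the form <meta name="robots" content="noindex, follow"> in the head of each listing and building page (or the equivalent X-Robots-Tag HTTP header). Purely as an illustration, a rough Python sketch like the one below could report which pages already carry such a directive; the URL list and the simplified regex are placeholders, not anything we actually run.

```python
# Rough sketch (standard library only): report which pages already send a
# "noindex" signal via a robots meta tag or an X-Robots-Tag response header.
# The URL list below is a placeholder, not our actual listing pages.
import re
import urllib.request

PAGES = [
    "https://www.nyc-officespace-leader.com/",  # homepage, as a sanity check
    # add listing/building page URLs here
]

# Simplified pattern: assumes name="robots" appears before the content attribute.
META_ROBOTS = re.compile(
    r'<meta[^>]*name=["\']robots["\'][^>]*content=["\']([^"\']+)["\']',
    re.IGNORECASE,
)

def robots_directives(url):
    """Return (robots meta content, X-Robots-Tag header) for one URL."""
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        header = resp.headers.get("X-Robots-Tag", "")
        html = resp.read().decode("utf-8", errors="replace")
    match = META_ROBOTS.search(html)
    return (match.group(1) if match else "", header)

if __name__ == "__main__":
    for url in PAGES:
        meta, header = robots_directives(url)
        print(f"{url}\n  meta robots: {meta or '(none)'}\n  X-Robots-Tag: {header or '(none)'}")
```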
Any thoughts???
Thanks,
Alan -
Hi Samuel:
Thanks so much for taking the time to respond to my post!!
You make an excellent point about the necessity to create content useful for humans rather than search engines, a position my SEO firm has also taken.
My site has received no manual penalty from Google. Besides launching an upgraded version that made mostly cosmetic changes, not much has changed on the site since February.
But I should mention that in late April, link removal requests were made to about 100 toxic domains. About 30 webmasters voluntarily removed their links. In mid-May we filed a disavow request with Google for the other 70 domains. Could the removal of these links and the disavow request have something to do with the fall in rankings and traffic? Please note the site only had about 280 domains linking to it in March, and now there are even fewer. The quality of the linking domains was pretty poor.
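For context on the mechanics: as I understand it, a disavow file is just a plain UTF-8 text file with one "domain:" rule per line, and lines beginning with "#" treated as comments. The sketch below shows one way such a file could be assembled; the domains listed are made-up placeholders, not the ones we actually disavowed.

```python
# Sketch: assemble a Google disavow file (plain UTF-8 .txt) from a domain list.
# The domains below are made-up placeholders, not the ones actually disavowed.
TOXIC_DOMAINS = [
    "spammy-directory.example",   # hypothetical
    "low-quality-links.example",  # hypothetical
]

def build_disavow_file(domains, path="disavow.txt"):
    """Write one 'domain:' rule per line; lines starting with '#' are comments."""
    lines = ["# Domains whose webmasters declined our link removal requests"]
    lines += [f"domain:{d}" for d in sorted(set(domains))]
    with open(path, "w", encoding="utf-8") as fh:
        fh.write("\n".join(lines) + "\n")
    return path

if __name__ == "__main__":
    print(f"Wrote {build_disavow_file(TOXIC_DOMAINS)}")
```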
Good suggestion regarding the poorly performing building and listing pages. But we have a bit of a challenge with listings: they get rented quickly, so it becomes unfeasible to add 300-400 words of content and write title and description tags for each one, yet the listings themselves are absolutely essential to the site. So how would you suggest we manage listings if we should not "noindex" them?
Regarding our potentially spammy domain: we have used it for the site since 2006. An alternative domain (www.metro-manhattan.com) exists that redirects to our primary domain (www.nyc-officespace-leader.com). Do you think it would be better to move the site to the www.metro-manhattan.com domain? It better matches the brand "Metro Manhattan Office Space." But I have heard domain changes can be a nightmare.
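As an aside, before seriously considering a move we would want to confirm exactly how www.metro-manhattan.com redirects today (status code and target). The sketch below is only an illustration of one way to check that first hop without following the redirect.

```python
# Sketch: inspect the first redirect hop for a domain (status code and
# Location target) without automatically following it.
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects so the first hop can be inspected."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def first_hop(url):
    """Return (status code, Location header) for the initial response."""
    opener = urllib.request.build_opener(NoRedirect)
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    try:
        resp = opener.open(req, timeout=10)
        return resp.status, resp.headers.get("Location", "")
    except urllib.error.HTTPError as err:
        # 3xx responses land here because the handler refuses to follow them.
        return err.code, err.headers.get("Location", "")

if __name__ == "__main__":
    print(first_hop("http://www.metro-manhattan.com/"))
```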
You point out a potential issue with the dashes in our domain. Do you think the single dash in Metro-Manhattan.com would also appear spammy? Incidentally, I don't think the content on our site looks spammy at all; there may be some thin content, but nothing spammy.
Thanks for your assistance!!! Alan
-
First, I'd address the "human" part of the question: WHY are the bounce rates for those two sets of pages high? Are you targeting the wrong audience via organic and/or paid search? Are the calls-to-action unattractive? Is the design not user-friendly? Does the form have too many fields? Is the text poorly written? There are countless variables to explore and then test via conversion optimization.
The key to online-marketing success is to think about the human visitors first and Google second because Google is trying to become an algorithm that thinks like a human.
Second, Google officially states that bounce rates do not affect rankings and that Google Analytics data is not used. This makes sense because sometimes a high bounce rate is a good thing. Google wants to provide people with the answers they need as quickly as possible. If someone searches Google, comes to your website, finds the answer, and leaves within a few seconds, Google would consider that a job well done.
Third, there could be a lot of problems. Your domain, for example, contains keywords and has two hyphens -- and that looks very sketchy to visitors and probably search engines as well. That's a common characteristic of spam sites. I haven't looked at your site, so I don't know. A lot depends on what your "SEO firm" did and has been doing. Too many firms don't really know what they're doing and can do harmful rather than helpful work. Maybe some bad links were built recently. Maybe a lot of keyword stuffing was done lately. Check your Google Webmaster Tools to see if you've got any penalty messages from Google.
Also, this is a busy time of year for properties and rentals (the summer) so maybe your competitors are going big on PPC ads so that there are fewer organic clicks in general.
One final note: I'd never no-index large portions of a website. I'd think that would be a big red flag to Google that the site is trying to hide something.
Hope that helps!