Is there any use in reducing organic bounce rate if visitors are going to leave anyway?
-
Is bounce rate itself an important ranking factor? For example, my page showing pricing for a service ranks high; people usually land on that page, then go to the contact page to see where I'm located. Most of them then realise I'm not close enough and exit. Now, I could give these people that information on the pricing page, but that would drive my bounce rate up. Does it make a difference from Google's point of view?
-
Start using the DFP ad server and target the ads geographically.
When someone in your service area lands on the page, do not show the ads or show a house ad.
When someone outside of your service area arrives, then show them the ads. AdSense will target by contextual relevance, geographic area, behavior, remarketing, etc., to give those people valuable ads.
Sell that ad space by geographic area to people in your industry who serve other areas.
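Just to illustrate the mechanics (this is my own rough sketch, not part of the original answer; the ad unit path, div ID, "servicearea" key, and the inServiceArea flag are all placeholders you would replace with your own DFP setup and geo lookup), the page-side wiring with Google Publisher Tags could look something like this:

// Assumes gpt.js is already loaded on the page via
// <script async src="https://securepubads.g.doubleclick.net/tag/js/gpt.js"></script>
window.googletag = window.googletag || { cmd: [] };

// Hypothetical flag from your own geo lookup (IP database, store locator, etc.).
var inServiceArea = false;

googletag.cmd.push(function () {
  // Placeholder ad unit path, size and div ID; use the ones defined in DFP.
  googletag
    .defineSlot('/1234567/pricing-page-banner', [728, 90], 'div-gpt-ad-pricing')
    .addService(googletag.pubads());

  // "servicearea" is a hypothetical key-value you would also set up in DFP's
  // targeting rules, so paid line items only serve to "outside" visitors while
  // "local" visitors fall through to a house ad or no ad at all.
  googletag.pubads().setTargeting('servicearea', inServiceArea ? 'local' : 'outside');

  googletag.pubads().enableSingleRequest();
  googletag.enableServices();
});

// Where the ad slot's div sits in the page template:
googletag.cmd.push(function () {
  googletag.display('div-gpt-ad-pricing');
});

The key-value approach keeps the geographic decision in DFP's line-item targeting rather than hard-coding which ads appear in the page itself.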
-
Hi Ali. So from Google's perspective, they're measuring bounce rate in relation to how quickly someone goes from their results to a site and then back to those same results. There was a nice discussion around this in Rand's Whiteboard Friday a couple of years back: http://moz.com/blog/solving-the-pogo-stick-problem-whiteboard-friday and then more recently in the Q&A here: http://moz.com/community/q/pogo-stick-or-not.
Regardless of how many pages someone looks at on your site, if they're doing so VERY quickly and very consistently returning to the search engine to look for other results, that's a pogo-sticking problem to worry about. If your page offers information relevant enough that visitors go on to seek out your location, you're at least matching the searcher's intent well up to that point. Cheers!
Related Questions
-
Duplicate content? Other issues? Using vendor info when selling their products?
When building content for vendors whose products we sell, what are the best practices? Is it OK to copy and paste their "about us" info, or will that be flagged as duplicate content?
On-Page Optimization | bakergraphix_yahoo.com
-
SEO Implications of using Images for Article Titles
Hi guys! New to Moz Pro. I just recently completed an online course with Moz... I have a client who is writing some new content for their site, and we are approaching it with SEO in mind. I was wondering about using an image with text on it as the article title, instead of an actual "text on the page" title. Wondering if that's going to "cost" us anything, SEO-wise. I guess we could use the alt text/title/description fields to make sure the keywords are crawlable for our article title, but do they carry less "weight" than a standard text title? How does that work? Hope my question makes sense. Article header attached: mB0PXsA.jpg
On-Page Optimization | JakeWarren
-
Can I use nofollow to limit the number of links on a page?
My website is an ecommerce site and we have about 470 links on the homepage!
1. We have a top bar with my account, login, FAQ, home, contact us and a link to a content page.
2. Then we have the multistore selection.
3. Then we have the department menu, with several parent + child category links.
4. Then we have a banner.
5. Then we have a list of the recently sold and new products.
6. Then we have an image grid with the most important CMS/content pages (like FAQ, about us, etc.).
7. Then we have the footer, with all info pages, contact us, about us, my account, etc.
Some links are repeated 2 or 3 times. For a user it is easier to find the information, but I'm not sure how search bots (Google) deal with that. So I was thinking about how I can get to around 150 links to be followed. Removing the links from the page is not possible. What about adding nofollow to repeated links and some child category links, since the spider will crawl the parent and can reach the children on the next page? Is this a good strategy?
On-Page Optimization | qgairsoft
-
Using Google Structured Data for SEO benefit
Hi there, I run www.isacleanse.com.au and I've set up some structured data using Google Webmaster Tools, which says it will be picked up during the next Google update (it was set up over 4 weeks ago). However, I don't seem to see any of the structured data for the products/reviews/ratings etc. coming through in search results. Question at hand: are there additional things I need to do in the code of the website, or should this be sufficient? (see attached screenshot: szpFUpX)
On-Page Optimization | IsaCleanse
-
E-Commerce SEO: What are the Schema.org properties you use?
Quick question about Schema.org for products. What are the properties you like to list on product pages for an e-commerce site? Thanks.
On-Page Optimization | AdrienOLeary
-
New CMS system - 100,000 old urls - use robots.txt to block?
Hello. My website has recently switched to a new CMS system. Over the last 10 years or so, we've used 3 different CMS systems on our current domain. As expected, this has resulted in lots of URLs. Up until this most recent iteration, we were unable to 301 redirect or use any page-level indexation techniques like rel="canonical". Using SEOmoz's tools and GWMT, I've been able to locate and redirect all pertinent, PageRank-bearing "older" URLs to their new counterparts. However, according to Google Webmaster Tools' 'Not Found' report, there are literally over 100,000 additional URLs out there it's trying to find. My question is, is there an advantage to using robots.txt to stop search engines from looking for some of these older directories? Currently, we allow everything, only using page-level robots tags to disallow where necessary. Thanks!
On-Page Optimization | Blenny
-
Correct use of Canonical link vs 301 redirect
Hi All, Seeking yet more advice. SEOmoz tools have told me I have duplicate content on one of my sites and I am keen to clean this up. I am not too familiar with the following, so I thought I would ask. The duplicate content is shown on: www.mysite.com and www.mysite.com/index.html. Obviously I only see index.html when I check the code, so what is the best method of resolving the duplicate content, canonical or 301? Can you give me an example 🙂 Thanks all
On-Page Optimization | wedmonds