Help with a unique bounce rate problem!
-
My company ranks very well for the target keyword "blue link", but as it turns out, Hyundai has launched a new "Blue Link" service. Since we are a much more niche offering, many of the searches for "blue link" are intended for Hyundai.
Because people tend to click on the first result without even reading anything, we have seen an increase in traffic as well as a huge spike in bounce rate once visitors realize we are not the right company.
Our Google listing makes it pretty clear what we are, so I'm not sure how to fix this problem...
-
No problem, glad it helped!
-
Thanks for the video. It's a somewhat older one, but hopefully it still applies today.
-
Hey David,
The webspam team does not use analytics data for ranking like this, and this is clearly a navigational search, so seriously, this is not a problem.
This video may help put your mind at ease:
http://www.youtube.com/watch?v=PZoesvNUPDQ
This will not impact your site's performance in any way.
Hope this helps!
Marcus
-
So there's nothing I can do about this? Will Google not look at our high bounce rate and penalize us in some way?
-
It's not a problem as such; just ignore the traffic that bounces.
If I Google 'blue link', I see Hyundai and then you guys. Google is changing the page title to just the business name to help people see which result they want, but people are click-happy. In this instance, I can't really see what you can do other than filter this traffic out of any reports you generate.
Hope that helps.
Marcus
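Marcus's advice to filter the bounced navigational traffic out of reports can be sketched in plain Python. The session fields below are assumptions about an exported report, not actual Google Analytics columns:

```python
# Illustrative sketch of "filter the bounced traffic out of your reports".
# The field names below are assumptions about an exported report,
# not actual Google Analytics columns.
sessions = [
    {"keyword": "blue link", "bounced": True,  "pages_viewed": 1},
    {"keyword": "blue link", "bounced": False, "pages_viewed": 4},
    {"keyword": "blue link software", "bounced": False, "pages_viewed": 3},
]

def filter_navigational_bounces(rows, head_term="blue link"):
    # Drop bounced sessions on the ambiguous head term only; keep
    # everything else, including engaged visits from the same keyword.
    return [r for r in rows if not (r["keyword"] == head_term and r["bounced"])]

report = filter_navigational_bounces(sessions)
print(len(report))  # 2
```

The same idea applies whatever tool generates the report: segment out sessions from the ambiguous head term that never go past the landing page, and judge engagement on what remains.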
Related Questions
-
How big a problem are 404 errors resulting from out-of-stock products?
We had a discussion about the importance of 404 errors resulting from products that are out of stock. Of course this is not good, but how important is it: low, medium, or high?
Technical SEO | Digital-DMG
-
Authorship and Aggregate Rating in SERPS
I've set up authorship and aggregate rating information for our website. It all checks out in the Structured Data Testing Tool, but the results in the SERPs have been on and off. At first my authorship image showed on all articles where the markup existed, then it suddenly went away. More recently, the aggregate rating information displayed on all pages where the markup existed, then it too suddenly disappeared. I'm curious if anyone knows whether the disappearance of these things is the result of manual action from Google, or simply of the algorithm gathering more information that would cause the items to stop showing for one reason or another. In both cases the markup didn't change before the results disappeared. This leads me to believe the change in the SERPs wasn't a result of the markup, but rather something on Google's side.
Technical SEO | Tim.Paulino
-
RegEx help needed for robots.txt potential conflict
I've created a robots.txt file for a new Magento install, based on an existing example from the Magento help forums, but there's something I can't decipher. It seems that I am both allowing and disallowing access to the same expression for pagination. My robots.txt file (and a lot of other Magento examples, it seems) includes both: Allow: /*?p= and Disallow: /?p=& I've searched for help on regex and I can't see what "&" does, but it seems to me that I'm allowing crawler access to all pagination URLs, then possibly disallowing access to all pagination URLs that include anything other than just the page number? I've looked at several resources and there is practically no reference to what "&" does... Can anyone shed any light on this, to ensure I am allowing suitable access to a shop? Thanks in advance for any assistance.
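Worth noting: robots.txt patterns treat only `*` (any character sequence) and `$` (end anchor) as special; `&` is just a literal character. Assuming the disallow rule is the common Magento form `/*?p=*&`, it only blocks pagination URLs that carry an additional query parameter after the page number, while plain pagination stays crawlable under the longest-match rule. A minimal sketch of that matching behavior (both rules here are illustrative, and `$` anchors are ignored for brevity):

```python
import re

def rule_matches(pattern, path):
    # robots.txt treats only '*' (any sequence) and '$' (end anchor) as
    # special; everything else, including '&', is a literal character.
    # This sketch ignores '$' for brevity.
    regex = "^" + ".*".join(re.escape(part) for part in pattern.split("*"))
    return re.match(regex, path) is not None

def is_allowed(path, allow_rules, disallow_rules):
    # Longest matching rule wins; a tie goes to Allow (Google's behavior).
    best_allow = max((len(p) for p in allow_rules if rule_matches(p, path)), default=-1)
    best_block = max((len(p) for p in disallow_rules if rule_matches(p, path)), default=-1)
    return best_allow >= best_block

# Plain pagination is allowed; pagination plus extra query params is blocked.
print(is_allowed("/shoes?p=2", ["/*?p="], ["/*?p=*&"]))          # True
print(is_allowed("/shoes?p=2&dir=asc", ["/*?p="], ["/*?p=*&"]))  # False
```

So the pair of rules is not contradictory: the longer disallow pattern outranks the allow only on URLs where `&` actually appears after the `p=` parameter.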
Technical SEO | MSTJames
-
How do I fix this type of duplicate page content problem?
Sample URLs with this duplicate page content:

URL | Internal Links | External Links | Page Authority | Linking Root Domains
http://rogerelkindlaw.com/index.html | 30 | 0 | 26 | 1
http://www.rogerelkindlaw.com/index.html | 30 | 0 | 20 | 1
http://www.rogerelkindlaw.com/ | 1,630 | 613 | 43 | 110

As you can see there are three duplicate pages:
http://rogerelkindlaw.com/index.html
http://www.rogerelkindlaw.com/index.html
http://www.rogerelkindlaw.com/
What would be the best and most efficient way to fix this problem, and how can I prevent it from happening again? Thank you.
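The usual fix for this pattern is a sitewide 301 redirect onto one canonical host and URL form. As a minimal sketch (assuming the www host is the preferred one), this is the mapping such a redirect should implement:

```python
from urllib.parse import urlsplit, urlunsplit

CANONICAL_HOST = "www.rogerelkindlaw.com"  # assumed preferred hostname

def canonicalize(url):
    # Map the non-www host and the explicit /index.html path onto the
    # single canonical URL a sitewide 301 redirect should point to.
    parts = urlsplit(url)
    path = "/" if parts.path in ("", "/", "/index.html") else parts.path
    return urlunsplit((parts.scheme, CANONICAL_HOST, path, parts.query, ""))

for u in ["http://rogerelkindlaw.com/index.html",
          "http://www.rogerelkindlaw.com/index.html",
          "http://www.rogerelkindlaw.com/"]:
    print(canonicalize(u))  # all three print http://www.rogerelkindlaw.com/
```

On the server this mapping would be expressed as permanent (301) redirects, so the three variants stop accumulating separate link metrics.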
Technical SEO | brianhughes
-
Problems with my Site? (If you have time take a look :P thanks)
Hey! If anyone has a moment, I would really appreciate any tips on problems you see with our website. I don't expect much but would appreciate any suggestions. We are working on link building and content generation, but I know that may not be much use if the website itself is not built well enough! Thank you to anyone who takes a moment of their time to take a look! Site: http://earthsaverequipment.com I am more interested in SEO issues or suggestions, not comments that you dislike my artwork 😛 haha Cheers Charles
Technical SEO | WebNooby
-
A website that will not load on a particular computer? Help Me Please!
We took on a new client about two weeks ago, took them off a proprietary CMS, moved them onto a WordPress site, optimized the site, etc., and were finishing up small details three days ago. Then the PC in my personal office suddenly would not load the site from a Google search, from a direct URL, etc.
Our office uses a D-Link wireless router, but my PC is hardwired. I started up my MacBook Pro with solid-state drive (6 months old), got on wireless, and... the site would not load. PCs and Macs in the offices around me would all load the site. A search online brought up a fix for the PC and I tried it; it did not work. My lead dev tried it; it did not work. I called a server-side friend and he had never heard of such a thing. Every fix revolved around changing IP addresses, etc. I uninstalled the antivirus programs on my PC and installed every outstanding update; there was no new software installed on either machine prior to the problem. Can you help? Is there any chance someone not associated with us, just looking for my client or entering a direct URL, could experience this?
Technical SEO | RobertFisher
How to publish duplicate content legitimately without Panda problems
Let's imagine that you own a successful website that publishes a lot of syndicated news articles and syndicated columnists. Your visitors love these articles and columns, but the search engines see them as duplicate content. You worry about being viewed as a "content farm" because of this duplicate content and getting the Panda penalty. So, you decide to continue publishing the content and use... <meta name="robots" content="noindex, follow"> This lets you display the content to your visitors but should stop the search engines from indexing any pages with this code. It should also allow robots to spider the pages and pass link value through them. I have two questions: 1. If you use "noindex", will that be enough to prevent your site from being considered a content farm? 2. Is there a better way to continue publishing syndicated content while protecting the site from duplicate content problems?
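Whatever the answer, it's worth verifying the directive actually ships on every syndicated page. A small stdlib sketch that extracts the robots meta from a page (the sample page here is illustrative):

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    # Collects the content of <meta name="robots" ...>, if present.
    def __init__(self):
        super().__init__()
        self.directives = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives = a.get("content", "")

def robots_directives(html):
    finder = RobotsMetaFinder()
    finder.feed(html)
    return finder.directives

page = ('<html><head><meta name="robots" content="noindex, follow">'
        '</head><body>Syndicated article...</body></html>')
print(robots_directives(page))  # noindex, follow
```

Running this over a sample of syndicated URLs catches template bugs where the tag is missing or mistyped before the engines do.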
Technical SEO | EGOL
-
Sitemap.xml problem in Google Webmaster Tools
Hi, my sitemap.xml is not submitting correctly in Google Webmaster Tools. There are 697 URLs submitted, but only 56 are in Google's index. At the top of Webmaster Tools it says "http://www.example.com/sitemap.xml has been resubmitted", but when I click the status button a red X appears. Any suggestions about this? Thanks.
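A red X usually means the file could not be fetched or parsed, so one quick sanity check is to confirm the sitemap is well-formed XML and that its URL count matches what Webmaster Tools reports. A minimal stdlib sketch (the sample sitemap is illustrative):

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def count_urls(sitemap_xml):
    # Raises ParseError if the XML is malformed; otherwise returns the
    # number of <url><loc> entries the sitemap declares.
    root = ET.fromstring(sitemap_xml)
    return len(root.findall("sm:url/sm:loc", SITEMAP_NS))

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/</loc></url>
  <url><loc>http://www.example.com/page</loc></url>
</urlset>"""
print(count_urls(sample))  # 2
```

If the real file parses cleanly and the count looks right, the problem is more likely fetch-related (redirects, robots.txt blocking the sitemap URL, server errors) than a formatting one.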
Technical SEO | Socialdude