Why are these sites outranking me?
-
I've been hit by every update and have spent thousands of dollars and hundreds of hours trying to survive. Survival looks doubtful if I can't get turned around in 4 weeks or less. I have found AdWords and Google errors and fixed them. Alexa says us-nano.com is the best ranked site. I used my MozBar and they are doing everything wrong: keyword stuffing, no H1 tags, poor design. How are they ranking and I'm not? My duplicate meta tags are from this week, when I added Alexa and Bing IDs to my header to verify my site ownership.
-
Thanks Chris. I did report the bounce rate question, and someone else indicated the spammy links. I will start there. I think it might be related to my JavaScript not working right, which gives a poor customer experience? After thinking about it, I did use meta descriptions extensively throughout the site. I have about 125 product pages and 14 product category pages which I spent a lot of time doing the SEO on. For instance, if a customer searches for graphene nanoplatelets, my landing page is https://www.cheaptubes.com/product-category/graphene-nanoplatelets/ which I think is highly optimized and not too spammy, and we are the 4th organic result. I have neglected some of the non-product-oriented pages and shouldn't have. I definitely will revisit SEO, but right now code issues and spammy links are my #1 suspects. I finally fixed a lot of the script issues, but now the site takes 4 seconds to load, which is a deal breaker for mobile, and that is 30% of my traffic. I'm trying to find someone I can hire to fix it, but last time I did that it led to this situation.
-
No problem, happy to help
This is a common scenario we come across: you've got some rankings and don't really want to change anything for fear of losing them. But the truth is, as you've already pointed out, there's no use ranking for terms if the vast majority of your users are going to bounce anyway.
You also run the risk of getting hit by the next Panda iteration. It's true that you may see a temporary dip in rankings by making these changes, but over the next 6 months the net result should be well and truly worth it. Don't forget that a real SEO campaign is about conversions, not your rankings.
As for the questions about your bounce rate, I'd say this will be a combination of spam referrers "visiting" your site for 0 seconds (these register as a 100% bounce, depending on how they go about it) and the awkward keyword stuffing putting people off the site immediately.
Users are steadily becoming more savvy with this stuff and while they may not understand SEO, they do understand that keyword stuffing = untrustworthy. Obviously I'm not a standard user but the second I see it on a site I leave and regular users are starting to head that way too.
Regarding your meta descriptions, it's true that Google doesn't look at them; however, you can't ignore the users! It's your second opportunity to sell searchers on why they should click your link in the SERPs over the other ~9.
I've always used W3C without issue for HTML/CSS validation, though I'm not a dev so there could be better out there. It may be that there are deeper issues with the template you're currently using and fixing one symptom on the surface is altering something more detrimental. Unfortunately I'm of little help to you in this area.
-
OK, thanks Christy - now that it appears I might have a handle on the errors and my bounce rate is back above 80%, I think I will ask it to the Moz community.
-
Hi there! I meant this followup question: "I seem to have a high bounce rate but am an online store. Any thoughts on how to reduce it?"
-
PS - there has been some wonky code causing redirects. Bing says I still have too many redirects, and I'm trying to figure out how to fix that.
-
Thanks Chris - you've given me something to follow up on. I do rank well for many keywords; for graphene nanoplatelets I am the 3rd organic hit, and I'm on the 1st page of Google for the keywords I'm trying to rank for, which makes me reticent to reduce any overused keywords. I added a WYSIWYG editor to the WooCommerce product category pages so they will be the ranking pages. Rankings are the whole reason I spent thousands of dollars paying someone to redo my site, but they messed it up so much that I had to spend hundreds of hours fixing it. I had to take it over to recoup my rankings after the algo updates. If my rankings are good now, it seems like it might not be an SEO issue, but then why the high bounce rate? In January I had a 95% bounce rate one day - almost 100% of my visitors bounced. Since then I've been aggressively trying to fix it.
I think the work I've been doing has helped. Last week my bounce rate fell to 18% for one day with 700 people per day visiting the site, but now it's back up to 80% and down to 225 visitors - the site had issues all weekend, which I think I've mostly resolved. I've spent the last 4 days trying to fix the theme; it seems to have issues. I might have to retheme, but I thought it was better to work with a known theme and try to fix it rather than retheme over the weekend. I didn't bother with meta descriptions because the search engines don't check them. us-nano has keywords stuffed into their metas; I would think they would be penalized for that. I think you are right about Alexa. If you know of any good code validators, please let me know. I've been using W3C, but the code they provide from HTML Tidy mostly worked yet caused other errors.
If you have any other suggestions, please let me know - I might be missing something. How do I reduce bounce rate?
-
I think the first mistake here is judging the success of a site by Alexa rank. IMO this stat means absolutely nothing, since "traffic" can come from anywhere.
We have great rankings including #1 for our toughest term in this state yet if we look at Alexa, we're heavily outranked by Indian SEO firms and dodgy directories.
If you've been hurt by multiple updates over the years, this suggests there are a lot of problems with your site that need to be fixed. Trying to turn this around in 4 weeks is highly improbable, particularly if you have been penalised in the past.
Assuming it's cheaptubes.com, there are quite a few issues I can see immediately:
- Page titles - e.g. the home page title is currently "Welcome | Cheap tubes | Cheap Tubes"
- Meta descriptions - e.g. the home page meta description is "Cheap Tubes Inc Home page"
- Fairly thin content - most pages seem to have ~400 words, and quite a few pages over-use keywords or "Cheap Tubes" - e.g. the Products page starts with "Cheap Tubes Inc Online Shop -> Welcome to Cheap Tubes Inc Online Shop!"
- I've only had a quick glance at your backlink profile, but this seems rather questionable as well. There are a lot of .ru, .tk and .jp links in there. These may be relevant, but it's unlikely.
I'd expect that fixing up these elements would see some decent improvements for you reasonably quickly, though seeing real traffic improvements in 4 weeks is still highly unlikely.
Helpful resources: Page Titles
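To make the first two points concrete, a cleaner head section for the home page might look something like this - the title and description text here are purely hypothetical example copy, not recommended wording:

```html
<!-- Hypothetical example copy only: the point is a unique, descriptive
     title and meta description, instead of "Welcome | Cheap tubes | Cheap
     Tubes" and "Cheap Tubes Inc Home page". -->
<head>
  <title>Carbon Nanotubes &amp; Graphene Nanoplatelets | Cheap Tubes Inc</title>
  <meta name="description"
        content="Cheap Tubes Inc supplies carbon nanotubes, graphene nanoplatelets, and other nanomaterials for research and industry. Browse our online shop.">
</head>
```

The idea is simply that each page gets one descriptive, non-repetitive title and a meta description written as SERP sales copy rather than a label.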
-
Thanks Chris - I've been in WP hell since posting that. I fixed an error but crashed part of the site: no mobile nav and other theme issues. I've been working all weekend trying to get back to where I was Thursday. The folks who made my theme weren't on their game, I guess. They re-sent me an updated version of my theme, but when I try to validate it at W3C it validates at 2%. The theme as I've edited it validates at 10-11%, so it didn't make sense for me to go backwards, lose several months of SEO work, and validate worse when I'm trying to clear the errors. I've been validating code and uploading it, checking the site, then moving to the next file. I broke the site several times but have it back. I will check the plugin. Right now I'm trying to figure out why I have 151 404 errors according to Screaming Frog; they look like this: https://www.cheaptubes.com/
I'm also trying to figure out how to fix the theme files I couldn't get to validate, or the ones that did validate and still crashed my site... ugh
-
Assuming your site is cheaptubes.com, run it through Google's PageSpeed Insights and look at the scores.
It looks like you could improve those low scores quite a bit just by optimizing the images; PSI even provides the optimized resources for you. Google says site speed is a ranking signal, although it may not be a significant one. Still, it seems pretty easy to fix this "low hanging fruit" and just clean it up. If you want to do more with JS and CSS files, have a look at the WordPress plugin Autoptimize. I have had good luck with it, and it is well supported by the author.
Best!
-
Thanks Christy - I did ask one question: how are they outranking me?
-
The JavaScript errors may be resolved. I found out the folks who made my theme don't know how to call for JavaScript.
-
That is something someone hardcoded into your theme:
./cheaptubes/footer-home.php:
./cheaptubes/header.php:
./cheaptubes/layout.php: <script src="js/vendor/modernizr.js"></script>
Those three files all show you something that is basically the wrong way to include JavaScript. A QUICK fix is this:
Edit the files footer-home.php and layout.php.
In all cases where you have this: <script src="js/vendor/modernizr.js"></script>
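For reference, the usual WordPress-recommended way to load a script like this is to enqueue it from the theme's functions.php rather than hardcoding a script tag with a relative src (a relative path like "js/vendor/modernizr.js" only resolves correctly on pages at the site root, which would explain script failures on deeper URLs). A minimal sketch, assuming the file really sits at js/vendor/modernizr.js inside the theme; the function name and version string here are hypothetical:

```php
<?php
// In the theme's functions.php. wp_enqueue_script() is WordPress's
// standard API for loading scripts, and get_template_directory_uri()
// returns an absolute URL for the theme folder, so the src works on
// every page, not just the front page.
function cheaptubes_enqueue_scripts() {
    wp_enqueue_script(
        'modernizr',                                              // handle
        get_template_directory_uri() . '/js/vendor/modernizr.js', // absolute src
        array(),  // no dependencies
        '2.8.3',  // version (hypothetical; use the actual file's version)
        false     // load in the <head>, where Modernizr is usually expected
    );
}
add_action( 'wp_enqueue_scripts', 'cheaptubes_enqueue_scripts' );
```

With the script enqueued once here, the hardcoded tags in footer-home.php, header.php, and layout.php can be removed, which also avoids loading the same file more than once.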
-
Hi there, and welcome to the Moz Forum!
This is a great question, and deserves its own thread, actually. Would you mind starting a new thread with this question? (Asking one question per thread makes it easier for folks with questions similar to yours to find it when searching the forum. It also keeps the forum nice and tidy, and gives your question the best chance of being thoroughly answered.)
Thanks so much for your cooperation.
Christy
-
Bounce rate is normally a signal that the user is quite unhappy for some reason.
You could always look to do a little heat-mapping to see how people are using your pages, where they are clicking or try some live user testing to see what people don't like about the site.
-Andy
-
Send me a PM with your site details and I will run a few checks for you and see if I can see anything glaring.
It will also help if I know when you noticed the drops as this will help rule out / target any penalties.
-Andy
-
I seem to have a high bounce rate, but am an online store. Any thoughts on how to reduce it?
-
Thanks Andy
We've been around longer than us-nano, but not nanoamor. us-nano's links appear spammy; when I looked in Alexa there was a lot of red for spam links. We all have about the same internal linking. I am far outranking them in organic search now. None of this makes sense. I am fighting for survival, and so far nothing I've done has brought customer inquiries back up. My rankings dropped in November, but I launched the new site in September. I did find problems in AdWords where I hadn't updated links, but those were fixed weeks ago. I fixed the 404 errors.
Looking at all 3 sites, I think ours looks better than us-nano's; maybe we're tied with nanoamor. All us-nano did was combine keywords into page titles like carbon-nanotubes-graphene, while we've put all like products in the same category where they belong. My company is 11 years old; I should be able to get my traffic back. I used to kick both of their butts rankings-wise.
They both have a ton of text links on their home pages; we have images and text links. I think that makes for better page design and a better user experience. Isn't that what the updates that hit me were supposed to be targeting?
-
Hi,
There are so many possible reasons why they are ranking above you. Google doesn't treat H1s and on-page basics like this as a strong signal, but it will look at your content in depth.
I always ask everyone to look at their site first. Take a step back and ask yourself if it is answering the questions people are searching for. If you look at your site and your competitors side by side, apart from the basics, do you think they deserve to be there?
You might find they have a lot of strong links or are doing well with their internal linking or perhaps they have been around longer and have more trust in Google's eyes?
Without actually knowing your site it is a little awkward to give you any real critique that might help.
-Andy