Weird behavior with site's rankings
-
I have a problem with my site's rankings.
I rank for higher-difficulty (but lower-search-volume) keywords, but my site gets pushed back for lower-difficulty, higher-volume keywords, which literally pisses me off. I have seriously considered starting fresh with a new domain name, because whatever I do doesn't seem to be working.
I will admit that in the past (2-3 years ago) I used some of those "SEO packages" I had found, but those links, which numbered no more than 50, have all been deleted now, and the domains are disavowed.
The only thing I can think of is that somehow my site got flagged as suspicious or something like that in Google. About a month ago, I wrote an article on a topic related to my niche, around a keyword with a difficulty of 41%. The first page for that search term is full of high-authority domains, including a Wikipedia page, and I currently rank in 3rd place.
On the other hand, I would expect to rank easily for a keyword difficulty of 30-35%, but the exact opposite is happening. The pages I am trying to rank are not spammy; they have been checked with Moz tools and with CanIRank's spam filters. Everything is good and green. Plus, the content of the pages I am trying to rank has a Content Relevancy Score that varies from 98% to 100%...
Your opinion would be very helpful, thank you.
-
Hi Nikos,
It's important to remember that Keyword Difficulty scores are a Moz metric, not a Google metric - they are based on Moz's ability to judge how well other sites are competing for that term, and may not capture the entire competitive landscape (since nobody except Google knows everything that Google looks at).
Based on your ability to rank well for some terms and not others, it doesn't seem likely to me that you are under any sort of penalty, so much as that Google just isn't ranking you for some terms. In addition to the Keyword Difficulty scores for each term, take a look at which sites rank for the term (you can do this in the SERP Analysis feature of the Keyword Difficulty tool). Ask yourself:
- What kinds of sites rank for this term? For example, if you are an individual business, but all of the sites and pages that are ranking for that term are aggregators or lists of multiple sites, it may be that Google has determined that an individual business site is not a good fit for that query. Similarly, if your page is a blog post and no other blog posts appear in the SERP, Google may have decided that a blog post isn't what people are looking for when they search that term.
- What is the search intent of the query? Based on the other pages that rank, what is the question or task that Google has decided users are trying to answer or complete when they search this term? Does your page do a better job of helping answer that question or complete that task than the other pages that rank?
- What types of content are ranking? Do they all have rich snippets? Are there images, video, shopping or maps results? All of these will tell you more about the kind of content Google thinks will match this query.
- Is there a specific page or website that is ranking for that term that you think you could push out of the top 10? Look for areas of opportunity. For example, maybe there is a site with high authority, but the page that ranks has very low page authority and doesn't fit the query very well. Try to create a page that is better than that page, specifically.
- How closely is the phrase related to your niche? You can tell from the keywords you are successfully ranking for which topic areas Google is associating with your site. If you have a whole site about chocolates, it will be harder to rank a page about asparagus, even if the difficulty score is lower.
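As a rough illustration of that last point, you can tally topic words across the keywords a site already ranks for. This is only a sketch with hypothetical keyword data, using plain Python:

```python
from collections import Counter

# Hypothetical export of keywords the site already ranks well for
ranking_keywords = [
    "dark chocolate review",
    "milk chocolate brands",
    "chocolate gift boxes",
    "best asparagus recipes",
]

# Crude topical tally: the dominant words hint at which topic areas
# Google already associates with the site
topics = Counter(word for kw in ranking_keywords for word in kw.split())
print(topics.most_common(1))  # [('chocolate', 3)]
```

If a struggling page's topic barely appears in a tally like this, a lower difficulty score may not be enough to overcome the weaker topical association.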
Also, don't forget to continue promoting your content to earn high-authority links to individual content pieces. Where it makes sense to do so, you may also want to link internally from some of your more popular and successful pages to some of the pages that are struggling.
I hope that helps!
-
Hi!
I have the same question as before.
If someone has an idea, I would love to hear it.
-
Hi Nikos! Did EGOL answer your question? If so, please mark his response as a "Good Answer." If not, what questions do you still have?
-
Thanks for your answer.
User experience was one of my first concerns, so I purchased a Bootstrap theme, which actually looks very good and is very user-friendly. You can check it here. The pages I am trying to rank look very similar to that one.
Time on site and bounce rate
Average bounce rate is 60%, and average time on page is 4 minutes and 10 seconds (last month's averages). My site is actually a review site, if that helps. I often receive link requests from other webmasters (meaning other people think my site looks good and its content is good), so overall, I don't think my site deserves those rankings - unless some "old sins" are chasing me.
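Site-wide averages like these can hide per-page variation. As a quick sanity check (hypothetical analytics export, plain Python), you could flag pages that both bounce more and hold attention less than the site average, and compare those specific pages against the competitors that outrank them:

```python
# Hypothetical per-page analytics export:
# (page, bounce_rate, average_time_on_page_seconds)
pages = [
    ("/review-a", 0.55, 290),
    ("/review-b", 0.72, 140),
    ("/review-c", 0.58, 310),
]

site_bounce = sum(p[1] for p in pages) / len(pages)
site_time = sum(p[2] for p in pages) / len(pages)

# Flag pages that bounce more AND hold attention less than the site average
struggling = [p[0] for p in pages if p[1] > site_bounce and p[2] < site_time]
print(struggling)  # ['/review-b']
```

The flagged pages are the ones where an engagement problem, rather than a link or difficulty problem, is most likely.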
-
my site gets pushed back for lower difficulty, higher volume keywords, which literally pisses me off.
We often focus too much on competitive metrics and not enough on the presentation that we are making to our visitors. Many search professionals believe that Google is looking at the behavior of visitors: how long they stay, how far they scroll, how many click in, whether they bookmark, whether they share your site with friends... and, more important... Are They Asking for You By Name in navigational and domain queries?
This is much of the "machine learning" that Google has patented and says it is using in some of its new algorithms. I've believed that this has been important for a long time, and I was willing to stick my neck out and bet the ranch on it a long time ago.
lower difficulty, higher volume keywords
The numbers you are looking at are not based upon what visitors think of your site and how they behave; they are based upon completely different things. I don't think that Moz or others who publish keyword difficulty estimates have very good abilities for determining how visitors behave. Google is the one who has that data - from the SERPs, from Chrome, and from engagement platforms like bookmarks and Google+ and other things that they either control or can count.
Keyword difficulty is a brute force metric. Visitor satisfaction is much more discerning and very hard to measure.
which literally pisses me off.
How do your visitors feel when they try to use your website? Compare your site to the sites at the top of the SERPs. Do they have better content? Do they give a better visitor experience? Do they have a broader menu? Is their design better for navigation, comfort of reading, scanning, sharing, and all of the things that people want to do on a website? How do visitors feel when they click in?
Lots of people believe that it is really easy to earn good metrics. Really easy. But it is harder than hell to please your visitors. How are you doing there? Take a look and be honest.