Weird behavior with site's rankings
-
I have a problem with my site's rankings.
I rank for higher difficulty (but lower search volume) keywords, but my site gets pushed back for lower difficulty, higher volume keywords, which literally pisses me off. I have thought very seriously about starting fresh with a new domain name, because whatever I do doesn't seem to be working.
I will admit that in the past (2-3 years ago) I used some of those "SEO packages" I had found, but those links, which were no more than 50, are all deleted now, and the domains are disavowed.
The only thing I can think of is that somehow my site got flagged as suspicious or something like that in Google. About a month ago, I wrote an article on a topic related to my niche, around a keyword with a difficulty of 41%. The first page for that search term is full of high-authority domains, including a Wikipedia page, and I currently rank in 3rd place.
On the other hand, I would expect to rank easily for keywords with a difficulty of 30-35%, but the exact opposite is happening. The pages I am trying to rank are not spammy; they check out with the Moz tools and with CanIRank's spam filters. Everything is good and green. Plus, the content of those pages has a Content Relevancy Score that varies from 98% to 100%...
Your opinion would be very helpful, thank you.
-
Hi Nikos,
It's important to remember that Keyword Difficulty scores are a Moz metric, not a Google metric - they are based on Moz's ability to judge how well other sites are competing for that term, and may not capture the entire competitive landscape (since nobody except Google knows everything that Google looks at).
Based on your ability to rank well for some terms and not others, it doesn't seem likely to me that you are under any sort of penalty, so much as that Google just isn't ranking you for some terms. In addition to the Keyword Difficulty scores for each term, take a look at which sites rank for the term (you can do this in the SERP Analysis feature of the Keyword Difficulty tool). Ask yourself:
- What kinds of sites rank for this term? For example, if you are an individual business, but all of the sites and pages that are ranking for that term are aggregators or lists of multiple sites, it may be that Google has determined that an individual business site is not a good fit for that query. Similarly, if your page is a blog post and no other blog posts appear in the SERP, Google may have decided that a blog post isn't what people are looking for when they search that term.
- What is the search intent of the query? Based on the other pages that rank, what is the question or task that Google has decided users are trying to answer or complete when they search this term? Does your page do a better job of helping answer that question or complete that task than the other pages that rank?
- What types of content are ranking? Do they all have rich snippets? Are there images, video, shopping or maps results? All of these will tell you more about the kind of content Google thinks will match this query.
- Is there a specific page or website that is ranking for that term that you think you could push out of the top 10? Look for areas of opportunity. For example, maybe there is a site with high authority, but the page that ranks has very low page authority and doesn't fit the query very well. Try to create a page that is better than that page, specifically.
- How closely is the phrase related to your niche? The keywords you are successfully ranking for will tell you which topic areas Google is associating with your site. If you have a whole site about chocolates, it will be harder to rank a page about asparagus, even if the difficulty score is lower.
Also, don't forget to continue promoting your content to earn high-authority links to individual content pieces. Where it makes sense to do so, you may also want to link internally from some of your more popular and successful pages to some of the pages that are struggling.
I hope that helps!
-
Hi!
I have the same question as before
If someone has an idea, I would love to hear it.
-
Hi Nikos! Did EGOL answer your question? If so, please mark his response as a "Good Answer." If not, what questions do you still have?
-
Thanks for your answer.
User experience was one of my first concerns, so I purchased a Bootstrap theme, which actually looks very good and is very user friendly. You can check it here. The pages I am trying to rank look very similar to that one.
Time on site and bounce rate
The average bounce rate is 60%, and the average time on page is 4 minutes and 10 seconds (last month's averages). My site is actually a review site, if that helps somehow. I often receive link requests from other webmasters (meaning other people think my site looks good and its content is good), so overall, I don't think my site deserves these rankings. Unless some "old sins" are chasing me.
-
my site gets pushed back for lower difficulty, higher volume keywords, which literally pisses me off.
We often focus too much on competitive metrics and not enough on the presentation that we are making to our visitors. Many search professionals believe that Google is looking at the behavior of visitors: how long they stay, how far they scroll, how many click in, whether they bookmark, whether they share your site with friends... and, more important... are they asking for you by name in navigational and domain queries?
This is much of the "machine learning" that Google has patented and what they say they are using in some of their new algorithms. I believe this has been important for a long time, and I was willing to stick my neck out and bet my ranch on it a long time ago.
lower difficulty, higher volume keywords
The numbers you are looking at are not based on what visitors think of your site or how they behave; they are based on completely different things. I don't think that Moz or others who publish keyword difficulty estimates have very good ways of determining how visitors behave. Google is the one who has that data, both from the SERPs and from Chrome, and from engagement platforms like bookmarks, Google+, and other things that they either control or can count.
Keyword difficulty is a brute force metric. Visitor satisfaction is much more discerning and very hard to measure.
which literally pisses me off.
How do your visitors feel when they try to use your website? Compare your site to the sites at the top of the SERPs. Do they have better content? Do they give a better visitor experience? Do they have a broader menu? Is their design better for navigation, comfort of reading, scanning, sharing, and all of the things that people want to do on a website? How do visitors feel when they click in?
Lots of people believe that it is really easy to earn good metrics. Really easy. But it is harder than hell to please your visitors. How are you doing there? Take a look and be honest.