Weird behavior with site's rankings
-
I have a problem with my site's rankings.
I rank for higher difficulty (but lower search volume) keywords, but my site gets pushed back for lower difficulty, higher volume keywords, which literally pisses me off. I have seriously thought about starting fresh with a new domain name, because whatever I do doesn't seem to work.
I will admit that in the past (2-3 years ago) I used some of those "SEO packages" I had found, but those links, which numbered no more than 50, are all deleted now, and the domains are disavowed.
The only thing I can think of is that somehow my site got flagged as suspicious or something like that in Google. About a month ago, I wrote an article on a topic related to my niche, around a keyword with a difficulty of 41%. The first page for that search term is full of high-authority domains, including a Wikipedia page, and I currently rank in 3rd place.
On the other hand, I would expect to rank easily for keywords with a difficulty of 30-35%, but the exact opposite is happening. The pages I'm trying to rank are not spammy; I've checked them with Moz tools and with CanIRank's spam filters, and everything comes back green. Plus, the content of those pages has a Content Relevancy Score that varies from 98% to 100%...
Your opinion would be very helpful, thank you.
-
Hi Nikos,
It's important to remember that Keyword Difficulty scores are a Moz metric, not a Google metric. They are based on Moz's ability to judge how well other sites are competing for a term, and they may not capture the entire competitive landscape (since nobody except Google knows everything that Google looks at).
Based on your ability to rank well for some terms and not others, it doesn't seem likely to me that you are under any sort of penalty, so much as that Google just isn't ranking you for some terms. In addition to the Keyword Difficulty scores for each term, take a look at which sites rank for the term (you can do this in the SERP Analysis feature of the Keyword Difficulty tool). Ask yourself:
- What kinds of sites rank for this term? For example, if you are an individual business, but all of the sites and pages that are ranking for that term are aggregators or lists of multiple sites, it may be that Google has determined that an individual business site is not a good fit for that query. Similarly, if your page is a blog post and no other blog posts appear in the SERP, Google may have decided that a blog post isn't what people are looking for when they search that term.
- What is the search intent of the query? Based on the other pages that rank, what is the question or task that Google has decided users are trying to answer or complete when they search this term? Does your page do a better job of helping answer that question or complete that task than the other pages that rank?
- What types of content are ranking? Do they all have rich snippets? Are there images, video, shopping or maps results? All of these will tell you more about the kind of content Google thinks will match this query.
- Is there a specific page or website that is ranking for that term that you think you could push out of the top 10? Look for areas of opportunity. For example, maybe there is a site with high authority, but the page that ranks has very low page authority and doesn't fit the query very well. Try to create a page that is better than that page, specifically.
- How closely is the phrase related to your niche? You can tell from the keywords you are successfully ranking for, which topic areas Google is associating with your site. If you have a whole site about chocolates, it will be harder to rank a page about asparagus, even if the difficulty score is lower.
Also, don't forget to continue promoting your content to earn high-authority links to individual content pieces. Where it makes sense to do so, you may also want to link internally from some of your more popular and successful pages to some of the pages that are struggling.
I hope that helps!
-
Hi!
I have the same question as before.
If someone has an idea, I would love to hear it.
-
Hi Nikos! Did EGOL answer your question? If so, please mark his response as a "Good Answer." If not, what questions do you still have?
-
Thanks for your answer.
User experience was one of my first concerns, so I purchased a Bootstrap theme, which actually looks very good and is very user friendly. You can check it here. The pages I'm trying to rank look very similar to that one.
Time on site and bounce rate
The average bounce rate is 60%, and the average time on page is 4 minutes and 10 seconds (averages from last month). My site is actually a review site, if that helps. I often receive link requests from other webmasters (meaning other people think my site and its content look good), so overall, I don't think my site deserves these rankings. Unless some "old sins" are chasing me.
-
"my site gets pushed back for lower difficulty, higher volume keywords, which literally pisses me off."
We often focus too much on competitive metrics and not enough on the presentation we are making to our visitors. Many search professionals believe that Google is looking at the behavior of visitors: how long they stay, how far they scroll, how many click through, whether they bookmark, whether they share your site with friends... and, more important... are they asking for you by name in navigational and domain queries?
This is much of the "machine learning" that Google has patented and what they say they are using in some of their newer algorithms. I've believed this has been important for a long time, and I was willing to stick my neck out and bet my ranch on it long ago.
"lower difficulty, higher volume keywords"
The numbers you are looking at are not based on what visitors think of your site or how they behave; they are based on completely different things. I don't think that Moz or others who publish keyword difficulty estimates have very good ways of determining how visitors behave. Google is the one with that data, both from the SERPs and from Chrome, and from engagement platforms like bookmarks and +1s and other things that they either control or can count.
Keyword difficulty is a brute force metric. Visitor satisfaction is much more discerning and very hard to measure.
"which literally pisses me off."
How do your visitors feel when they try to use your website? Compare your site to the sites at the top of the SERPs. Do they have better content? Do they give a better visitor experience? Do they have a broader menu? Is their design better for navigation, comfort of reading, scanning, sharing, and all of the other things that people want to do on a website? How do visitors feel when they click in?
Lots of people believe that it is really easy to earn good metrics. Really easy. But it is harder than Hell to please your visitors. How are you doing there? Take a look and be honest.