Keyword optimisation: Google's eyes before users' eyes?
-
Hi all,
So the default and ultimate advice for ranking a page highly is to be favoured by users, and thereby by Google. But if we write content purely for users, it may miss keywords, or lack the keyword density and variety needed to catch Google's eye. Then we may end up around the 3rd page; how do we get into the top slots from there? I can also see some top results without a single mention of the keyword they are ranking for. How is that possible?
Thanks
-
Hi there
The only way to "get the top spots" in Google is to make sure your site abides by SEO best practices, that great content is being written, and that you are building great backlinks. If you conduct proper keyword research for your content, you should be able to find a healthy balance between the language your target audience uses and the keywords that get a healthy amount of search volume. Use these words and phrases naturally in your content and you'll be just fine.
Remember: Google is fantastic at making connections between topics and their variations. Once you have written content that methodically uses these keywords and your pages are optimized, look into your link-building strategy to begin distributing your content more effectively.
Also, don't forget to conduct ongoing audits of your website, as this will help ongoing search engine performance (and skipping them will hinder it).
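As a rough illustration of the "use keywords naturally" advice above, here is a minimal sketch of checking how often target phrases actually appear in a page's copy. This is a hypothetical helper, not a Moz tool; the sample copy and the `keyword_stats` function are made up for illustration:

```python
import re
from collections import Counter

def keyword_stats(text, phrases):
    """Count how often each target phrase appears in page copy,
    and report a rough density (phrase words per 100 words of copy)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    total = len(words)
    lowered = text.lower()
    stats = {}
    for phrase in phrases:
        count = lowered.count(phrase.lower())
        density = 100 * count * len(phrase.split()) / total if total else 0
        stats[phrase] = {"count": count, "density_pct": round(density, 2)}
    return stats

page_copy = (
    "Our keyword research guide explains keyword research step by step, "
    "so you can do keyword research without stuffing keywords."
)
print(keyword_stats(page_copy, ["keyword research"]))
```

Numbers like these are only a sanity check; if the density looks high even to you, it will read as stuffing to users and to Google.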
Hope this all helps! Good luck!
Patrick
Related Questions
-
How much importance does Google give to usability issues in subdirectories or subdomains?
Hi Moz community, as usability issues like page speed and mobile responsiveness play a key role in website rankings, I wonder how much the same factors matter for subdirectory or subdomain pages. Must each and every page of a subdirectory or subdomain be optimised like the main website's pages? Does Google give them the same importance? Thanks
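One practical way to answer this for yourself: Google's public PageSpeed Insights v5 API scores any URL you hand it, so subdirectory and subdomain pages can be audited individually, exactly like root pages. A minimal sketch that only builds the request URLs (the page list is hypothetical):

```python
from urllib.parse import urlencode

# Public PageSpeed Insights v5 endpoint.
PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_request_url(page_url, strategy="mobile"):
    """Build a PageSpeed Insights v5 request URL for one page.
    Each URL is scored on its own, so subdirectory and subdomain
    pages get the same speed/mobile checks as any other page."""
    return PSI_ENDPOINT + "?" + urlencode({"url": page_url, "strategy": strategy})

# Hypothetical pages on a subdirectory and on a subdomain:
pages = [
    "https://example.com/blog/post-1",
    "https://shop.example.com/item-1",
]
for p in pages:
    print(psi_request_url(p))
```

Fetching those URLs (optionally with an API key appended) returns the same Lighthouse-style report you would get for the homepage.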
Algorithm Updates | vtmoz -
Very strange, inconsistent and unpredictable Google ranking
I have been searching through these forums and haven't come across someone facing the same issue I am. The folks on the Google forums are certain this is an algorithm issue, but I just can't see the logic in that, because this appears to be an issue fairly unique to me. I'll take you through what I've gone through. Sorry for it being long. Website URL: https://fenixbazaar.com
1. In early February, I made the switch to https with some small hiccups. Overall the move was smooth: redirects were all in place, the sitemap was fine, and indexing was fine.
2. One night, my organic traffic dropped by almost 100%. All of my top-ranking articles completely disappeared from rank. Top keyword searches no longer yielded my best-performing articles on the front page of results, nor on the last page. My pages were still being indexed, but keyword searches weren't returning them. I went from 70-100 active users to 0.
3. The next morning, everything was fine. Traffic was back up, and top keywords were yielding results for my site on the front page. All was back to normal. The only problem was that the same thing happened that night, and again for the next three nights. Up and down.
4. I had a developer and SEO guy look into my backend to make sure everything was okay. He said there were some redirection issues, but nothing that would cause such a significant drop. No errors in Search Console. No warnings.
5. Eventually the issue stopped and my traffic recovered. Then everything went great: the site was accepted into Google News, I installed AMP pages perfectly, and my traffic boomed for almost 2 weeks.
6. At this point, numerous issues with my host provider, price increases, and an incredibly outdated cPanel forced me to change hosts. I did so without any issues, although I lost a number of articles (albeit low-traffic ones) in the move. These now return 404s and are no longer indexed in the sitemap.
7. After the move there were a number of AMP errors, which I resolved; I now sit at 0 errors. Perfect... or so it seemed.
8. Last week I applied for HSTS preload and am awaiting submission. My site was in working order and appeared set to be submitted. I applied after I changed hosts.
9. The past 5 days or so saw good traffic, fantastic traffic to my AMP pages, great Google News tracking, and linking from high-authority sites. Good performance all round.
10. I woke up this morning to find 0 active people on my site. I did a Google search and noticed my site isn't even the first result when I search for its own name! My site is still indexed, but searches do not return my pages. I checked Search Console and realised the sitemap had been "processed" yesterday with most pages indexed, which is weird because it was submitted and processed about a week earlier. I resubmitted the sitemap and it appears to have been processed and approved immediately. No change to search results.
11. All top-ranking content that previously placed in the carousel or "Top Stories" in Google News is gone. Top-ranking keywords no longer bring back results with my site: I went through the top 10 ranking keywords for my site, and my pages don't appear anywhere in the results, going as far back as page 20 (the last page). The pages are still indexed when I check, but they simply don't appear in search results. It's happening all over again!
Is this an issue any of you have heard of before? Where a site is still being indexed, but has been completely removed from search results, only to return within a few hours? Up and down? I suspect it may be a technical issue, first with the move to https, and now with changing hosts. The fact the sitemap says it was processed yesterday suggests maybe it was updated to remove the 404s (there were maybe 10), and now Google is attempting to re-index. Could this be viable?
The reason I am skeptical of it being an algorithm issue is that within a matter of hours my articles are ranking again for certain keywords, and this issue has only happened after a change to the site was applied. Any feedback would be greatly appreciated 🙂
Algorithm Updates | fenixbazaar -
Reduced traffic from Google News
I have a question about my site: can you advise me on this situation? Over the last month, my site's traffic from Google and Google News has dropped, and I don't know why. I need your advice. Thanks. Yunus Altindag, Diyarbakir/Turkey
Algorithm Updates | maxbilgisayar -
Google is forcing a 301 by truncating our URLs
Just recently we noticed that Google has indexed truncated URLs for many of our pages that get 301'd to the correct page. For example, we have:
http://www.eventective.com/USA/Massachusetts/Bedford/107/Doubletree-Hotel-Boston-Bedford-Glen.html
as the URL linked everywhere, and that's the only version of that page that we use. Google somehow figured out that it would still reach the right place via 301 if it removed the html filename from the end, so it indexed just:
http://www.eventective.com/USA/Massachusetts/Bedford/107/
The 301 is not new. That path used to 404, but (probably 5 years ago) we saw a few links come in with the html file missing on similar URLs, so we decided to 301 them instead, thinking it would be helpful. We've preferred the longer version because it has the name in it, and users who pay attention to the URL can feel more confident they are going to the right place. We've always used the full (longer) URL, and Google used to index them all that way, but just recently we noticed about half of our URLs have been converted to the shorter version in the SERPs. These shortened URLs take the user to the right page via 301, so it isn't a case of the user landing in the wrong place, but over 100,000 301s may not be so good. If you look at site:www.eventective.com/usa/massachusetts/bedford/ you'll notice all of the URLs to businesses at the top of the listings go to the truncated version, but toward the bottom they have the full URL.
Can you explain to me why Google would index a page that is 301'd to the right page, and has been for years? I have a lot of thoughts on why they would do this, and even more ideas on how we could build our URLs better, but I'd really like to hear from some people who aren't quite as close to it as I am.
One small detail that shouldn't affect this, but I'll mention it anyway: we have a mobile site with the same URL pattern, http://m.eventective.com/USA/Massachusetts/Bedford/107/Doubletree-Hotel-Boston-Bedford-Glen.html. We did not have the proper 301 in place on the m. site until the end of last week. I'm pretty sure it will be asked, so I'll also mention we have the rel=alternate/canonical set up between the www and m. sites. I'm also interested in any thoughts on how this may affect rankings, since we seem to have been hit by something toward the end of last week. Don't hesitate to mention anything else you see that may have triggered whatever may have hit us. Thank you,
Michael
Algorithm Updates | mmac -
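The redirect behaviour described in this question can be sketched as a tiny lookup: the truncated directory path gets one permanent 301 to its single canonical .html URL, while the canonical URL itself is served directly. This is a hypothetical illustration of the setup, not Eventective's actual code:

```python
# Map truncated directory paths to their canonical full URLs
# (one hypothetical entry, mirroring the example in the question).
CANONICAL = {
    "/USA/Massachusetts/Bedford/107/":
        "/USA/Massachusetts/Bedford/107/Doubletree-Hotel-Boston-Bedford-Glen.html",
}

def resolve(path):
    """Return (status, location) for an incoming request path.
    Truncated paths get a single permanent 301 to the canonical URL;
    canonical URLs are served (200); anything else is a 404."""
    if path in CANONICAL:
        return 301, CANONICAL[path]
    if path in CANONICAL.values():
        return 200, path
    return 404, None

print(resolve("/USA/Massachusetts/Bedford/107/"))
```

With a scheme like this, every truncated URL resolves in exactly one hop, which is what Google sees when it tests the shortened variants.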
Why has my homepage been replaced in Google by my Facebook page?
Hi. I was wondering if others have had this happen to them. Lately, I've noticed that on a couple of my sites the homepage no longer appears in the Google SERP. Instead, a Facebook page I've created appears in the position the homepage used to get. My subpages still get listed in Google--just not the homepage. Obviously, I'd prefer that both the homepage and Facebook page appear. Any thoughts on what's going on? Thanks for your help!
Algorithm Updates | TuxedoCat -
What do you think Google analyzes for SERP ranking?
I've been doing some research trying to figure out how the Google algorithm works. The one thing that is constant is that nothing is constant. This makes me believe that Google takes a variable that all sites have and divides by that number. One example would be taking the load time in milliseconds and dividing it by the total number of points the website scored. This would give all of the websites a random-looking appearance, since that variable would throw off all the other constants. I'm going to continue doing research, but I was wondering what you guys think matters in the Google algorithm. -Shane
Algorithm Updates | Seoperior -
Is there a way to pull historical rankings for a keyword?
I have someone who's come to me and said that they have lost all of their organic keyword rankings. They did launch a site redesign a few months back, so that could be a reason why. But after looking at the site, link profile, etc., it doesn't look like they could have been ranking for the terms they say they were. They have never implemented any SEO on their sites, by the way. I did not build this site and have not done any SEO; they are coming to me to solve the problem. I did notice in SEMrush that a couple of months ago they were ranking organically for more terms (20 in July vs. 5 now), so they did lose some. Is there any way to see what terms they WERE ranking for?
Algorithm Updates | MichaelWeisbaum
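To make the "what terms WERE they ranking for" part concrete: if a rank tracker like SEMrush lets you export two snapshots (say, July vs. now), the lost terms are just a set difference, and the demoted ones a simple comparison. A sketch with hypothetical keyword data:

```python
# Hypothetical exports from a rank tracker: keyword -> ranking position.
july = {"blue widgets": 8, "widget repair": 15, "buy widgets online": 22,
        "widget store": 9, "cheap widgets": 31}
now = {"blue widgets": 12, "widget store": 40}

# Terms that ranked in July but return no position now.
lost = sorted(set(july) - set(now))

# Terms still ranking, but at a worse (numerically higher) position.
dropped = {k: (july[k], now[k]) for k in july if k in now and now[k] > july[k]}

print("lost:", lost)
print("dropped:", dropped)
```

The same diff works against any historical source you can get: old tracker exports, Search Console performance downloads, or archived reports the client may still have.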