Onpage Optimisation Changes
-
Hi Guys,
I love the SEO world, I really do, but sometimes it can be quite confusing. Even after 11 years and a fair few clients under my belt, I still have head-scratching days, and this week has been one of them.
It seems the rules surrounding on-page optimisation of keywords have changed quite a lot this year. I understand that blatantly sticking to a 3% keyword density hasn't been good practice for a while, and that with RankBrain and machine learning we have to pay attention to semantic words and phrases, but it seems there is a new set of rules I haven't learnt yet.
For example, I was working on a client site and we noticed that although they were ranking quite high for a keyword phrase, it wasn't actually mentioned in the text at all. So we added it in a place where it made sense, expecting to lift this and other keywords. Here is what happened: within a week their main keyword moved down from 1st to about 6th, and the keyword that hadn't been added moved from 4th to 23rd. After scratching my head and then going into full panic mode, I calmed down and looked at competitors. They didn't mention the word in their content either, so I decided to remove the one word we had added to the text. The rankings came back overnight (well, after doing a Fetch as Google and getting the page reindexed). So if keyword density is now clearly NOT a metric to go on, how do we find the sweet spot? Do we use something like Ryte and make sure we're using semantics and keywords within the average of the top ten? Does what Google deems important depend on the niche?
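For anyone who wants to put a number on what I mean by density, here is the rough sketch I use (Python, purely for illustration — the tokenisation and the example page are my own assumptions, not any official formula):

```python
# Rough sketch only: my own tokenisation assumptions, not an official formula.
import re

def keyword_density(text: str, phrase: str) -> float:
    """Occurrences of `phrase` per 100 words of `text`."""
    words = re.findall(r"[a-z']+", text.lower())
    target = phrase.lower().split()
    if not words or not target:
        return 0.0
    n = len(target)
    # Slide a window of len(phrase) words over the text and count exact matches.
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == target)
    return 100.0 * hits / len(words)

page = ("We rent sports centre equipment. "
        "Our equipment rental covers nets, balls and mats.")
print(round(keyword_density(page, "equipment rental"), 2))  # → 7.69
```

Run that over your page and the top ten and you can at least compare like with like, rather than guessing.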
There's no right or wrong answer here; I'm just interested in your thoughts.
Regards
Neil
-
Hi All,
I will just give you all a quick update. I am more confused than when I first started!! I am telling my team from now on: if it's already on the first page, we don't touch the keyword. Of course, other keywords sitting on page 2 may also be trying to rank for the same page, so it's going to be a tricky one.
I have just done a quick experiment on an ecommerce page. The keyword was on page 6; the changes I made yesterday pushed it to page 7, and further tweaks pushed it to page 9!! This is mental! I am using Ryte to make sure semantics are mentioned, so maybe using this tool and targeting the average number of mentions of each keyword and related keyword is not the best idea. But without a guide, how do we know what to optimise for?
I will give it a week, as Nigel said above, and see what happens. At least I know that resetting any declined keywords to their original values before we started will work.
Regards
Neil
-
No problem Neil, on-site SEO often makes me feel the same! It's easy to go from "following best SEO practices" to "over-optimization", so constant testing, checking results, and always striving to learn what works today is essential. Best of success!
-
Hi Neil
That's really interesting, and I have seen similar things happen. Google is determining user intent from each keyword and phrase typed into search, so the results will vary considerably with each semantic search. There is also a possible stuffing issue to consider, not just between identical keywords but between semantics. If adding that single word overplayed the whole piece, then the probability is that you were trying just that bit too hard. The whole thing has become very flaky, and I have seen wild movements up and down which then appear to find a level after a week or even two.
For some clients I have seen wild movements for pages I haven't even touched - literally from position 5 to 83 and then back again within a week. Whatever Google is doing is unsettling - wrong even. It's hard enough for experienced SEOs but if you are a website owner with no experience of SEO it must be rather disheartening to experience the madness and start scratching your head looking for an explanation when there really isn't one.
-
Wow, so what you're suggesting is that not only does keyword density still seem to apply, but that it varies depending upon the specific keyword?
Wow! If that indeed turns out to be true after the conclusion of your further analysis, that sounds like it could be a real headache for the SEO world when it comes to onsite, lol!
-
Hi Nicholas,
Thank you for making me feel like I’m not going completely mad! I’m running an experiment now on a client site, looking at a mixture of the average keyword density of the top ten for a keyword plus searcher intent, using the semantic words that the top ten have in common.
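For anyone curious, the "semantic words the top ten have in common" part of my experiment is roughly this (a sketch only — the stopword list, the example snippets, and the page threshold are my own assumptions):

```python
# Sketch only: stopword list, sample pages and min_pages threshold are assumptions.
import re
from collections import Counter

STOPWORDS = {"the", "a", "and", "of", "to", "in", "for", "is", "on"}

def common_terms(competitor_texts, min_pages=2):
    """Terms appearing on at least `min_pages` of the competitor pages."""
    page_counts = Counter()
    for text in competitor_texts:
        terms = set(re.findall(r"[a-z']+", text.lower())) - STOPWORDS
        page_counts.update(terms)  # a page counts each term at most once
    return {t for t, c in page_counts.items() if c >= min_pages}

pages = [
    "bangkok skytrain map and bts stations",
    "guide to the bts skytrain in bangkok",
    "bangkok mrt and bts route map",
]
print(sorted(common_terms(pages)))  # → ['bangkok', 'bts', 'map', 'skytrain']
```

The idea is just to surface the vocabulary the whole top ten shares, rather than chasing any one competitor's word count.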
Remember the days when you did the on-page in month 1 and then just built 300 backlinks each month! No skill, but easier times!! Haha!!
Regards
Neil
-
Very interesting situation Neil. It sounds like you did the right thing by looking at what the competition was doing and then making one change at a time, while using the Fetch & Render tool in GSC to see results faster.
While keyword density is a factor, Google is constantly trying to figure out the "searcher intent", and it is possible that adding a keyword that seemed like a no-brainer changed Google's perspective of the page's searcher intent. Just spitballing here of course, as there are a ton of potential on-site factors at play. It sounds like you have the right mindset, though. I wouldn't totally throw keyword density out; just focus more on "content depth" and on helping solve the searcher intent.