Page plummeting with an optimisation score of 97. HELP
-
Hi everyone,
One of my pages has an optimisation score of 93, but ranks in 50+ position. What on earth can I do to address this? It's a course page, so I've added the 'Course' schema. All the image alt text includes the keyword, UX signals aren't bad, the keyword is in the title tag, and it has a meta description. I've also added an extra 7 internal, anchor-rich links pointing at the page this week. Nothing seems to help. Any ideas?
Cheers,
Rhys
-
Hi Nicholas,
Thanks for such a detailed response, very helpful!
One question regarding file size: what would be the largest you would recommend?
Cheers,
Rhys
-
There are a lot of possibilities here. If you have just recently made the page changes you indicated above, I would recommend using the Fetch and Render tool in Google Search Console and then requesting that Google re-index the page. This speeds up the time it takes for Google to re-index your page with the changes you mentioned (I have seen improvements in as little as a day). You may need to verify your website in Google Search Console first, if you have not already, in order to do this.
In addition, it may be beneficial to use more LSI (semantically related) keywords on the page, as it may be "over-optimized". For example, if you have a page on lawn care that you want to rank for "lawn care in _____", try using "lawn services" and "lawn maintenance" in the H2s, image alt text, and body content instead of just re-using "lawn care" 99 times on the page. The length of the page matters too: see if you can add a paragraph or two of new, unique content that mentions your keyword once or twice.
If neither of those works, it's time to start doing some backlink research to see what backlinks the competitors ranking in the top 3-5 positions on Google for your target keyword have. Moz's Open Site Explorer, Ahrefs, or SEMrush will all be a great help with this. I would also do a quick page speed audit: check the page's loading time with Pingdom and/or Google PageSpeed Insights. You may want to decrease the size of photos on the page or leverage caching (you may need the help of a web developer, depending on resources).
On-site SEO is merely one facet of ranking your webpage higher. If the keyword you want to rank for is competitive, you need to pay attention to technical SEO, off-site SEO, and quality backlinks to the page as well, even if you have an "optimization score of 100" in whatever analysis tool you are using. Hope this helps, and best of success!
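On the caching point above: if the site runs on Apache, browser caching for images and stylesheets can be switched on with the mod_expires module. A minimal sketch, assuming an Apache server with mod_expires available; the cache lifetimes are illustrative, not a recommendation:

```apache
# Ask browsers to cache static assets so repeat visits don't
# re-download them (requires mod_expires to be enabled).
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png  "access plus 1 month"
  ExpiresByType image/webp "access plus 1 month"
  ExpiresByType text/css   "access plus 1 week"
</IfModule>
```

This goes in the server config or an `.htaccess` file; re-run PageSpeed Insights afterwards to confirm the "leverage browser caching" warning clears.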
Related Questions
-
Advice / Help on Bad Link Removals
Hey everyone,
I'm new to the community and new to backlinks, hence the question to the community today.
I would like help understanding the options and workload around backlinks and removing them.
I have a client with over 8,000 backlinks; a few years ago he paid someone about £10 to boost his rankings by adding thousands of backlinks.
We fear this is having a bad effect on their site and organic rankings, as 90% of these backlinks have a spam score of over 50% and are also nofollow. My questions to the community (if you could be so kind to share) are:
1. What's the best way to decide if a backlink is worth keeping or removing?
2. Is there a tool somewhere on the internet to decide this or assist with it? I've had advice stating that if it's not hurting the page we should keep it. However, again...
How do I know what damage each backlink is causing to the domain? I appreciate anyone's time to offer some advice to a novice looking to clear these.
White Hat / Black Hat SEO | TheITOteam
Are landing pages making a comeback?
Just recently I have noticed an ever-increasing number of landing pages on websites. The ones I have come across have been in the sports industry, like rugby/football, and their landing pages are sparse but offer the social avenues on a plate. Are landing pages making their way back into the SEO industry?
White Hat / Black Hat SEO | TeamacPaints
Do pingbacks in Wordpress help or harm SEO? Or neither?
Hey everyone, Just wondering, do pingbacks in Wordpress help or harm SEO? Or neither?
White Hat / Black Hat SEO | jhinchcliffe
Article submission, and how to build backlinks for Ecommerce? [HELP]
Hi guys, I have a question. For high-quality backlinks, apparently you go to these article websites, such as Ezine, where you submit your content. However, is it just one article you submit to each, as otherwise it'll look like duplicate content? Also, can I have it on my site first? How does it work? I also run an ecommerce website: how can I build backlinks to each product? There are over 200+ products and 1.6k subcategories. I would like to rank for as many as possible, but getting an SEO company to do this would cost too much. Any ideas on how I should go about it?
White Hat / Black Hat SEO | InkCartridgesFast
Google Results Pages after the bomb
So, ever since Google "went nuclear" a few weeks ago I have seen major fluctuations in search engine results pages. What I am seeing is a settling down and re-inclusion of some of my web properties. I had a client affected by the hack job initially, but about a week later I not only saw my original ranking restored but a litany of other long tails appeared. I wasn't using any shady link techniques, but I did have considerable article distribution that MAY have connected me inadvertently to some of the "bad neighborhoods." The website itself is a great site with original, relevant content, so if it is possible, Google definitely recognized some error in their destructive ranking adjustment and is making good on it for those sites that did not deserve the penalty. Alternatively, it could just be random Google reordering and I got lucky. What are your experiences with the updates?
White Hat / Black Hat SEO | TheGrid
Thoughts on optimising the perfect keyword location link
My site works a bit like a directory. Say I have a page called "Ice Cream Vendors": on that page I would talk a bit about ice cream vendors, then I have a list of ice cream vendor locations. The list of locations can be quite big depending on the product and the number of locations it occurs in, and when you click a location, it goes to a page showing all "Ice Cream Vendors" in that location. So currently I will have a table on the page a bit like this:
ICE CREAM VENDOR LOCATIONS
New York
Miami
Las Vegas
This is all perfectly nice, simple and usable, BUT it is not producing perfect keyword links. For perfect keyword links the list should be like this:
ICE CREAM VENDOR LOCATIONS
New York Ice Cream Vendors
Miami Ice Cream Vendors
Las Vegas Ice Cream Vendors
Now I have my perfect anchor links, BUT it looks ridiculous and is NOT user friendly. So what do I do?
1. Build it for users, not have perfect anchor links, and lose out on SEO?
2. Build perfect SEO links and make the page less usable and spammy-looking?
OR
3. Deliver the search engine the perfect SEO links, and the user the user-friendly version? By this I mean I could do the following.
Search engines (and, I think, screen readers) would see:
ICE CREAM VENDOR LOCATIONS
New York Ice Cream Vendors
Miami Ice Cream Vendors
Las Vegas Ice Cream Vendors
Users would see:
ICE CREAM VENDOR LOCATIONS
New York
Miami
Las Vegas
Now in my view I am doing nothing wrong; I am merely giving the user the most user-friendly version, and giving the search engine more information on the link that the user doesn't need. So, in my view I am doing something that is honest, but what are your thoughts? Has anyone tried to do this? Thanks
White Hat / Black Hat SEO | James77
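A note on option 3: serving crawlers different anchor text than users see is cloaking, which violates Google's guidelines and risks a penalty. A safer pattern serves one version of the markup to everyone and keeps the visible label short while a visually hidden span carries the descriptive context. A minimal, hypothetical sketch (the URLs and the class name are illustrative, not from the thread):

```html
<!-- One markup version for everyone: the visible label stays short,
     while a visually hidden span carries the descriptive context. -->
<h2>Ice Cream Vendor Locations</h2>
<ul>
  <li><a href="/locations/new-york/">New York<span class="visually-hidden"> ice cream vendors</span></a></li>
  <li><a href="/locations/miami/">Miami<span class="visually-hidden"> ice cream vendors</span></a></li>
  <li><a href="/locations/las-vegas/">Las Vegas<span class="visually-hidden"> ice cream vendors</span></a></li>
</ul>

<style>
  /* Standard "visually hidden" pattern: the text remains in the DOM
     (so assistive tech and crawlers see it) but is not drawn on screen. */
  .visually-hidden {
    position: absolute;
    width: 1px;
    height: 1px;
    overflow: hidden;
    clip: rect(0 0 0 0);
    white-space: nowrap;
  }
</style>
```

Because every visitor, human or bot, receives the same HTML, this stays on the honest side of the line the question is asking about.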
For traffic sent by the search engines, how much personalization/customization is allowed on a page if any?
I want to better target my audience, so I would like to be able to address the exact query string coming from the search engine. I'd also like to add relevant sections to the site based on the geo area visitors live in. Can I customize a small portion of the page to fit my visitor's search query and geo area per the IP address? How much can I change a web page to better fit a user and still be within the search engines' guidelines?
White Hat / Black Hat SEO | Thos003
My attempt to reduce duplicate content got me slapped with a doorway page penalty. Halp!
On Friday, 4/29, we noticed that we suddenly lost all rankings for all of our keywords, including searches like "bbq guys". This indicated to us that we were being penalized for something. We immediately went through the list of things that changed, and the most obvious is that we were migrating domains. On Thursday, we turned off one of our older sites, http://www.thegrillstoreandmore.com/, and 301 redirected each page on it to the same page on bbqguys.com. Our intent was to eliminate duplicate content issues.
When we realized that something bad was happening, we immediately turned off the redirects and put thegrillstoreandmore.com back online. This did not unpenalize bbqguys. We've been looking for two days and had not been able to find what we did wrong, at least not until tonight. I just logged back in to Webmaster Tools to do some more digging, and I saw that I had a new message: "Google Webmaster Tools notice of detected doorway pages on http://www.bbqguys.com/"
It is my understanding that doorway pages are pages jammed with keywords and links and devoid of any real content. We don't do those pages. The message does link me to Google's definition of doorway pages, but it does not give me a list of the pages on my site that it does not like. If I could see even one or two pages, I could probably figure out what I am doing wrong. I find this most shocking since we go out of our way not to do anything spammy or sneaky. Since we try hard not to do anything that is even grey hat, I have no idea what could possibly have triggered this message and the penalty.
Does anyone know how to go about figuring out which pages specifically are causing the problem, so I can change them or take them down? We are slowly canonicalizing URLs and changing the way different parts of the sites build links to make them all the same, and I am aware that these things need work.
We were in the process of discontinuing some sites and 301 redirecting pages to a more centralized location to try to stop duplicate content. The day after we instituted the 301 redirects, the site we were redirecting all of the traffic to (the main site) got blacklisted. Because of this, we immediately took down the 301 redirects. Since the Webmaster Tools notifications are different (i.e. "too many URLs" is a notice-level message and "doorway pages" is a separate alert-level message), and the too-many-URLs notice has been triggering for a while now, I am guessing that the doorway pages problem has nothing to do with URL structure. According to the help files, doorway pages are a content problem with specific pages. The architecture suggestions are helpful, and they reassure us that we should be working on them, but they don't help me solve my immediate problem.
I would really be thankful for any help identifying the pages that Google thinks are "doorway pages", since this is what I am being immediately and severely penalized for. I want to stop doing whatever it is I am doing wrong; I just don't know what it is! Thanks for any help identifying the problem! It feels like we got penalized for trying to do what we think Google wants. If we could figure out what a "doorway page" is, and how our 301 redirects triggered Googlebot into saying we have them, we could more appropriately reduce duplicate content. As it stands now, we are not sure what we did wrong. We know we have duplicate content issues, but we also thought we were following webmaster guidelines on how to reduce the problem, and we got nailed almost immediately when we instituted the 301 redirects.
White Hat / Black Hat SEO | CoreyTisdale
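For anyone attempting a similar page-by-page domain consolidation, the usual mechanism on an Apache server is a blanket 301 rule that maps every path on the old domain to the same path on the new one. A minimal sketch, assuming Apache with mod_rewrite enabled; the domain names follow the thread, and this is not a claim about what the poster's actual configuration looked like:

```apache
# .htaccess on the old domain (thegrillstoreandmore.com):
# permanently (301) redirect every request to the same path
# on the new domain (requires mod_rewrite to be enabled).
<IfModule mod_rewrite.c>
  RewriteEngine On
  RewriteCond %{HTTP_HOST} ^(www\.)?thegrillstoreandmore\.com$ [NC]
  RewriteRule ^(.*)$ https://www.bbqguys.com/$1 [R=301,L]
</IfModule>
```

Done this way, each old URL maps to exactly one new URL, which is the pattern Google's documentation recommends for site moves; redirecting many old pages to a single generic page is what tends to look doorway-like.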