Page plummeting with an optimisation score of 97. HELP
-
Hi everyone,
One of my pages has an optimisation score of 93, but it ranks in position 50+. What on earth can I do to address this? It's a course page, so I've added the 'Course' schema. All the image alt attributes include the keyword, UX signals aren't bad, the keyword is in the title tag, and it has a meta description. This week I also added an extra seven internal, anchor-rich links pointing at the page. Nothing seems to help. Any ideas?
Cheers,
Rhys
-
Hi Nicholas,
Thanks for such a detailed response, very helpful!
One question, regarding file size, what would be the largest that you would recommend?
Cheers,
Rhys
-
There are a lot of possibilities here. If you have only recently made the page changes you describe, I would recommend using the Fetch and Render tool in Google Search Console and then requesting re-indexing of the page. This speeds up the time it takes Google to re-index the page with your changes (I have seen improvements in as little as a day). You may need to verify your website in Google Search Console first, if you have not already done so.
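Before requesting re-indexing, it's also worth confirming the page isn't blocked from crawling in the first place. A minimal sketch using Python's standard-library `urllib.robotparser` (the robots.txt contents and paths below are made up for illustration; in practice you would fetch your own site's robots.txt):

```python
from urllib import robotparser

# Hypothetical robots.txt contents -- substitute a fetch of
# https://yourdomain.com/robots.txt in practice.
ROBOTS_TXT = """
User-agent: *
Disallow: /private/
"""

def is_crawlable(url_path: str, user_agent: str = "Googlebot") -> bool:
    """Return True if the given path is not blocked for the crawler."""
    parser = robotparser.RobotFileParser()
    parser.parse(ROBOTS_TXT.splitlines())
    return parser.can_fetch(user_agent, url_path)

print(is_crawlable("/courses/seo-course"))   # True -- not disallowed
print(is_crawlable("/private/draft-page"))   # False -- blocked
```

If a page you expect to rank comes back blocked, no amount of on-page optimisation or re-index requests will help until the disallow rule is fixed.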
In addition, it may be beneficial to use more LSI (semantically related) keywords on the page, as it may be "over-optimized". For example, if you have a page on lawn care that you want to rank for "lawn care in _____", try using "lawn services" and "lawn maintenance" in the H2s, image alt text, and body copy instead of repeating "lawn care" 99 times on the page. The length of the page matters too: see if you can add a paragraph or two of new, unique content that mentions your keyword once or twice.
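As a rough way to spot over-optimization, you can measure what share of the page's words a target phrase accounts for. A stdlib-only Python sketch (the sample text and phrase are invented for illustration; real checks would run on the page's extracted body copy):

```python
import re
from collections import Counter  # handy if you also want per-word tallies

def keyword_density(text: str, phrase: str) -> float:
    """Fraction of total words taken up by occurrences of `phrase`."""
    words = re.findall(r"[a-z']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    # Slide a window of len(phrase) words across the text and count matches.
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return (hits * n) / len(words) if words else 0.0

sample = ("Our lawn care team offers lawn care plans, lawn services "
          "and lawn maintenance for every lawn care budget.")
print(f"{keyword_density(sample, 'lawn care'):.0%}")  # -> 33%
```

There is no official threshold, but if one phrase accounts for a third of your copy, as in the sample above, varying it with related terms is usually the safer bet.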
If neither of those works, it's time to do some backlink research to see what backlinks your competitors ranking in the top 3-5 positions for your target keyword have. Moz's Open Site Explorer, Ahrefs, or SEMrush will all help with this. I would also do a quick page-speed audit: check the page's loading time with Pingdom and/or Google PageSpeed Insights. You may want to reduce the size of photos on the page or leverage browser caching (you may need the help of a web developer, depending on resources).
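PageSpeed Insights can also be queried programmatically. A minimal sketch that builds the request URL for the public API (the endpoint shown is Google's v5 PageSpeed Insights API as I understand it; `YOUR_API_KEY` and the page URL are placeholders, and you would pass the result to any HTTP client):

```python
from urllib.parse import urlencode

# Google PageSpeed Insights API endpoint (v5 at time of writing).
PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_request_url(page_url: str, api_key: str, strategy: str = "mobile") -> str:
    """Build the GET URL for a PageSpeed Insights audit of `page_url`."""
    params = {"url": page_url, "strategy": strategy, "key": api_key}
    return f"{PSI_ENDPOINT}?{urlencode(params)}"

print(psi_request_url("https://example.com/course", "YOUR_API_KEY"))
```

Fetching that URL returns a JSON report including Lighthouse scores and image-compression opportunities, which makes it easy to re-audit the page after each round of fixes.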
On-site SEO is merely one facet of ranking a webpage. If the keyword you want to rank for is competitive, you also need to pay attention to technical SEO, off-site SEO, and quality backlinks to the page, even if you have an "optimization score of 100" in whatever analysis tool you are using. Hope this helps, and best of luck!