Help identifying cause for total rank loss
-
Hello,
Last week I noticed one of my pages decreased in rank for a particular query from #8 to #13. Although I had recently made a few minor edits to the page (added an introductory paragraph and a left-column promo to increase word count), I attributed the decrease to a few newly ranked pages that I hadn't seen before.
In an attempt to regain my original position, I tried to optimize the meta title for the singular form of the word. After making this change, I fetched and rendered the page as Google (status = partial) and submitted the page for indexing (URL only, not including on-page links). Almost immediately after submitting, the page dropped from #13 to outside the top 50.
I've since changed the meta title back to what it was originally and let Google crawl and index the page on its own, but the page is still not in the top 50. Could the addition of the page description and left-column promos have tipped the scales toward keyword stuffing? If I change everything back to the way it was originally, is it reasonable to think I should regain my original position below the new pages? Any insights would be greatly appreciated!
-
I'll certainly bring it up. Maybe we can use the general concept and agree on a page title that accomplishes the same goal without sacrificing brand consistency.
Thanks again for the great suggestions.
-
lol... I understand.
Still, I would toss something like that out at the next meeting and tell them it could be rocket fuel. :0
-
That's a very interesting approach! I like the idea, and it may well be effective in gaining clicks and thus positively impacting our rank, but our organization is very focused on maintaining a high level of professionalism, so informal title tags may not be suitable for us. I appreciate the suggestion, though!
-
We don't even think about singular or plural any more when writing title tags.
Here is a suggestion. When you visit a news site and see the content promotion ads from Taboola, Outbrain and others, read their captions. Those captions are designed to elicit clicks, often to the point of being spammy, suggestive, explicit or vulgar. The people who write those captions are experts. They know how to get the clicks, and getting the clicks, I believe, is a huge ranking factor in the organic SERPs.
I have learned a lot from them, and although I really dislike a lot of their ads, they can give you an education. So, if I were selling foam rollers, I might consider title tags like...
Foam Rollers - Get yo' bones straight
Foam Rollers for when your back is out of whack!
Those are just very quick efforts, and you can certainly do better because you know the product and how people use it. If there were a lot of money to be made selling foam rollers, I would be spending a LOT of time to get a great title tag that makes people want to click into your webpage.
-
Awesome, Joel! Keep me posted if you need any more help!
-
Thanks for answering. Luckily we have other pages that rank on page 2 for that term. They are not really optimized for that keyword, but I'm reluctant to make any changes at this point!
Focusing on a readable message makes the most logical sense. We second-guessed this strategy based on the decreasing rank and thought we might be losing out by not including the singular form of the word, which is searched about 10 times more often than the plural.
-
^ This is a great practice.
-
"is it reasonable to think the page would return to a similar position next time it is indexed?"
I can't give you a confident answer. I know of situations where people made a lot of title tag changes in a short amount of time and the pages went deep into the SERPs and did not recover to any reasonable rankings for a long time.
At our office we rarely tweak title tags; when we do, we enter the details into our SEO log and wait a month or more before trying something new. Also, when we write title tags today we focus on a very readable message that is enticing instead of gunning for keywords as was common a few years back.
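If it's useful, here's a rough sketch in plain Python of the kind of dated SEO log entry I mean. The URL and title values in it are hypothetical examples (loosely borrowed from the foam roller discussion below), not anyone's real data:

```python
# A minimal sketch of an "SEO log": record each change with the date it was
# made and a date to review results, so only one tweak gets judged at a time.
# The URL and title values below are hypothetical placeholders.
from datetime import date, timedelta

seo_log = []

def log_change(url, field, old_value, new_value, wait_days=30):
    """Append one dated change record with a review date wait_days out."""
    today = date.today()
    seo_log.append({
        "date": today.isoformat(),
        "url": url,
        "field": field,
        "old": old_value,
        "new": new_value,
        "review_after": (today + timedelta(days=wait_days)).isoformat(),
    })

log_change(
    "https://www.example.com/foam-rollers",
    "title_tag",
    "Foam Rollers | OPTP",
    "Foam Rollers - Free Shipping on 100+ Varieties | OPTP",
)
print(seo_log[-1])
```

Nothing fancy — the point is the one-change-at-a-time discipline and the built-in waiting period.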
-
Hi there,
Thank you for the quick reply! I am actually getting my rankings from a keyword tracker I have configured in Moz, as I realize performing a search myself would yield inaccurate results due to personalized search.
As for the sitemap, it has been properly submitted and is updated on the first of each month. Canonical tags are also set up correctly and have not been recently modified.
I'll take a look at some of the other links to see if I can find any helpful information there.
Thanks!
Joel
-
Hi there,
Thanks for the reply. I thought the same thing about an adjustment to the title tag making only a small difference, which is why I was so surprised when the page dropped in rank so significantly. Unfortunately, the multiple changes to the page make it impossible to isolate the detrimental factor, but if I revert all of the changes, is it reasonable to think the page would return to a similar position the next time it is indexed?
-
You made a number of changes in a very short time. Keep in mind that Google has lots of different servers delivering SERPs pulled from different databases, with experimental algorithms running all of the time, new data going in, old data going out, and normal daily flux overprinting all of that.
To determine what is really happening, make one change and wait at least a couple of weeks before you make any judgement.
Also, for a query like "foam roller" there is going to be some competition, enough that a small tweak to a title tag is unlikely to make a huge difference.
If you change "Foam Rollers | OPTP" to "Foam Roller | Shop Premium Foam Rollers | OPTP".... that is a really small tweak that does nothing but make it a little repetitive. I would, instead go for something that.....
*** adds a value proposition like "free shipping"
*** elicits clicks, such as "over 100 varieties"
*** or broadens your keyword reach, such as "for therapy and massage" or "for painting and texture finishing", depending upon what type of rollers you sell.
-
Hi there,
One of the best ways to track your positions is to pay attention to your Average Position in Google Webmaster Tools. Reason being, with Google Personalized Search, the rankings you see when you do a search are going to be completely different from the ones I would see. You can learn more about personalized search from this Whiteboard Friday. What Google Webmaster Tools does here is take your keywords and queries across multiple factors (location, device, browsing/search history, etc.) and return an average position for your site.
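If it helps to picture how that blended number comes together, here's a rough, back-of-the-envelope sketch in plain Python with made-up numbers — not Google's actual calculation — of an impressions-weighted average position across different segments:

```python
# A minimal sketch (not an official Google formula) of how an "average
# position" can be blended: each segment contributes its position weighted
# by impressions, producing one number from many different result sets.

def average_position(rows):
    """rows: list of dicts with 'impressions' and 'position' keys."""
    total_impressions = sum(r["impressions"] for r in rows)
    if total_impressions == 0:
        return None
    weighted = sum(r["impressions"] * r["position"] for r in rows)
    return weighted / total_impressions

# Hypothetical data for one keyword, split by segment (device/location/etc.)
rows = [
    {"segment": "desktop / Seattle", "impressions": 120, "position": 8},
    {"segment": "mobile / Chicago",  "impressions": 300, "position": 13},
    {"segment": "desktop / Boston",  "impressions": 80,  "position": 11},
]

print(round(average_position(rows), 1))  # -> 11.5
```

The point is that a single "position" is really an average over many different personalized result sets, which is why it can move around even when nothing changes on your page.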
You can also look into Moz Keyword Tracking and SEMRush.
Before you change things, check the resources you have to see if in fact you are really losing positions, or if it's just a fluctuation.
A couple of suggestions I do have for you:
Add Product Schema.org markup to your product pages (a minimal sketch follows after this list)
Check your sitemap and make sure it's properly submitted to Google and Bing Webmaster Tools
Check your URL structures and canonical tags
Check your information architecture
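On the Product Schema.org point, here's a minimal sketch of the kind of JSON-LD Product markup schema.org defines, generated with plain Python. Every value in it (name, URL, description, price) is a placeholder, so swap in your real product data and run the result through a structured data validator before relying on it:

```python
# Minimal sketch of schema.org Product markup expressed as JSON-LD.
# All values below (name, url, description, price) are placeholders.
import json

product_markup = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Foam Roller",
    "url": "https://www.example.com/foam-roller",
    "description": "High-density foam roller for therapy and massage.",
    "offers": {
        "@type": "Offer",
        "price": "24.99",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
    },
}

# Embed the printed JSON in the product page inside a
# <script type="application/ld+json"> ... </script> tag.
print(json.dumps(product_markup, indent=2))
```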
Lastly, check this KISSmetrics ecommerce SEO resource and this one from Search Engine Land. Hope this all helps! Good luck!
Related Questions
-
Help! Is this what is called "cloaking"?
Friend asked me to look at her website. Ran it through Screaming Frog and BAM, instead of the 4 pages I was expecting, it returned HUNDREDS. 99.9% of them are for cheap viagra and pharmaceuticals. I asked her if she was selling viagra, which is fine, I don't judge. But she swears she isn't. http://janeflahertyesq.com I ran it through Google with site:janeflahertyesq.com and sure enough, if you click on some of those, they take you to Canadian pharmacies selling half-priced blue pills. a) Is this cloaking? If not, what is going on? b) More importantly, how do we get rid of those hundreds of pages / get them de-indexed? She's stumped and scared. Any help would be greatly appreciated. Thank you all in advance and for the work you do.
White Hat / Black Hat SEO | TeamPandoraBeauty
-
Page plummeting with an optimisation score of 97. HELP
Hi everyone, One of my pages has an optimisation score of 93, but it ranks in position 50+. What on earth can I do to address this? It's a course page, so I've added the 'course' schema. I've added all the alt tags to include the keyword, UX signals aren't bad, the keyword is in the title tag, and it has a meta description. I added an extra 7 internal, anchor-rich links pointing at the page this week. Nothing seems to address it. Any ideas? Cheers, Rhys
White Hat / Black Hat SEO | SwanseaMedicine
-
Sudden shift in rankings?
What would make a website move from position 14-17 (for the last year or so) to position 1 for a very competitive keyword when there are no obvious site changes?
White Hat / Black Hat SEO | OUTsurance
-
HELP! My website has been penalized - what did I do wrong?
I have been working on a website, Zing.co.nz, and have made a subdomain, blog.zing.co.nz. The website is for a company that is yet to launch, so I have been boosting traffic by writing blog posts about the topic (loans) on the subdomain. I pushed some traffic to the actual website too. We were climbing the rankings for our brand name but have all of a sudden started to drop. The domain authority was something like 0.9 and has dropped to 0.3 (using SEO SpyGlass). The blog was somewhere similar, but has dropped to 0.0! Please help in any way you can. These changes have happened within the last 48 hours. Zing.co.nz Blog.zing.co.nz
White Hat / Black Hat SEO | Startupfactory
-
How Cloudflare might affect "rank juice" on numerous domains due to a limited IP range?
We have implemented quite a few large websites onto Cloudflare and have been very happy with our results. Since this has been successful so far, we have been considering putting some other companies on CL as well, but have some concerns due to the structure of their business and related websites. The companies run multiple networks of technology, review, news, and informational websites. All have good content (almost all unique to each website) and rankings currently, but if implemented on Cloudflare, they would be sharing DNS and most likely IPs with each other, raising a concern of Google reducing their link juice because it would be detected as if it were coming from the same server, as people used to do for their blog farms.
For example, they might be tasked to write an article on XYZ company's new product. A unique article would be generated for 5-10 websites, all with unique, informative, valid and relevant content for each domain, including links, be it direct or contextual, to the XYZ product or website URL. To clarify, so there is no confusion, each article is relevant to its website:
technology website - article about the engineering of the xyz product
business website - how the xyz product is affecting the market or stock price
howto website - how the xyz product is properly used
Currently all sites are on different IPs and servers due to their size, but if routed through Cloudflare, will Google simply detect this as duplicate linking efforts or some type of "black hat" effort since it's coming from Cloudflare? If yes, is there a way to prevent this while still using CL? If no, why, and how is this different than someone doing this to trick Google? Thank you in advance! I look forward to some informative answers.
White Hat / Black Hat SEO | MNoisy
-
HELP - Site architecture of E-Commerce Mega Menu - Linkjuice flow
Hi everyone, I hope you have a couple of minutes to give me your opinion. The ecommerce site has around 2,000 products, in English and Spanish, and only around 70 hits per day, if that. We have done a lot of optimisation on the site - page titles, URLs, content, H1s, etc. Everything on-page is pretty much under control, except I am starting to realise the site architecture could be harming our SEO efforts.
Once someone arrives on the site they are language-detected and 302-redirected to either domain.com/EN or domain.com/ES depending on their preferred language. Then on the homepage we have the big MEGA MENU, structured like:
CAT 1
  SubCat 1
    SubsubCat 1
    SubsubCat 2
    SubsubCat 3
Overall, there are 145 "categories", plus links to some CMS pages like Home, Delivery terms, etc. Each main category contains the products of everything related to that category - so for example:
KITCHENWARE
  COOKWARE
    SAUCEPANS
    FRYING PANS
  BAKINGWARE
    BOWLS
KITCHENWARE contains ALL PRODUCTS OF THE SUBCATS BELOW IT, SO COOKWARE ITEMS, SAUCEPANS, FRYING PANS, BAKINGWARE, etc., plus links to those categories through breadcrumbs and a left-hand nav in addition to the mega menu above. So once the bots hit the site, they immediately have this structure to deal with.
Here is what the stats look like:
Domain Authority: 18
www.domain.com/EN/ - PA: 27, mR: 3.99, mT: 4.90
www.domain.com/EN/CAT 1 - PA: 15, mR: 3.05, mT: 4.54
www.domain.com/EN/CAT 1/SUBCAT1 - PA: 15, mR: 3.05, mT: 4.54
Product pages themselves have a PA of 1 and no mR or mT.
I really need some other opinions here - I am thinking of:
1. Removing links in the nav menu so it only contains the CAT 1 and SubCat 1 levels, deleting the SubsubCat links, which represent around 80 links
2. Removing products from the CAT 1 page - e.g., the CAT 1 page would "tile" graphical links to its subcategories but not display the products themselves, so products are only available right at the lowest level of the chain (which will be shortened)
But I am willing to hear any other ideas please - maybe another alternative is to start building links to boost DA and link juice? Thanks all, Ben
White Hat / Black Hat SEO | bjs2010
-
Starting fresh on a new URL after a serious Penguin update down-ranking
Hi friends, My site www.acupunctureclinicvictoriabc.com was recently hit by the Penguin update and I dropped to page 5 of local searches for my keywords. A while back I had some bad link building done and am now paying for it :( I thought the disavow tool (used 4 months ago) would deal with this issue, but apparently not. The current URL is feeling like a lost cause. My question is, if I start fresh on a new URL, can I use my old content (or even clone the site and move it to a new URL) without being punished for duplicate content on the new site? Any recommendations for starting fresh? I really appreciate any thoughts on this matter, as I am feeling a bit lost and bummed about this issue. Thanks!
White Hat / Black Hat SEO | Silasrose
-
Would the same template landing page (placed on 50+ targeted domains) help or hurt my ranking?
Scenario: Company ABC has 50 related domains that are being forwarded to the main company URL. Q1: Would there be SEO value in creating a template landing page for each domain that includes product info, photos, and keyword links to the main URL? Q2: If all 50+ landing pages were the same, would that penalize the main site due to duplicate content?
White Hat / Black Hat SEO | brianmeert