Webpage has bombed outside of Top 50 for search term in one week. What's the cause?
-
I've been monitoring the performance of some pages via the weekly email Moz sends, and until this week two pages that I've managed to get ranking have ranked between 20 and 23 for the specific term. However, in today's email one of the pages has bombed out of the top 50 for one search term, while the other page has remained unaffected.
What could be the cause of this? I've checked Google Webmaster Tools for any indication of a penalty, but nothing is glaringly obvious: there are no messages, and I certainly haven't bought a load of spammy links.
What else could I check?
-
"two pages that I've managed to get ranking have ranked between 20 and 23 for the specific term. However, today on the email one of the pages for one search term has bombed out of the top 50 while the other page has remained unaffected."
If you have two pages ranking for the same search query, it's not uncommon for Google to decide that only one of them needs to be presented to the user. If both serve the same user intent, Google may consider them (semantically) duplicate content, even though the two pages are worded differently.
In my experience of having multiple pages ranking for the same keyword, the pages keep battling it out in the SERPs, bouncing up and down. One week there'll be a cluster of three ranking terribly; the next, one will shoot up while the other is nowhere to be seen. Personally, I've found that Google seems to prefer having only one page ranking for the term: it's an easier decision for Google to make, with no confusion over which page is more relevant to the query. By merging similar pages, I find the resulting page ends up stronger in Google, because it isn't competing in the SERPs with similar pages on your own website.
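The "battling it out" pattern above can be spotted in your own rank-tracking data. Here's a minimal sketch, using entirely hypothetical keywords, URLs, and weekly positions: it flags any keyword for which more than one of your URLs has appeared in the results, which is the cannibalization signature described.

```python
# Hypothetical rank-tracking data: weekly positions for your own pages,
# with None meaning the page fell outside the tracked results (top 50).

def detect_cannibalization(rank_history):
    """Flag keywords where more than one of your URLs has ranked,
    suggesting Google is alternating between near-duplicate pages."""
    flagged = {}
    for keyword, urls in rank_history.items():
        ranked_urls = [url for url, ranks in urls.items()
                       if any(r is not None for r in ranks)]
        if len(ranked_urls) > 1:
            flagged[keyword] = ranked_urls
    return flagged

history = {
    "blue widgets": {
        "/blue-widgets": [21, 23, None, 22],   # bounces in and out
        "/widgets/blue": [None, 20, 24, None],  # the competing page
    },
    "red widgets": {
        "/red-widgets": [15, 14, 16, 15],      # single stable page
    },
}

print(detect_cannibalization(history))
# -> {'blue widgets': ['/blue-widgets', '/widgets/blue']}
```

Keywords flagged this way are the candidates for merging into a single, stronger page.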
I hope that helps, even if it's only anecdotal evidence.
-
Hi Mick,
The first thing to do is always to verify the ranking change. Open an incognito window in Chrome, search for the term, and see whether the page has dropped there as well. Google fluctuates quite a bit, and ranking shifts are sometimes ephemeral and reverse on their own.
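If you want to verify the position programmatically rather than by eye, a minimal sketch looks like this. It assumes you already have an ordered list of result URLs from somewhere (a manual copy-paste, or whatever rank-checking tool you use); the domain and URLs below are hypothetical.

```python
from urllib.parse import urlparse

def rank_of(domain, result_urls):
    """Return the 1-based position of the first result hosted on `domain`
    (or a subdomain of it), or None if it is absent from the list."""
    for position, url in enumerate(result_urls, start=1):
        host = urlparse(url).netloc.lower()
        if host == domain or host.endswith("." + domain):
            return position
    return None

serp = [
    "https://competitor-a.com/page",
    "https://www.example.com/my-page",   # hypothetical: your page
    "https://competitor-b.com/page",
]
print(rank_of("example.com", serp))  # -> 2
```

A `None` result here corresponds to the "outside the top 50" report you saw in the email.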
Second, we do find there is far more fluctuation beyond page 1. Whether this is due to cruder metrics, a lack of stabilization by engagement metrics, or something else is unknown, but what is certain is that greater fluctuation seems to occur the deeper you get into the search results. I would not be highly concerned about this rankings loss at face value.
However, there are some things you can check.
- Have you lost any links pointing to this page recently?
- Have you made any substantive changes to the site, such as to the internal link structure?
- Have you introduced alternate content on your site that may now outrank this page?
These are just a few of the first things you can look at.
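The first check (lost links) is easy to automate if you keep periodic backlink exports: diff last month's list of linking pages against this month's. A rough sketch, with hypothetical filenames and URLs:

```python
def lost_links(previous_export, current_export):
    """Return linking pages present in the earlier backlink export
    but missing from the current one."""
    return sorted(set(previous_export) - set(current_export))

# Hypothetical linking URLs from two exports of your backlink data.
last_month = ["https://news-site.com/story", "https://blog.example.org/post"]
this_month = ["https://news-site.com/story"]

print(lost_links(last_month, this_month))
# -> ['https://blog.example.org/post']
```

Any URL that shows up here is worth visiting to see whether the link was removed, the page was deleted, or the whole site went offline.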
Good luck!