Page drops from index completely
-
We have a page that ranks organically at #1, but over the past couple of months it has twice dropped out of the results for a search term entirely. There don't appear to be any issues with the page in Search Console, and adding the page at https://www.google.com/webmasters/tools/submit-url seems to fix the issue.
The search term we're tracking appears in the page's URL and is the h1 of the page.
Here is a screenshot of the ranking over the past few months: https://jmp.sh/akvaKGF
What could cause this to happen? Nothing in Search Console shows any problems with the page. The last time this happened, the page dropped completely for all search terms and showed up again after we submitted the URL to Google manually. This time it dropped for just one search term and reappeared the next day after we manually submitted the page again.
-
I had a similar issue on a couple of WordPress freebie subdomains I made while conducting reputation management for clients. What ended up happening was that the site would index immediately and then, 24 hours later, be ghosted completely.
It turns out I was submitting the news sitemap that WordPress automatically generated, and since I wasn't on Google's list of approved news publishers, I guess it just ripped everything out. I'm sure the news sitemap and the regular one had the same pages listed, just with more detail on the news one.
I doubt it's the exact same occurrence, but if you recently submitted a sitemap, I'd check it closely, as it has been known to trigger a similar problem, at least for me!
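For anyone checking their own auto-generated sitemaps: a news sitemap is recognizable by the `news:` namespace on its entries. A minimal example entry looks roughly like this (the URL, publication name, and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
  <url>
    <loc>https://example.com/some-article/</loc>
    <news:news>
      <news:publication>
        <news:name>Example Times</news:name>
        <news:language>en</news:language>
      </news:publication>
      <news:publication_date>2015-03-22</news:publication_date>
      <news:title>Example headline</news:title>
    </news:news>
  </url>
</urlset>
```

If a sitemap your CMS submits contains tags like these and the site isn't an approved Google News publisher, that's the kind of file worth double-checking.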
-
Thanks, Nigel. Your responses are actually quite helpful pointers. There's a possibility that Google is flagging it as duplicate content, as the content on this page is perhaps a bit sparse. We have two posts: the first is a "What is this type of document you need" post, and the second is a link to a template for that document. The template one is the one that has dropped twice. Here is the search we're dropping from occasionally. Interestingly enough, Google is indexing the public Google Doc our page points to and including that in search results.
Excuse the bitly links, just trying to avoid the search terms showing up for others to find.
To answer your questions directly:
- Google seems to be respecting canonicals
- Page is in sitemap
- Perhaps too much repetition? Maybe we should expand the content a bit
- This may well have happened as we have seen a few sites "republish" some of our content.
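To make the canonical check above concrete: this is the tag that would need to appear in the page's `<head>` and that Google would need to honor (example.com stands in for the real template page):

```html
<link rel="canonical" href="https://example.com/document-template/" />
```

If other sites have republished the content, pointing their copies' canonicals here (or having none at all) is what determines whether Google treats your page or theirs as the original.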
-
Hi Russell
I would have to see the URL but it looks like a duplicate content problem. Have you recently written a blog post with a very similar title?
Is Google respecting your canonicals?
Is the page in your sitemap?
Is it over-optimised? Too much repetition of the main keyword?
Has someone stolen all of the content, creating cross-site duplication? There isn't a lot to go on, but I agree it's very unusual!
Regards Nigel
Related Questions
-
What to do with internal spam URLs Google indexed?
I have been in SEO for years but have never met this problem. I have a client whose web page was hacked, and hundreds of spam links were posted on it. These links have been indexed by Google. These links are not in comments but are normal external URLs (see picture). What is the best way to remove them? Use the Google disavow tool, or just redirect them to some page? The web page is new but ranks well on Google and has a domain authority of 24. I think these spam URLs improved rankings too 🙂 What would be the best strategy to solve this? Thanks.
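For reference on the disavow option mentioned above (note that disavow covers inbound links pointing at your site, not pages hosted on it): the file Google accepts is plain text with one entry per line, either a whole domain or an individual URL. The domains below are placeholders:

```text
# Links from these domains point at the hacked pages
domain:spammy-directory.example
# Individual URLs can also be listed one per line
http://link-farm.example/spam-page.html
```

For spam pages created on the site itself, the usual route is removing the pages so they return 404/410 and requesting removal in Search Console, rather than disavowing.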
White Hat / Black Hat SEO | AndrisZigurs
-
Google Algorithm non-manual penalty. How do we fix this (quality?) drop?
Hi, See attached image. We received a non-manual penalty on March 22, 2015, and I don't think we ever came out of it. We have moved up due to the Penguin update, but (by DA/PA) we should be up on the first page for tons of terms, and most keywords rank lower than their true strength. What kind of quality errors could be causing this? I assume it was a quality update. I am working on the errors but don't see anything that would be severe enough to be penalized. What errors/quality problems am I looking for? We have tons of unique content, good backlinks, good design, and a good user experience except for some products. Again, what am I looking for? Thanks.
White Hat / Black Hat SEO | BobGW
-
Cloaking for better user experience and deeper indexing - grey or black?
I'm working on a directory that has around 800 image-rich results in the top-level view. This will likely grow over time, so it needs to support thousands. The main issue is that it is built in Ajax, so paginated pages are dynamically generated and look like duplicate content to search engines. If we limit the results, then not all of the individual directory listing pages can be found. I have an idea that serves users and search engines what they want but uses cloaking. Is it grey or black? I've read http://moz.com/blog/white-hat-cloaking-it-exists-its-permitted-its-useful and none of the examples quite apply. To allow users to browse through the results (without a single page that has a slow load time), we include pagination links, but these are not shown to search engines. This is a positive user experience. For search engines, we display all results on a single page (since there is no limit on the number of links so long as they are not spammy). This requires cloaking, but it is ultimately serving the same content in slightly different ways.
1. Where on the scale of white to black is this?
2. Would you do this for a client's site?
3. Would you do it for your own site?
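To make the scheme being asked about concrete, here is a minimal, hypothetical sketch of the user-agent-based cloaking described (the function and variable names are mine, not from any real implementation, and crude UA sniffing like this is part of what makes the approach risky):

```python
# Hypothetical sketch of serving bots and humans different result sets.
# Illustration only: this is the cloaking pattern under discussion, not a
# recommendation.

BOT_TOKENS = ("googlebot", "bingbot")

def is_search_bot(user_agent: str) -> bool:
    """Crude user-agent sniffing, as described in the question."""
    ua = user_agent.lower()
    return any(token in ua for token in BOT_TOKENS)

def directory_page(user_agent: str, listings: list, page_size: int = 50) -> dict:
    """Return the full list to crawlers, a paginated slice to humans."""
    if is_search_bot(user_agent):
        # Crawlers see every listing on one page so all detail pages are found.
        return {"listings": listings, "show_pagination": False}
    # Human visitors get one fast-loading page plus pagination links.
    return {"listings": listings[:page_size], "show_pagination": True}
```

Note that Google is known to compare fetches made with and without the Googlebot user agent precisely to detect this kind of divergence, so detection by UA string is fragile as well as policy-risky.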
White Hat / Black Hat SEO | ServiceCrowd_AU
-
Sudden Drop in Keyword Ranking - No Idea Why
Hi Mozzers, I am in charge of everything web optimization for the company I work for, and I keep active track of our SEO/SEM practices, especially our keyword rankings. Prior to my arrival at the company in January of this year, we had a consultant handling the SEO work, and though they did a decent job of maintaining our rankings for a hefty set of keywords, they were unable to get a particular competitive keyword ranking. This is odd because other derivations of that keyword, which are equally competitive, are all still ranking on page one. Also, full disclosure: they were not engaging in any questionable linking. In fact, they didn't do much of any link building whatsoever. I also haven't been engaging in any questionable content creation or spammy linking. We put out content regularly, as we are a publicly traded company, and nothing spammy at all. Anyway, one thing I tried starting in February was a social media sharing campaign among friends and coworkers, who shared the respective page and keyword on their Facebook and Google+ pages. To my surprise, this tactic worked just like natural search usually does, slowly: through the months I watched the keyword go from completely invisible, to page 6, to page 3, to page 2, and finally to position 6 on page one as of just last week. Today, unfortunately, the keyword is invisible again :(. I am perplexed. It's tough to build links for our company, as we are in the public eye and everything we do has to be approved by someone higher up. I also checked our Webmaster Tools and haven't seen any notifications that could give me a clue as to what's going on. I am aware that there was a Penguin update recently and there are monthly Panda updates, but I'm skeptical as to whether those updates are correlated with this because, at first glance, our traffic and rankings for other keywords and pages don't seem to be affected. Suggestions? Advice? Answers? Thanks!
White Hat / Black Hat SEO | CSawatzky
-
How do you optimize a page with Syndicated Content?
Content is syndicated legally (licensed). My questions are: What is the best way to approach this situation? Is there any chance to compete with the original site/page for the same keywords? Is it okay to do so? Will there be any negative SEO impact on my site?
White Hat / Black Hat SEO | StickyRiceSEO
-
Shadow Pages for Flash Content
Hello. I am curious to better understand what I've been told are "shadow pages" for Flash experiences. For example, go here: http://instoresnow.walmart.com/Kraft.aspx#/home and view the page as Googlebot: you'll see an HTML page that is completely different from the Flash page.
1. Is this OK?
2. If I make my shadow page mirror the Flash page, can I put links in it that lead the user to the same places that the Flash experience does?
3. Can I put "Pinterest" pin-able images in my shadow page?
4. Can I create a shadow page for a video that has the transcript in it? Is this the same as closed captioning?
Thanks so much in advance, -GoogleCrush
White Hat / Black Hat SEO | mozcrush
-
Doorway Page? or just a flawed idea?
I have a website on a .co.uk TLD that is primarily focused on the UK. Understandably, I get very little in the way of US traffic, even though a lot of the content is applicable to the US as well as the UK, and could be made more so with a little tinkering. The domain has some age to it and ranks quite well for a variety of keywords and phrases, so it seems sensible to keep the site on this domain. The .com version of the domain is no longer available, and the current owner does not seem inclined to sell it to me. So, I am considering registering a very similar .com domain and simply using it to drive some traffic to the .co.uk site. To do this, I would have the same category pages and the same (or similar) list of links to the various pages in those categories, but instead of linking to a page on the new .com, each link would take visitors to the existing page on the .co.uk. I would make this transparent to visitors ("Take a look at these pages on our sister site bluewidgets.co.uk"), and the .com would have some unique content of its own. Would this be considered some kind of doorway site/page (content-rich doorway), or is it simply a bad idea that is unlikely to drive any traffic?
White Hat / Black Hat SEO | Jingo01
-
A domain is ranking for a plural keyword on page 1 of the SERPs but not at all for the singular?
What could be the reasons that a domain ranks for the plural version of a keyword on page 1 of the SERPs but not at all for the singular version? Google knows that both keywords belong together, as in the SERPs for one version the other version of the keyword is also highlighted. If I search with the plural keyword, the domain shows up on the first page of the SERPs, but if I search for the same keyword in the singular (in German this just means removing an "s"), I see the plural version highlighted many times but cannot find my domain. What could be the reason for this behavior? Penalties?
White Hat / Black Hat SEO | SimCaffe