Google de-indexed a page on my site
-
I have a site which is around 9 months old. For most search terms we rank fine (including top 3 rankings for competitive terms).
Recently one of our pages has been fluctuating wildly in the rankings and has now disappeared altogether for over a week.
As a test I added a similar page to one of my other sites and it ranks fine.
I've checked Webmaster Tools and there is nothing of note there.
I'm not really sure what to do at this stage. Any advice would be much appreciated!
-
Another thing which is weird:
If I google "site:medexpress.co.uk" the page /clinics/erectile-dysfunction/viagra appears.
If I google "site:medexpress.co.uk viagra" that page no longer appears!
I could see why my page might have dropped in the rankings, but to go from ranking around 30-70 for most of this page's keywords to dropping off the index entirely points to something more extraordinary. It's been like this for almost two weeks and I'm considering changing the URL (as I mentioned, other long-tail optimized blog pages are now starting to rank, so it appears to be just this page).
Is it possible this page has been moved to the supplemental index for some reason?
-
Hi,
The domain actually has plenty of links, and that page has two very high-quality backlinks. The Moz tool is for some reason missing almost all of the backlinks; if you use Ahrefs it works much better (remembering to use www as the prefix for the domain).
The domain authority is 49, which is much higher than the bulk of sites ranking there and higher than the site in position #1, which has somehow managed to rank with almost no backlinks (it has fewer than 20) and no real quality content.
What's even weirder: I added a page to my other site, www.centraltravelclinic.co.uk, and it now outranks my main site even though it has fewer backlinks and less relevant content.
There is even a page in my blog section which is now ranking as well.
I'm not sure, but it seems as if there is some penalty against this one page... or something has gone wrong in the search index and it has been temporarily removed for whatever reason.
Regards,
Dwayne
-
Hi Everett,
I agree 100%. The domain itself has a great name, but the lack of domain authority and links has undoubtedly taken its toll; to be honest, I'm surprised it ranks for anything without any authority.
A content marketing campaign with high-quality content covering the topics you discussed, like law and medical care, and including tools for the end user, would make it extremely brand-worthy.
I think the best way to go about it would be to start a six-month campaign and do some very aggressive on-site work.
You have some great advice.
Sincerely,
Thomas
-
Thank you Thomas,
I did have a look at the site and came to the conclusion that it had virtually no domain authority or page/URL-level authority from quality backlinks. The page had no links, and the domain only had a couple, neither of them great. As you mention, this industry is very competitive. It is one of the most competitive niches online.
My recommendation was to embark on a content marketing campaign, possibly one that highlights medical care / law / coverage disparities between countries. The domain itself is very brandable, and if they were to latch onto a hot-topic issue in the right way they could get quoted as experts in that industry by major news outlets, or at least some decent blogs. Meanwhile, steady posting about the industry on the blog, and occasional white papers or studies created by the company about the industry, will help build their body of content and present their expertise on the topic. Any tools (calculators, price comparison charts, etc.) would also help. I also mentioned a recommended minimum budget for content creation and paid promotion of it over the next six months.
-
Hi Everett,
There is a good reason for him not actually posting the URL or the keyword list he wants to rank for.
I think Deelo555 should private message you the information he sent me. All I will say is that they are keywords that are commonly thought of negatively.
Deelo555 please send Everett what you sent me.
All the best,
Thomas
-
Deelo555, we'll need more information to be of much help here. Can you post the site and/or the keyword? If not, this is probably going to have to be handled by private message with someone.
I'd look for over-optimized anchor text or spammy links going into that page just to make sure. From there I would look into whether the content on the page "should" rank for the keyword. In other words, does it answer the query or give the searcher what they are looking for? After that I would move on to general user experience. Does the site look good, trustworthy, authoritative...
Follow that path and you'll probably uncover some things to improve. If not, let us know.
-
I replied to the private message. I hope it is helpful.
-
Fluctuation can sometimes mean Google is testing your site against the other competitors for that keyword.
If Google is unable to determine which result end users prefer, it will allow your position to fluctuate in the SERPs.
If people query your keyword, find your site in the SERPs, and then immediately leave once they have landed on your website, that is called "pogo-sticking". Google is simply trying to find out whether or not you are the best fit for that keyword.
First off, test the site with some of the tools below:
https://moz.com/researchtools/crawl-test
http://deepcrawl.co.uk/ (Amazing tool)
http://www.screamingfrog.co.uk/seo-spider/ (everyone should have a copy; I recommend the paid version, but you can crawl 500 pages for free using the free edition)
Next, you want to see what Googlebot sees when it crawls your site; you can do that using:
http://www.feedthebot.com/tools/
Check your robots.txt using:
http://www.internetmarketingninjas.com/seo-tools/robots-txt-generator/ (this will pull the file; sometimes there is more than one, it does happen)
Also make sure that, if you're using WordPress or a similar CMS, Googlebot is not being blocked from seeing jQuery.
The reason for this is that if Google cannot see the site as it would be shown to the end user, it is less likely to come back and index it.
https://www.authoritydev.com/dont-block-jquery-from-googlebot/
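To make the robots.txt and noindex checks concrete, here is a minimal Python sketch (standard library only). The URL is just the page mentioned earlier in this thread and is only an example; the meta-tag test is a crude string match, so treat the output as a starting point rather than a verdict.

    import urllib.request
    import urllib.robotparser

    page_url = "https://www.medexpress.co.uk/clinics/erectile-dysfunction/viagra"  # example page from this thread

    # 1. Is Googlebot allowed to crawl the page at all?
    rp = urllib.robotparser.RobotFileParser("https://www.medexpress.co.uk/robots.txt")
    rp.read()
    print("Googlebot allowed by robots.txt:", rp.can_fetch("Googlebot", page_url))

    # 2. Does the page return 200, and does it carry a noindex in the headers or the HTML?
    req = urllib.request.Request(page_url, headers={"User-Agent": "Mozilla/5.0"})
    with urllib.request.urlopen(req) as resp:
        body = resp.read().decode("utf-8", errors="ignore").lower()
        print("HTTP status:", resp.status)
        print("X-Robots-Tag header:", resp.headers.get("X-Robots-Tag"))
        print("'noindex' appears in the HTML:", "noindex" in body)  # crude check; inspect the meta robots tag by hand too

If either check flags something, that alone can explain one page vanishing from the index while the rest of the site ranks fine.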
Once we have eliminated any chance of a robots.txt or nofollow/noindex issue, then you will want to look at your entire website very thoroughly.
I do not know how old your domain is; could you share its age?
What is your domain authority? What is the page authority of the pages not being indexed?
We need to look at all of this before we can render a verdict.
If you're uncomfortable sharing your URL publicly, you're welcome to send it to me privately.
What URLs (if any), internal or external, are pointing at the pages that are not being indexed?
Sincerely,
Thomas
-
Related Questions
-
Difficulty with Indexing Pages - Desperate for Help!
I have a website with product pages that use the same URL but load different data based on what's passed to them with GET. It is a WordPress site, but all of the page information is retrieved from a database and displayed with PHP. Somehow these pages are not being indexed by Google. I have done the following:
1. Created a sitemap pointing to each page.
2. Defined URL parameters in Search Console for these types of pages.
3. Created a product schema using schema.org and tested it without errors.
I have requested re-indexing repeatedly, and these pages and the images on them are still not being indexed! Does anybody have any suggestions?
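One quick way to sanity-check a setup like this (a rough sketch; the sitemap URL is a placeholder): fetch every URL listed in the sitemap and report what each one actually serves. Parameterised URLs that all return the same title, or that all canonicalise to one bare URL, will generally be folded together rather than indexed separately.

    import re
    import urllib.request
    import xml.etree.ElementTree as ET

    sitemap_url = "https://example.com/sitemap.xml"  # placeholder: use the sitemap submitted to Search Console
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

    with urllib.request.urlopen(sitemap_url) as resp:
        root = ET.fromstring(resp.read())

    for loc in root.findall(".//sm:loc", ns):
        url = loc.text.strip()
        with urllib.request.urlopen(url) as page:
            html = page.read().decode("utf-8", errors="ignore")
        # Crude regex checks, but enough to spot pages that all serve the same title or canonical.
        title = re.search(r"<title>(.*?)</title>", html, re.I | re.S)
        canonical = re.search(r'rel=["\']canonical["\'][^>]*href=["\']([^"\']+)', html, re.I)
        print(url,
              "| title:", title.group(1).strip() if title else "none",
              "| canonical:", canonical.group(1) if canonical else "none")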
Intermediate & Advanced SEO | jacleaves
-
Pages are being dropped from index after a few days - AngularJS site serving "_escaped_fragment_"
My URL is: https://plentific.com/
Hi guys,
About us: We are running an AngularJS SPA for property search. Being an SPA and an entirely JavaScript application has proven to be an SEO nightmare, as you can imagine. We are currently implementing the "_escaped_fragment_" approach and serving a pre-rendered version using PhantomJS. Unfortunately, pre-rendering of the pages takes some time and, even worse, on separate occasions the pre-rendering fails and the page appears to be empty.
The problem: When I manually submit pages to Google using the Fetch as Google tool, they get indexed and actually rank quite well for a few days, and after that they just get dropped from the index. Not getting lower in the rankings, but totally dropped. Even the Google cache returns a 404.
The questions:
1.) Could this be because we are serving an "_escaped_fragment_" version to the bots (bear in mind it is identical to the user-visible one)?
2.) Could it be that using an API to get our results leads to the content being considered "duplicate content"? And shouldn't that just result in a lower SERP position instead of a drop?
3.) Could this be a technical problem with how we serve the content, or does Google just not trust sites served this way?
Thank you very much!
Pavel Velinov, SEO at Plentific.com
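A quick way to catch the failing pre-renders described here (a rough sketch; the URL list is illustrative and would normally come from the sitemap): request the ?_escaped_fragment_= snapshot of each page, since that is what the crawler indexes, and flag any response that comes back empty or suspiciously small.

    import urllib.error
    import urllib.request

    urls = ["https://plentific.com/"]  # example from this question; add the pages that were dropped

    for url in urls:
        snapshot_url = url + ("&" if "?" in url else "?") + "_escaped_fragment_="
        req = urllib.request.Request(snapshot_url, headers={"User-Agent": "Googlebot"})
        try:
            with urllib.request.urlopen(req) as resp:
                size = len(resp.read())
            # 2000 bytes is an arbitrary threshold; tune it to the smallest real page on the site.
            print(snapshot_url, resp.status, size, "bytes", "<- suspiciously small" if size < 2000 else "")
        except urllib.error.HTTPError as e:
            print(snapshot_url, "HTTP error", e.code)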
Intermediate & Advanced SEO | emre.kazan
-
Index or not index Categories
We are using the Yoast SEO plugin. In the main menu we have only categories, which consist of posts and one page: we have a category with villas, a category with villa hotels, etc. Initially we set the posts to be indexed and included in the sitemap and excluded the categories, but I guess that was not correct. Would it be better to index and include the categories in the sitemap and exclude the posts, in order to avoid duplication? It somehow does not make sense to me: if the posts are excluded and the categories included, won't the categories then be empty for Google? I am going crazy over this. Does somebody perhaps have more experience with this?
Intermediate & Advanced SEO | Rebeca1
-
How is Google crawling and indexing this directory listing?
We have three Directory Listing pages that are being indexed by Google:
http://www.ccisolutions.com/StoreFront/jsp/
http://www.ccisolutions.com/StoreFront/jsp/html/
http://www.ccisolutions.com/StoreFront/jsp/pdf/
How and why is Googlebot crawling and indexing these pages? Nothing else links to them (although /jsp/html/ and /jsp/pdf/ both link back to /jsp/). They aren't disallowed in our robots.txt file, and I understand that this could be why.
If we add them to our robots.txt file and disallow them, will this prevent Googlebot from crawling and indexing those Directory Listing pages without prohibiting it from crawling and indexing the content that resides there, which is used to populate pages on our site?
Having these pages indexed in Google is causing a myriad of issues, not the least of which is duplicate content. For example, the file CCI-SALES-STAFF.HTML (which appears on the Directory Listing referenced above, http://www.ccisolutions.com/StoreFront/jsp/html/) clicks through to this web page: http://www.ccisolutions.com/StoreFront/jsp/html/CCI-SALES-STAFF.HTML
This page is indexed in Google and we don't want it to be. But so is the actual page where we intended the content contained in that file to display: http://www.ccisolutions.com/StoreFront/category/meet-our-sales-staff
As you can see, this results in duplicate content problems. Is there a way to disallow Googlebot from crawling that Directory Listing page and, provided that we have http://www.ccisolutions.com/StoreFront/category/meet-our-sales-staff in our sitemap, solve the duplicate content issue as a result? For example:
Disallow: /StoreFront/jsp/
Disallow: /StoreFront/jsp/html/
Disallow: /StoreFront/jsp/pdf/
Can we do this without risking blocking Googlebot from content we do want crawled and indexed? Many thanks in advance for any and all help on this one!
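One way to test rules like these before deploying them (a hedged sketch using Python's standard-library robots.txt parser): feed the proposed directives to the parser and confirm they block the JSP directory listings without blocking the category page you want indexed. Bear in mind that a robots.txt disallow stops crawling going forward, but URLs already in the index may need a noindex or a removal request before they drop out.

    import urllib.robotparser

    # The Disallow rules proposed in the question, as they would appear under User-agent: *
    proposed_rules = [
        "User-agent: *",
        "Disallow: /StoreFront/jsp/",
        "Disallow: /StoreFront/jsp/html/",
        "Disallow: /StoreFront/jsp/pdf/",
    ]

    rp = urllib.robotparser.RobotFileParser()
    rp.parse(proposed_rules)

    tests = [
        "http://www.ccisolutions.com/StoreFront/jsp/html/CCI-SALES-STAFF.HTML",  # should be blocked
        "http://www.ccisolutions.com/StoreFront/category/meet-our-sales-staff",  # should stay crawlable
    ]
    for url in tests:
        print(url, "-> crawlable by Googlebot:", rp.can_fetch("Googlebot", url))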
Intermediate & Advanced SEO | danatanseo
-
More Indexed Pages than URLs on site.
According to Webmaster Tools, the number of pages indexed by Google on my site jumped yesterday (from 150K to 450K). Usually I would be jumping for joy, but now I have more indexed pages than actual pages on my site. I have checked for duplicate URLs pointing to the same product page but can't see any; pagination in category pages doesn't seem to be indexed, nor does parameterisation in URLs from advanced filtration. Using the site: operator we get a different result on google.com (450K) to google.co.uk (150K). Anyone got any ideas?
Intermediate & Advanced SEO | DavidLenehan
-
De-indexed by Google!?
So it looks as though the content from myprgenie.com is no longer being indexed. Anyone know what happened and what they can do to fix it fast?
Intermediate & Advanced SEO | siteoptimized
-
Is there any delay between Google crawling a page and the page's ratings appearing in the rich snippet of Google's results?
Intermediate & Advanced SEO | NEWCRAFT
-
Our site is receiving traffic for both .com/page and .com/page/ with the trailing slash.
Our site is receiving traffic for both .com/page and .com/page/ with the trailing slash. Should we rewrite everything to one form (with or without the trailing slash) because of the duplicates? The other question is: if we do a rewrite, Google has indexed some pages with the slash and some without, so I am assuming we will lose rank for one of them once we do the rewrite, correct?
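Before deciding which form to keep, it can help to see how the server currently treats each variant. Below is a rough sketch (the domain and paths are placeholders); whichever variant already returns 200 and holds most of the indexed URLs is usually the one to standardize on, with a 301 redirect from the other, and 301-redirected pages normally pass their equity along rather than simply losing their rankings outright.

    import urllib.error
    import urllib.request

    base = "https://example.com"        # placeholder domain
    paths = ["/page", "/another-page"]  # placeholder paths

    for path in paths:
        for variant in (path.rstrip("/"), path.rstrip("/") + "/"):
            try:
                with urllib.request.urlopen(base + variant) as resp:
                    # geturl() reflects any redirect, so a differing final URL shows which form the server prefers.
                    print(variant, "->", resp.status, resp.geturl())
            except urllib.error.HTTPError as e:
                print(variant, "-> HTTP error", e.code)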
Intermediate & Advanced SEO | Profero