Google de-indexed a page on my site
-
I have a site which is around 9 months old. For most search terms we rank fine (including top 3 rankings for competitive terms).
Recently one of our pages has been fluctuating wildly in the rankings and has now disappeared altogether from the rankings for over 1 week.
As a test I added a similar page to one of my other sites and it ranks fine.
I've checked webmaster tools and there is nothing of note there.
I'm not really sure what to do at this stage. Any advice would be much appreciated!
-
Another thing which is weird:
If I google "site:medexpress.co.uk" the page /clinics/erectile-dysfunction/viagra appears.
If I google "site:medexpress.co.uk viagra" that page no longer appears!
I could see why my page might have dropped in the rankings, but to go from ranking around 30-70 for most of this page's keywords to dropping off the index entirely points to something more extraordinary. It's been like this for almost two weeks, and I'm considering changing the URL (as I mentioned, other long-tail optimized blog pages are now starting to rank, so it appears to be just this page).
Is it possible this page has been moved to the supplemental index for some reason?
-
Hi,
The domain actually has plenty of links, and that page has two very high-quality backlinks. The Moz tool for some reason is missing almost all of the backlinks; if you use Ahrefs it works much better (remember to use www as the prefix for the domain).
The domain authority is 49, which is much higher than the bulk of sites ranking there, and higher than the site in position #1, which has somehow managed to rank with almost no backlinks (fewer than 20) and no quality content.
What's even weirder: I added a page to my other site www.centraltravelclinic.co.uk and it now outranks my main site, even though it has fewer backlinks and less relevant content.
There is even a page in my blog section which is now ranking as well.
I'm not sure, but it seems as if there is some penalty against this one page... or something has gone wrong in the search index and the page has been temporarily removed for whatever reason.
Regards,
Dwayne
-
Hi Everett,
I agree 100%. The domain itself has a great name, but the lack of domain authority and links has undoubtedly taken its toll; to be honest, I'm surprised it ranks for anything without any authority.
A content marketing campaign with high-quality content covering the topics he discussed, like law and medical care, and including tools for the end user, would make it extremely brand-worthy.
I think the best way to go about it would be to start a six-month campaign and do some very aggressive on-site work.
You have some great advice.
Sincerely,
Thomas
-
Thank you Thomas,
I did have a look at the site and came to the conclusion that it had virtually no domain authority or page/URL-level authority from quality backlinks. The page had no links, and the domain only had a couple, neither of them great. As you mention, this industry is very competitive. It is one of the most competitive niches online.
My recommendation was to embark on a content marketing campaign, possibly one that highlights medical care / law / coverage disparities between countries. The domain itself is very brandable, and if they were to latch onto a hot-topic issue in the right way they could get quoted as experts in that industry by major news outlets, or at least by some decent blogs. Meanwhile, steady posting about the industry on the blog, and occasional white papers or studies created by the company about the industry, will help build their body of content and present their expertise on the topic. Any tools (calculators, price comparison charts, etc.) would also help. I also mentioned a recommended minimum budget for content creation and paid promotion of it over the next six months.
-
Hi Everett,
There is a good reason he isn't actually posting the URL or the keyword list he wants to rank for.
I think Deelo555 should private message you the information he sent me. All I will say is that the keywords are commonly viewed negatively.
Deelo555 please send Everett what you sent me.
All the best,
Thomas
-
Deelo555, we'll need more information to be of much help here. Can you post the site and/or the keyword? If not, this is probably going to have to be handled by private message with someone.
I'd look for over-optimized anchor text or spammy links going into that page just to make sure. From there I would look into whether the content on the page "should" rank for the keyword. In other words, does it answer the query or give the searcher what they are looking for? After that I would move on to general user experience. Does the site look good, trustworthy, authoritative...
Follow that path and you'll probably uncover some things to improve. If not, let us know.
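To make the first check above concrete, here is a minimal sketch of an anchor-text distribution check. The anchors below are hypothetical stand-ins; in practice you would export them from a tool like Ahrefs or Moz Link Explorer, and the 30% threshold is just an illustrative rule of thumb, not an official cutoff:

```python
# Sketch: flag possibly over-optimized anchor text in a backlink profile.
# Anchors and the 30% threshold are hypothetical examples.
from collections import Counter

anchors = [
    "buy viagra online", "buy viagra online", "buy viagra online",
    "medexpress", "www.medexpress.co.uk", "click here",
    "buy viagra online", "erectile dysfunction treatment",
]

counts = Counter(anchors)
total = len(anchors)

# Print each anchor's share of the profile, flagging heavy repetition.
for anchor, n in counts.most_common():
    share = n / total
    flag = "  <-- over-optimized?" if share > 0.3 else ""
    print(f"{share:5.1%}  {anchor}{flag}")
```

If one exact-match commercial phrase dominates the profile like this, that page is worth a closer look.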
-
I replied to the private message; I hope it is helpful.
-
Fluctuation can sometimes mean Google is testing your site against the other competitors for that keyword.
If Google is unable to determine which result end users prefer, it will allow your rankings to fluctuate in the SERPs.
If people query your keyword, find your site in the SERPs, and then immediately leave once they have landed on your website, that is called "pogo-sticking." Google is simply trying to find out whether or not you are the best fit for that keyword.
First off, try testing the site with some of the tools below:
https://moz.com/researchtools/crawl-test
http://deepcrawl.co.uk/ (Amazing tool)
http://www.screamingfrog.co.uk/seo-spider/ (everyone should have a copy; I recommend the paid version, but you can crawl 500 pages for free using the SF free edition)
You want to see what Googlebot can see on your site; you can do that using
http://www.feedthebot.com/tools/
Check your robots.txt using
http://www.internetmarketingninjas.com/seo-tools/robots-txt-generator/ (this will pull the file; sometimes there is more than one, it does happen)
Make sure that if you're using WordPress or another similar CMS, Googlebot is not being blocked from seeing jQuery.
The reason for this is that Google cannot see the site as it would be shown to the end user, and would therefore be less likely to come back and index it.
https://www.authoritydev.com/dont-block-jquery-from-googlebot/
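You can also test robots.txt rules against Googlebot offline with Python's standard-library `urllib.robotparser`. A minimal sketch, using hypothetical rules rather than any real site's file:

```python
# Sketch: check whether Googlebot is allowed to crawl specific URLs,
# given the contents of a robots.txt file. These rules are hypothetical.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: Googlebot
Disallow: /wp-admin/
Disallow: /assets/js/

User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# A blocked script directory like this would hide jQuery from Googlebot:
print(rp.can_fetch("Googlebot", "/assets/js/jquery.min.js"))  # False
print(rp.can_fetch("Googlebot", "/clinics/some-page"))        # True
```

Swap in your own robots.txt contents and the paths of the pages that dropped out to rule this cause in or out quickly.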
Once we have completely eliminated any chance of a robots.txt or nofollow/noindex issue, you will want to look at your entire website very thoroughly.
I do not know how old your domain is; could you share its age?
What is your domain authority? What is the page authority of the pages not being indexed?
We need to look at all of this before we can render a verdict.
If you're uncomfortable sharing your URL publicly, you're welcome to send it to me privately.
What links (if any), internal or external, are pointing at the pages that are not indexing?
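One more quick check along these lines: confirm the page itself isn't carrying a robots meta noindex. A minimal sketch using Python's standard-library HTML parser; the HTML string is a stand-in for the live page (on the real site, also check the X-Robots-Tag HTTP header):

```python
# Sketch: scan a page's HTML for robots/googlebot meta directives that
# would keep it out of the index. The sample HTML is hypothetical.
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and (a.get("name") or "").lower() in ("robots", "googlebot"):
            # Collect comma-separated directives such as noindex, nofollow.
            self.directives += [d.strip().lower()
                                for d in (a.get("content") or "").split(",")]

html = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
p = RobotsMetaParser()
p.feed(html)
print("noindex" in p.directives)  # True
```

A stray noindex from a CMS plugin or staging setting is one of the more common causes of a single page silently vanishing.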
Sincerely,
Thomas