Homepage was removed from Google and got deranked
-
Hello experts
I have a problem: the main page of my site was severely deranked, and now I am not sure how to get the ranking back.
It started when I accidentally canonicalized the main page "https://kv16.dk" to a page that did not exist.
4 months later the page got deranked, and the main page no longer appeared in the search results at all, not even when searching for "kv16.dk".
Then we discovered the canonicalization mistake, fixed it, and got the main page back in the search results when searching for "kv16.dk".
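A quick way to audit this kind of mistake is to check what the canonical tag on a page actually points to. Here is a minimal stdlib-only Python sketch; the example HTML mirrors the situation described above and is purely illustrative, not taken from the live site:

```python
# Hedged sketch (not from the thread): extract the rel="canonical"
# URL from a page's HTML using only the standard library, so a
# canonical pointing at a non-existent page is easy to spot.
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

def find_canonical(html: str):
    """Return the canonical URL declared in the HTML, or None."""
    parser = CanonicalFinder()
    parser.feed(html)
    return parser.canonical

# Illustrative HTML mirroring the mistake described above.
page = '<html><head><link rel="canonical" href="https://kv16.dk/does-not-exist"></head></html>'
print(find_canonical(page))  # https://kv16.dk/does-not-exist
```

Comparing the extracted URL against the page's own URL (and fetching it to confirm it resolves) would have surfaced the bad canonical immediately.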
At first, some weeks passed after we made the correction and the ranking didn't improve. Google Search Console recommended uploading a sitemap, so we did that. However, this sitemap contained a lot of thin-content pages: one for every WordPress attachment, e.g. for every image in an article. More exactly, there were 91 of these attachment pages, while the rest of the site consists of only two pages, the main page and an extra landing page.
After that, Google began returning the attachment URLs in some searches. We tried fixing it by redirecting all the attachment pages to the underlying file, e.g. for an image attachment page we redirected straight to the image itself.
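For reference, a redirect like that can be a one-liner on an Apache host (an assumption; the slug and upload path below are hypothetical placeholders, not the site's real paths):

```apache
# Hypothetical example: send an attachment page straight to its
# image file with a permanent redirect (mod_alias).
Redirect 301 /birksoe/ /wp-content/uploads/birksoe.jpg
```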
Google has not yet removed these attachment pages, so the question is: do you think it will help to remove the attachments via Google Search Console, or will that not help at all?
For example, when we search "kv16", an attachment URL named "birksø" is one of the first results.
-
Hi Everett
First of all, I am sorry for the late reply; I was on vacation for the last 7 days.
Thank you for your reply. I think you might be right about the "sandbox" thing. The page had a good position in the Google search results, but then we made a mistake and canonicalized it to a non-existent page for 4 months. It could be that Google considers it a "new page", even though it had been indexed for a year.
I appreciate your efforts, and I will wait some time to see if it improves by itself; otherwise I will have to work some more on improving the content of the site.
-
Hi Christian,
I don't see any evidence of the site being deindexed now. Here are some things I checked for you, along with a few observations:
-
Nothing in the robots.txt file, robots meta tag, or X-Robots-Tag HTTP header that would keep these pages from being indexed by Google
-
The rel="canonical" tags appear to be functioning properly
-
The home page is indexed and not duplicated by other indexed pages
-
Google has about 86 pages indexed from your domain
-
Hreflang tags appear to be implemented properly
-
There are only about 50 links going into the domain from other sites, and the ones from Moz are the best of the few that aren't just random scraper sites (harmless, but annoying).
Sometimes Google ranks a brand higher when it first comes out because it's a chicken-or-egg situation: how else can they collect data for their machine to chew on unless some traffic is sent to a new site? We used to call this phenomenon "the Google sandbox" a long time ago, and this is essentially (in its effect) the same thing. We do it ourselves with A/B testing and paid advertising: you have to spend some budget to gain enough data to know what's working and what isn't.
I don't think you have a technical SEO problem here. I think you need to continue building a brand and producing useful, rich content. Good luck!
-
-
Hi Ross
Thank you a lot for all the recommendations, I will get those things done and get back on here with the result.
Though currently I have the Yoast SEO plugin and have generated a sitemap with only the pages we want ranking, which I have already uploaded to GSC, but I will make sure there is a link in the footer as well.
I also did a 301 redirect on all these "attachment" pages, but I will change that to a 410.
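If the host runs Apache (an assumption; the slug below is a hypothetical placeholder), switching one of those redirects to a 410 can be sketched like this:

```apache
# "gone" makes Apache answer 410 Gone for the old attachment URL
# instead of redirecting it (mod_alias).
Redirect gone /birksoe/
```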
-
Hi Christian,
Try to update your home page by adding additional content or rewriting the existing content. Do not forget to request reindexing of the home page after your update. In addition, you should install the Rank Math SEO plugin and regenerate the sitemap. Once you have a new sitemap, resubmit it to GSC. Keep only page URLs in the sitemap, and no images. If you want to remove some of your pages from the index, set those pages to return a 410 Gone status instead of a 404. Also, I do not see a link to your sitemap on your home page; you should add a link to the sitemap in your footer.
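As a sketch of that sitemap advice, a sitemap limited to the site's two real pages would look roughly like this (the landing-page path is a placeholder, since the thread doesn't name it):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- only real pages; no attachment or image URLs -->
  <url><loc>https://kv16.dk/</loc></url>
  <url><loc>https://kv16.dk/landing-page/</loc></url>
</urlset>
```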
Ross
Related Questions
-
Not all images indexed in Google
Hi all, recently we got an unusual issue with images in the Google index. We have more than 1,500 images in our sitemap, but according to Search Console only 273 of those are indexed. If I check Google image search directly, I find more images in the index, but still not all of them. For example, this post has 28 images and only 17 are indexed in Google Images. This is happening to other posts as well. We checked all possible reasons (missing alt, image as background, file size, fetch and render in Search Console), but none of these are relevant in our case. So everything looks fine, but not all images are in the index. Any ideas on this issue? Your feedback is much appreciated, thanks
Technical SEO | flo_seo1
Removing indexed website
I had a .in TLD version of my .com website live for about 15 days, which was a duplicate copy of the .com website. I did not wish to use the .in further, for SEO duplication reasons, and let the .in domain expire on 26th April. But still now, when I search for my website, the .in version also shows up in the results, and even in Google Webmaster Tools it shows the website with the maximum (190) number of links to my .com website. I am sure this is hurting the ranking of my .com website. How can the .in website be removed from Google's indexing and search results, given that it has also expired? Thanks
Technical SEO | geekwik0
Why is my site not indexing in Google?
In Google Webmaster Tools I updated my sitemap on Mar 6th. There are around 22,000 links, but Google fetched only 5,300 links for a long time. I waited for a month with no improvement in the Google index, so on Apr 6th we uploaded a new sitemap (1,200 links in total), but only 4 links were indexed in Google. Why is Google not indexing my URLs? Does this affect our ranking in the SERPs? How many links is it advisable to submit in a sitemap for a website?
Technical SEO | Rajesh.Chandran
How do I remove Links to my website???
Hi guys, please can anyone help!! Can anyone tell me how on earth I can remove links to my website? My website has been hit by the new Penguin update, and the company that was doing my SEO seems to have built a lot of spammy links!! How can I remove these links??? Please can anyone help. Thanks Gareth
Technical SEO | GAZ090
Google (GWT) says my homepage and posts are blocked by robots.txt
Hi guys.. I have a very annoying issue. My WordPress blog over at www.Trovatten.com has some indexation problems. Google Webmaster Tools says the following: "Sitemap contains urls which are blocked by robots.txt." and shows me my homepage and my blog posts. This is my robots.txt: http://www.trovatten.com/robots.txt
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Do you have any idea why it says that the URLs are being blocked by robots.txt when that looks how it should? I've read in a couple of places that it can be because of a WordPress plugin that is creating a virtual robots.txt, but I can't validate it. 1. I have set WP-Privacy to allow crawling of my site. 2. I have deactivated all WP plugins and I still get the same GWT warnings. Looking forward to hearing if you have an idea that might work!
Technical SEO | FrederikTrovatten22
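One way to sanity-check rules like the ones quoted above is Python's standard-library robots.txt parser; with only /wp-admin/ and /wp-includes/ disallowed, the homepage and blog posts should be crawlable (the blog-post path below is a hypothetical example):

```python
# Hedged sketch: validate the quoted robots.txt rules with Python's
# stdlib parser rather than guessing what GWT is complaining about.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "https://www.trovatten.com/"))             # True
print(rp.can_fetch("*", "https://www.trovatten.com/a-blog-post/")) # True
print(rp.can_fetch("*", "https://www.trovatten.com/wp-admin/"))    # False
```

If this parser allows the URLs but GWT still complains, that points toward a different robots.txt being served to Google (e.g. a plugin-generated virtual one), as the poster suspects.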
Having to type Google CAPTCHA all the time
Hi guys, our office has about 15 computers all on the same IP address, and about 10 actively search on Google. Recently we have been asked to type in a CAPTCHA almost every single time we search on Google, and we would like to know if you have any suggestions for resolving this. We do use Firefox Rank Checker to check rankings once per week (around 400 keywords), but we use Hide My Ass to hide the IP. No malware or viruses detected on computers in the network. Many thanks for your help in advance. David
Technical SEO | sssrpm0
Google Off/On Tags
I came across this article about telling Google not to crawl a portion of a webpage, but I never hear anyone in the SEO community talk about it. http://perishablepress.com/press/2009/08/23/tell-google-to-not-index-certain-parts-of-your-page/ Does anyone use these tags and find them to be effective? If not, how do you suggest noindexing/canonicalizing a portion of a page to avoid duplicate content that shows up on multiple pages?
Technical SEO | Hakkasan1
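For context, the googleoff/googleon markers that article describes are plain HTML comments, and they are documented for the Google Search Appliance rather than the public google.com crawler, so they generally won't affect regular web search results:

```html
<p>This paragraph can be indexed normally.</p>
<!--googleoff: index-->
<p>The Search Appliance is told not to index this paragraph.</p>
<!--googleon: index-->
```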