Removing an indexed website
-
I had a .in TLD version of my .com website live for about 15 days; it was a duplicate copy of the .com site.
I did not want to keep using the .in because of the duplicate-content risk, so I let the .in domain expire on 26th April. But when I search for my website, the .in version still shows up in the results, and Google Webmaster Tools even lists it as the site with the most links (190) pointing to my .com website.
I am sure this is hurting the ranking of my .com website. How can the .in website be removed from Google's index and search results, given that it has already expired?
thanks
-
How do I know if I was penalized by Google?
In Google search, some links are from the .com domain and some from the .in domain.
I would like only the .com domain to be indexed. You can check geekwik.com and geekwik.in.
thanks
-
It will, yes; not only will you get the new pages indexed faster, you'll also get the link juice the previous domain had.
If you didn't receive any Google penalty, I would buy the domain back and set a permanent 301 redirect to the new one, or leave the old one up with its content and a rel=canonical tag pointing to wherever that content lives on the new domain (that's your decision).
If you have the chance to get the domain again, go for it for at least one year. Then, once all your new content is indexed and the old content is deindexed, you can forget about it or leave the 301 redirect in place (which won't do you any harm, unless that previous domain was penalized).
If it was penalized, then don't. Again: do not get that domain back and point it at your new one; if you do, all the toxic links will then be pointing at your new domain.
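If the domain wasn't penalized and you do set up the redirect, it's worth confirming that every old .in URL answers with a single 301 straight to its .com equivalent, rather than a 302 or a redirect chain. Here's a rough way to spot-check that yourself, sketched in Python with the requests library; the paths listed are placeholders, so swap in the URLs that actually earned links:

```python
import requests

OLD_DOMAIN = "http://geekwik.in"
NEW_DOMAIN = "http://geekwik.com"

# Placeholder paths; replace with the URLs that had links or traffic.
PATHS = ["/", "/about/", "/blog/"]

for path in PATHS:
    old_url = OLD_DOMAIN + path
    # allow_redirects=False lets us inspect the first response itself
    # instead of whatever page the redirect eventually lands on.
    resp = requests.get(old_url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    if resp.status_code == 301 and location.startswith(NEW_DOMAIN):
        print(f"OK   {old_url} -> {location}")
    else:
        print(f"FIX  {old_url} returned {resp.status_code} (Location: {location or 'none'})")
```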
-
Thanks. When you say it will take time, roughly how much time do you mean?
If I repurchase the domain (it is available) and have its pages return 404s, will the process be faster?
-
If the domain has expired and there's no content on it, then it will take time. You no longer control the domain; once Google crawls it again, it will notice the missing content and the fact that the domain has expired. The reason Google may take a while to remove old content is that the site could simply be having a temporary issue that stops the content from being served, and that alone isn't a reason to remove it or lose its rankings.
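If you want to see what Googlebot is likely finding on those expired URLs right now, you can spot-check a few of them yourself. A quick sketch (Python with the requests library; the URL list is a placeholder):

```python
import requests

# Placeholder list: the old .in URLs you still see in Google's results.
OLD_URLS = [
    "http://geekwik.in/",
    "http://geekwik.in/some-old-page/",
]

for url in OLD_URLS:
    try:
        resp = requests.get(url, allow_redirects=True, timeout=10)
        # A 404 or 410 tells Google the content is gone for good; a 200 from
        # a registrar parking page can keep the URL hanging around longer.
        print(f"{url}: HTTP {resp.status_code}")
    except requests.exceptions.RequestException as exc:
        # DNS failures and timeouts are ambiguous, which is exactly why
        # Google rechecks for a while before dropping the pages.
        print(f"{url}: request failed ({type(exc).__name__})")
```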
Hope that helps.
-
I'm not sure whether there is a way to remove your website from Google's index once the domain has expired.
If the domain hasn't expired, you can use the Google Webmaster Tools > Remove URLs function to get it out of the index.
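One caveat: the Remove URLs tool on its own only hides a URL temporarily; for it to stay out of the index, the URL should also return a 404/410 or carry a noindex directive (robots meta tag or X-Robots-Tag header) when Google recrawls it. A rough way to check that before filing removal requests, sketched in Python with the requests library (the URL list is a placeholder):

```python
import re
import requests

# Placeholder list: URLs you plan to submit to the removal tool.
URLS = ["http://www.example.com/old-page/"]

for url in URLS:
    resp = requests.get(url, timeout=10)
    header = resp.headers.get("X-Robots-Tag", "")
    # Look for a <meta name="robots" content="..."> tag in the returned HTML.
    match = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']*)["\']',
        resp.text,
        re.IGNORECASE,
    )
    meta = match.group(1) if match else ""
    blocked = (
        resp.status_code in (404, 410)
        or "noindex" in header.lower()
        or "noindex" in meta.lower()
    )
    print(f"{url}: status={resp.status_code}, removal will stick={'yes' if blocked else 'no'}")
```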
-
Related Questions
-
Why are only a few of our pages being indexed?
Recently rebuilt a site for an auctioneers; however, none of the lots and auctions are being indexed by Google on the new site, only pages like About, FAQ, Home and Contact. Checking WMT shows that Google has crawled all the pages, and I've done a "Fetch as Google" on them and they load up fine, so no crawling issue stands out. I've set the "URL Parameters" to no effect too. I also built a sitemap with all the lots in it and pushed it to Google, which then crawled them all (a massive spike in crawl rate for a couple of days), yet it is still indexing only a handful of pages. Any clues to look into would be greatly appreciated. https://www.wilkinsons-auctioneers.co.uk/auctions/
Technical SEO | Blue-shark
-
Should I remove these pages from the Google index?
Hi there, Please have a look at the following URL: http://www.elefant-tours.com/index.php?callback=imagerotator&gid=65&483. It's a "sitemap" generated by a WordPress plug-in called NextGEN Gallery, and it maps all the images that have been added to the site through this plugin, which in this case is quite a lot. I can see that these "sitemap" pages have been indexed by Google, and I'm wondering whether I should remove them or not. In my opinion these are pages that a search engine would never want to serve as a search result, and pages that a visitor would never want to see. Attracting any traffic through Google Images is irrelevant in this case. What is your advice? Block them, leave them indexed, or something else?
Technical SEO | Robbern
-
Google Webmaster tools Sitemap submitted vs indexed vs Index Status
I'm having an odd issue I'm trying to diagnose. Our Index Status is growing and is now up to 1,115. However, when I look at Sitemaps we have 763 submitted but only 134 indexed. Submitted and indexed were virtually the same, around 750, until 15 days ago, when the indexed count dipped dramatically. Additionally, when I look under HTML Improvements I only find 3 duplicate pages, and I ran Screaming Frog on the site and got similar results: few duplicates. Our actual content should be around 950 pages, counting all the category pages. What's going on here?
Technical SEO | K-WINTER
-
Website not ranking for noncompetitive terms
Hi, We took over a website last July and no matter what we do we just can't get it ranking in Google, even for noncompetitive terms. Here is the website in question: http://www.alignandsmile.co.uk. Ideally the client would like to rank for Canary Wharf, but that location is competitive; the site doesn't even rank for 'Dentist New Providence Wharf E14' despite it being included in the title tag on the home page and in the content throughout the website. Directories with Align and Smile's business information do rank, however. I opened a case with Google through Webmaster Tools and they 'reviewed your site and found no manual actions by the webspam team that might affect your site's ranking in Google.' So I'm a bit stuck. The site ranks top for the keyphrase in Bing and Yahoo... we are really struggling with Google! Any help would be much appreciated. Many thanks, Marcus
Technical SEO | dentaldesign
-
CDN Being Crawled and Indexed by Google
I'm doing an SEO site audit, and I've discovered that the site uses a Content Delivery Network (CDN) that's being crawled and indexed by Google. Two sub-domains from the CDN are being crawled and indexed, and a small number of organic search visitors have come through them, so in a few cases the CDN-based content is out-ranking the root domain. It's a huge duplicate content issue (tens of thousands of URLs being crawled). What's the best way to prevent the crawling and indexing of a CDN like this? Exclude it via robots.txt? Additionally, the use of relative canonical tags (instead of absolute ones) appears to be contributing to the problem: as I understand it, these canonical tags are telling the search engines that each sub-domain is the "home" of the content/URL. Thanks! Scott
Technical SEO | Scott-Thomas
-
Quickest way to remove content from Google index?
We had some content on our own website indexed by Google, and that content was changed later, but the old version is still showing up in Google's results (of course, because it was indexed). It's very important to us that this content not show up in Google, so how do we remove it from Google's index quickly? I know that normally, once Google crawls the page again, it will show the new content. Should we use the Google URL removal tool, Google URL fetch, or something else?
Technical SEO | Personnel_Concept
-
Dealing with indexable Ajax
Hello there, My site is basically an Ajax application. We assume lots of people link into deep pages on the site, but bots won't be able to read past the hash marks, meaning all links appear to go to our home page. So we have decided to make our Ajax indexable. And so many questions remain. First, only Google handles indexable Ajax, so we need to keep our static "SEO" pages up for Bing and Yahoo. Bummer, dude, more to manage. 1. How do others deal with the differences here? 2. If we have both indexable Ajax and static pages, can these be perceived as duplicate content? Maybe the answer is to disallow Googlebot from indexing the static pages we made. 3. What does your canonical URL become? Can you tell different search engines to read different canonical URLs? So many more questions, but I'll stop there. Curious if anyone here has thoughts (or experience) on the matter. Erin
Technical SEO | ErinTM