Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Check Google ban on domain name
-
Hello all,
If I wanted to know whether a domain name has a Google ban on it, would the following be a good way to test it?
Place an article with unique content on the domain, then link to that article from a well-indexed page so it gets crawled and indexed.
If the article doesn't get indexed, there might be a ban on the domain; if it does get indexed, there is no ban...
Or are there other points I should keep in mind while doing this?
All help is very welcome.
Cheers,
Arnout
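A minimal sketch of the decision logic behind the test described above. The waiting period and the wording of the verdicts are assumptions (the thread never specifies how long to wait); whether the article is indexed still has to be checked manually, e.g. with a site: search:

```python
def ban_verdict(article_indexed: bool, days_waited: int) -> str:
    """Interpret the indexing test from the question above.

    article_indexed: whether the unique article showed up in Google's index
                     (checked by hand, e.g. via a site: search).
    days_waited: days elapsed since linking from a well-indexed page.
    The 14-day threshold is an assumed grace period, not a Google rule.
    """
    if article_indexed:
        return "no ban: Google is willing to index the domain"
    if days_waited < 14:
        return "inconclusive: wait longer before drawing a conclusion"
    return "possible ban: unique, well-linked content stayed unindexed"
```

The point of the helper is that only the combination of "not indexed" plus "enough time has passed" suggests a ban; a missing page on its own proves nothing.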
-
Hi! I'm following up on some older questions. What did you do in this case? One thing I would add to this discussion: if you already owned the domain, you could verify it in Google Webmaster Tools and see whether there were any webmaster notifications about it.
-
Likely it's not banned then - just not worth indexing. Chuck some decent content up there and you'll be fine.

-
Yep, I know that one, but the site is AdSense-only on a parked domain...
-
My main problem is that the site is not currently in Google's index. It is a parked domain with AdSense on it...
Would my suggestion in the first post work?
-
Also try the "trick" query of adding /* to the URL:
site:domain.com/*
I always compare these results with the plain site:domain.com query. It's conjecture, but I believe the /* version shows the pages in the primary index, while the plain version also includes the supplemental index. No one knows for sure, of course, but I track the percentage of one over the other as a rough measure of Google's trust in a site. The numbers are relative, not absolute; I use a yardstick of 20-30% as good.
YMMV
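As a sketch, the ratio described above could be tracked like this. Both counts are read off by hand from the two site: searches; the function only does the percentage, and the 20-30% yardstick is the poster's own heuristic, not anything Google documents:

```python
def primary_index_ratio(primary_count: int, total_count: int) -> float:
    """Percentage of site:domain.com results that also appear under
    site:domain.com/* (the conjectured primary index).

    primary_count: result count from the site:domain.com/* query.
    total_count:   result count from the plain site:domain.com query.
    """
    if total_count == 0:
        return 0.0  # nothing indexed at all
    return 100.0 * primary_count / total_count

# e.g. 250 results for site:domain.com/* vs 1000 for site:domain.com
ratio = primary_index_ratio(250, 1000)  # 25.0, inside the 20-30% yardstick
```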

-
There are several types of penalties (single keyword, all keywords, complete ban, etc.). Search for your own brand, and if nothing comes up, you're likely banned. The same goes for the domain itself, using the site: and info: commands.
-
The simplest way, to my knowledge, is to use the Google site: operator. Simply type site:www.yourdomain.co.uk into the Google search box. The results this search brings back will show all the pages Google has indexed for your website.
You can also use cache:www.yourdomain.co.uk to see what Google is holding in its cache; clicking the Cached link on a listing will show when that page was last crawled.
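A small helper for building those operator queries, just to show how they compose into search URLs. This is plain string construction; you'd still run the searches by hand in a browser, since scraping Google results programmatically is against its terms of service:

```python
from urllib.parse import quote_plus

def google_query_url(operator: str, domain: str) -> str:
    """Build a Google search URL for an operator query such as
    site:www.yourdomain.co.uk or cache:www.yourdomain.co.uk."""
    query = f"{operator}:{domain}"
    return "https://www.google.com/search?q=" + quote_plus(query)

print(google_query_url("site", "www.yourdomain.co.uk"))
# https://www.google.com/search?q=site%3Awww.yourdomain.co.uk
```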