Does 'XXX' in a domain get filtered by Google?
-
I have a friend whose domain contains 'xxx'. They are a religion-based sex/porn addiction recovery company, but they don't show up for the queries they are optimized for. They have a 12+ year old domain and all good health signs: quality links and press from trusted companies. Google sends them adult traffic, mostly 'trolls', not the users they are looking for.
Has anyone experienced domain-word filtering, and is there a workaround or solution? I posted in the Google Webmaster help forums, but that community seems a little 'on their high horses' and tries too hard to be cool. I'm not very religious and don't necessarily support the views of the website; I'm just trying to help a friend of a friend with a topic I have never encountered.
here is the url: xxxchurch.com
Thanks,
Brian
-
Hmmm... this is a hard one. (Oh man, I did not mean to make an intentional sex reference.)
Yes, Google has made changes to its algorithm in the past year that make porn harder to search for on the Internet. These changes don't filter the porn per se - except when SafeSearch is set to on - but they do mean that you must be much more specific in your search queries to find what you are looking for. For example, the query "boobs" generally returns almost no porn in Google, but the query "boobs porn" will.
If I were building an algorithm to separate porn sites from non-porn sites, a large amount of XXX in the incoming anchor text, or in the URL, would probably trigger it.
On the other hand, I'm inclined to agree with George - it seems like there's something more going on here. The backlink profile isn't terrible... but there's definitely a footprint of comment spam in there. I won't link directly, but some of the suspect, off-topic links I found include:
http://www.takarat.com/forums/showthread.php?tid=750&page=3
http://www.omyogapages.com/forum/showthread.php?t=43&page=7
http://www.atthepicketfence.com/2011/09/behind-blog-with-savvy-southern-style.html
http://www.marypoppins-homesweethome.com/2011/07/what-is-it-with-us-girls-and-ikea.html
These are pretty terrible. It's possible that there are hundreds or thousands more we're not seeing, and these are causing either a manual or algorithmic penalty.
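If you want to triage a large backlink export for this kind of footprint, a quick URL filter can surface the obvious suspects. This is a minimal, hypothetical sketch - the patterns are illustrative guesses at common comment/forum-spam signatures, not an exhaustive or official list:

```python
import re

# Illustrative comment-spam footprints; tune these against your own
# backlink export (e.g. a CSV from Open Site Explorer).
SPAM_FOOTPRINTS = [
    re.compile(r"/forums?/showthread\.php", re.I),  # forum thread pages
    re.compile(r"[?&]page=\d+"),                    # deep pagination in a thread
    re.compile(r"\.blogspot\.", re.I),              # free-host blog comment pages
]

def looks_like_comment_spam(url: str) -> bool:
    """Return True if the URL matches any known spam footprint."""
    return any(p.search(url) for p in SPAM_FOOTPRINTS)

backlinks = [
    "http://www.takarat.com/forums/showthread.php?tid=750&page=3",
    "http://example.org/press/industry-award",
]
suspect = [u for u in backlinks if looks_like_comment_spam(u)]
```

Anything this flags still needs a human look - plenty of legitimate forum links match these patterns too.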
My advice:
-
Check with Google Webmaster Tools for any messages - especially unnatural link warnings.
-
File a reconsideration request, even if you don't have any messages in GWT. Explain your concerns. Matt Cutts, the head of the Webspam team, helped write the original adult filter algorithms. He might take a special interest if you can bring it to his attention.
But mostly, what you're looking for is verification, or not, of a penalty.
-
You may need to clean up the links. Do your best to remove any suspect links. Use the disavow tool as a last resort.
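If removal efforts stall and you do end up at the disavow tool, the file it accepts is plain text: one full URL or `domain:` entry per line, with `#` comments. A minimal sketch of assembling one (the example domains and URLs are made up):

```python
# Build a disavow file in the plain-text format Google's disavow tool
# accepts: "#" comments, "domain:" entries, and individual URLs.
def build_disavow_file(urls, whole_domains):
    lines = ["# Suspect links we could not get removed after outreach"]
    lines += [f"domain:{d}" for d in sorted(whole_domains)]
    lines += sorted(urls)
    return "\n".join(lines) + "\n"

content = build_disavow_file(
    urls={"http://spammyforum.example/showthread.php?tid=750"},
    whole_domains={"spammydirectory.example"},
)
```

Prefer individual URLs where you can; a `domain:` entry throws away every link from that site, good and bad.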
Hope this helps! Best of luck with your SEO.
-
-
I doubt there's a filter against xxx, but that doesn't mean there isn't something in the algos that checks for a spammy link profile more aggressively if the xxx is there.
I ran through the first 5 pages of links in Open Site Explorer, and their highest-authority links mainly carry the branded keyword phrase "xxx church". They could use some diversity in anchor text. Just because Penguin hit exact-match anchor text in spammy links (from spammy sites and tactics) doesn't mean you can't write "Check out this porn addiction recovery site if you're having issues with porn in your house." and link to the site with the underlined text.
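You can put a number on that diversity problem by tallying anchors from a backlink export and checking how much of the profile the top anchor owns. A rough sketch, with made-up anchor data:

```python
from collections import Counter

# Anchors as they might appear in a backlink export; these are examples.
anchors = [
    "xxx church", "xxx church", "xxx church",
    "porn addiction recovery site", "click here",
]
counts = Counter(a.lower() for a in anchors)
total = sum(counts.values())

# Share held by the most common anchor - a very high share can look
# unnatural, especially for non-branded anchors.
top_anchor, top_count = counts.most_common(1)[0]
top_share = top_count / total
```

There's no official threshold for what share is "too high"; the point is to spot an anchor profile dominated by one phrase.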
There may be some more questions to ask. What are their link building efforts?
A number of pages on http://blog.internetsafety.com that carry incoming links no longer resolve (404 Not Found). There are lots of links that actually do look like Penguin bait.
It could be link diversity. It could be low-quality links. It could be tons of links coming from pages that now resolve as 404s.
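Checking which linking pages now 404 is easy to script. The sketch below injects the status lookup so it runs offline; in practice you'd swap in real HEAD requests via `urllib.request` or the `requests` library (the URLs here are stand-ins):

```python
# Find linking pages that now return 404. fetch_status is injected so
# this stays testable without network access.
def find_dead_sources(urls, fetch_status):
    """Return the URLs whose HTTP status is 404."""
    return [u for u in urls if fetch_status(u) == 404]

# Stub statuses standing in for real responses.
statuses = {
    "http://blog.internetsafety.com/old-post": 404,
    "http://example.org/live-page": 200,
}
dead = find_dead_sources(statuses, statuses.get)
```

Dead linking pages don't pass anything, so a profile leaning on them thins out over time even without a penalty.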
Sorry the news isn't great, but I really don't think it's the domain name that is the problem.