Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Does 'XXX' in a Domain Get Filtered by Google?
-
I have a friend who has 'xxx' in their domain. They run a faith-based sex/porn addiction recovery organization, but they don't show up for the queries they are optimized for. They have a 12+ year old domain and all the good health signs: quality links and press from trusted companies. Google sends them adult traffic, mostly 'trolls', and not the users they are looking for.
Has anyone experienced filtering on words in a domain, and do you have a workaround or solution? I posted in the Google Webmaster Help forums, but that community seems a little 'high on their horses' and is trying too hard to be cool. I am not especially religious and don't necessarily support the views of the website; I'm just trying to help a friend of a friend with a topic I have never encountered.
Here is the URL: xxxchurch.com
Thanks,
Brian
-
Hmmm... this is a hard one. (Oh man, I did not mean that as a sex reference.)
Yes, Google has made changes to its algorithm in the past year that make porn harder to find in search. These changes don't filter the porn per se - except when SafeSearch is turned on - but they do mean that you must be much more specific in your search queries to find what you are looking for. For example, the query "boobs" generally returns almost no porn in Google, but the query "boobs porn" will.
If I were building an algorithm to separate porn sites from the rest, a large amount of 'XXX' in the incoming anchor text, or in the URL, would probably trigger it.
On the other hand, I'm inclined to agree with George - it seems like there's something more going on here. The backlink profile isn't terrible... but there's definitely a footprint of comment spam in there. I won't link directly, but some of the suspect, off-topic links I found include:
http://www.takarat.com/forums/showthread.php?tid=750&page=3
http://www.omyogapages.com/forum/showthread.php?t=43&page=7
http://www.atthepicketfence.com/2011/09/behind-blog-with-savvy-southern-style.html
http://www.marypoppins-homesweethome.com/2011/07/what-is-it-with-us-girls-and-ikea.html
These are pretty terrible.
It's possible there are hundreds or thousands more we're not seeing, and these are causing either a manual or algorithmic penalty.
My advice:
-
Check Google Webmaster Tools for any messages - especially unnatural link warnings.
-
File a reconsideration request, even if you don't have any messages in GWT, and explain your concerns. Matt Cutts, the head of the webspam team, helped write the original adult filter algorithms; he might take a special interest if you can get this to his attention.
But mostly, what you're looking for is confirmation, one way or the other, of a penalty.
-
You may need to clean up the links. Do your best to get suspect links removed, and use the disavow tool as a last resort.
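For reference, Google's disavow file is a plain text list - one `domain:` entry or full URL per line, with `#` comments. A minimal sketch of the format, using the suspect domains mentioned above purely as illustration (the actual entries and comments would come from your own link audit):

```text
# Disavow file for xxxchurch.com - upload via Webmaster Tools
# Contacted site owners twice with no response before disavowing
domain:takarat.com
domain:omyogapages.com
http://www.atthepicketfence.com/2011/09/behind-blog-with-savvy-southern-style.html
```

Note that disavowing a `domain:` covers every page on that domain, so reserve it for sites that are spam throughout.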
Hope this helps! Best of luck with your SEO.
-
I doubt there's a filter against 'xxx', but that doesn't mean there isn't something in the algorithms that checks a link profile for spam more aggressively when the 'xxx' is there.
I ran through the first 5 pages of links in Open Site Explorer, and their highest-authority links mainly contain the branded keyword phrase "xxx church". They could use some diversity in anchor text. Just because Penguin hit exact-match anchor text on spammy links (from spammy sites and tactics) doesn't mean you can't write "Check out this porn addiction recovery site if you're having issues with porn in your house." and link to the site from the underlined text.
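As a rough way to see how concentrated the anchor text actually is, you can tally a backlink export (e.g. the anchor-text column from an Open Site Explorer CSV) with a few lines of Python. This is just a sketch - the input list here is hypothetical:

```python
from collections import Counter

def anchor_distribution(anchors):
    """Return each anchor text's share of all inbound links.

    Anchors are case-folded and stripped so "XXX Church" and
    "xxx church" count as the same phrase.
    """
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return {text: n / total for text, n in counts.items()}

# Hypothetical export: if one phrase dominates, that's the
# lack of diversity described above.
sample = ["XXX Church", "xxx church", "porn addiction help", "click here"]
shares = anchor_distribution(sample)
```

A profile where one branded phrase holds the large majority of anchors is the pattern worth diluting with natural, descriptive links.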
There may be some more questions to ask. What are their link building efforts?
A number of the pages on http://blog.internetsafety.com that link in no longer resolve (404 Not Found). There are also lots of links that really do look like Penguin bait.
It could be link diversity. It could be low-quality links. It could be tons of links coming from pages that now resolve as 404s.
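If you want to measure how many linking pages have gone dead, a standard-library Python sketch like the following will do it - the URL list would be whatever backlink export you have on hand, and the helper names here are made up for illustration:

```python
import urllib.request
from urllib.error import HTTPError, URLError

def fetch_status(url, timeout=10):
    """Return the HTTP status code for a URL, or None if unreachable."""
    req = urllib.request.Request(
        url, method="HEAD", headers={"User-Agent": "link-audit/0.1"}
    )
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except HTTPError as e:
        return e.code   # e.g. 404 for a linking page that's gone
    except URLError:
        return None     # DNS failure, refused connection, etc.

def dead_links(statuses):
    """Given {url: status}, return the URLs whose pages are gone."""
    return [u for u, s in statuses.items() if s in (404, 410) or s is None]

# Usage sketch:
#   statuses = {u: fetch_status(u) for u in backlink_urls}
#   print(dead_links(statuses))
```

Links from pages that 404 pass no value, so a high dead-link count would support the theory above without implying a penalty by itself.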
Sorry the news isn't great, but I really don't think it's the domain name that is the problem.