Swear Filter - SERP Impact
-
My forum currently has a swear filter in place. While I personally think there are better alternatives to swearing in most cases, the general consensus is that the filter should be removed, and I've no problem with that in principle. However, my concern is that Google may penalise the site in some way if this is done.
I've searched around a fair bit and haven't found any solid info on this, so I'm hoping someone on here may know the answer.
The question: can repeated swear words affect rankings, or prevent a website from displaying in Google if SafeSearch is on?
Thanks as always.
-
Cheers Keri
-
Your best bet there might be to find someone you know who works in IT at a company that's really picky about what their employees see.
The problem I've run into is (I believe) a keyword filter on the URL of my site that won't let people at Vandenberg Air Force Base get to strikemodels.com. The site is about model warships battling each other, not about people modeling, but I'm pretty sure the filter is keying on the word "model" in the URL. Other sites on the same host can be viewed from the base, and so can other sites built on the same WP/Thesis framework.
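For what it's worth, many corporate web filters really are that blunt: they check a keyword list against the URL string itself rather than categorising the page's content. Here's a minimal sketch of that kind of check (the blocklist and function name are hypothetical, purely to illustrate why a filter keyed on "model" would catch strikemodels.com):

```python
# Hypothetical sketch of a naive keyword-based URL filter,
# the sort of blunt matching some corporate proxies appear to use.
BLOCKED_KEYWORDS = ["model", "casino", "poker"]  # example blocklist, not any real product's list

def is_blocked(url: str) -> bool:
    """Return True if any blocked keyword appears anywhere in the URL string."""
    url_lower = url.lower()
    return any(word in url_lower for word in BLOCKED_KEYWORDS)

print(is_blocked("http://strikemodels.com/"))        # True - "model" is a substring of the domain
print(is_blocked("http://example.com/model-ships"))  # True - a keyword in the path trips it too
print(is_blocked("http://example.com/warships"))     # False
```

With matching that crude, whether a site gets through has nothing to do with its actual content.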
-
One thing I've only just considered:
What about people who browse at work? Do many workplace filters block sites that contain profanity?
-
Thanks William, that's great.
-
If the content on a page is extremely offensive, with excessive cursing, it probably won't rank anyway, simply because no one is searching for all of those curse words.
But Google will not penalize you or filter the page out through SafeSearch. Google's default settings allow swearing, and unless you have highly offensive or objectionable content on the page, I don't see why Google wouldn't show it.
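If you want a quick sanity check for your own forum, one rough test is to run the same site: query with SafeSearch forced on and off and compare which pages come back. Google's search URL accepts a safe parameter for this; below is a small sketch for building the two comparison URLs (example.com is a placeholder for your domain):

```python
from urllib.parse import urlencode

# Rough manual check: build a site: query with SafeSearch forced on and off,
# open both URLs in a browser, and compare which pages appear.
site = "example.com"  # placeholder - substitute your forum's domain

for safe in ("active", "off"):
    params = urlencode({"q": f"site:{site}", "safe": safe})
    print(f"https://www.google.com/search?{params}")
```

If the two result sets match, SafeSearch isn't suppressing anything on the site.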
I've operated forums before, and the only things I filter are racial slurs. I wouldn't allow too much swearing on a forum either, as it creates a sort of barrier between established members and new registrants.
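For anyone weighing up keeping a filter for slurs only, the word-censor feature in most forum platforms boils down to a find-and-replace pass over each post before it's displayed. A minimal sketch of that idea (the word list and masking choice are placeholders, not any particular forum package's implementation):

```python
import re

# Hypothetical minimal word censor, the kind of pass forum software runs over posts.
CENSORED_WORDS = ["slur1", "slur2"]  # placeholder list - substitute the terms you actually filter

# Word boundaries so short filtered words don't censor innocent substrings.
_pattern = re.compile(r"\b(" + "|".join(map(re.escape, CENSORED_WORDS)) + r")\b", re.IGNORECASE)

def censor(text: str) -> str:
    """Replace each censored word with asterisks of the same length."""
    return _pattern.sub(lambda m: "*" * len(m.group(0)), text)

print(censor("This post contains slur1 and Slur2."))  # -> "This post contains ***** and *****."
```

Keeping the list narrow, as described above, stops the filter from mangling ordinary conversation.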