Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Howdy, do curse words in your content hurt SEO in any way or form?
-
Howdy, do curse words in your content hurt SEO in any way or form? And if so, is there a known "list" of curse keywords that should be avoided?
Thanks, guys
-
Are the "curse" words necessary for your website? Although the site may pass through Google's safe search - it could still offend some visitors.
Obviously, if it is integral to your site then fine but I would steer away from using any foul language on a user rather than Google bot level. You wouldn't walk into a shop and tolerate the attendant to use abusive language whether it is targeted at you or not.
Hope this helps!
-
Yes, Google claims that the default "moderate filtering" is only image-based and does not include "curse keywords". However, I manage a site which I cannot see in the results unless I turn SafeSearch to "no filtering". There are no images on the website that would trigger SafeSearch, but there is some very foul language on the site, which in my opinion is the cause of the filter. The site does not link to any adult sites either, so I have always presumed it was the language.
-
The SafeSearch filter is user-activated - parents can turn it on to protect their kids when searching the internet. I doubt his target audience is made up of kids, and I doubt a parent would run a search with the option activated on their private PC. You can find more on SafeSearch in Google's help documentation.
The important thing is to think about your audience and how an article that contains curse words could serve them - for example, such an article could be more engaging for a certain readership; there are journalistic styles (I can't remember the name right now) that use coarse language.
-
Curse words will not affect your ranking; however, Google may indeed filter you with their SafeSearch filter if you use too many words that they do not consider family-friendly. As to exactly what these words are, I cannot tell you, but I think we all have an idea of what is considered family-friendly.
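If you want to sanity-check the SafeSearch theory on your own site, one rough approach is to run the same query with the filter on and off and see whether your domain disappears. The sketch below is a minimal illustration using the Google Custom Search JSON API, whose safe parameter toggles SafeSearch; the API key, search engine ID, query, and domain are placeholders you would supply, and a single query is anecdotal evidence at best, not proof of filtering.

```python
import requests

# Placeholders -- substitute your own Custom Search credentials.
API_KEY = "YOUR_API_KEY"
CSE_ID = "YOUR_SEARCH_ENGINE_ID"
ENDPOINT = "https://www.googleapis.com/customsearch/v1"

def domain_in_results(query: str, domain: str, safe: str) -> bool:
    """Return True if `domain` appears in the first page of results for
    `query`, with SafeSearch set to `safe` ("active" or "off")."""
    params = {"key": API_KEY, "cx": CSE_ID, "q": query, "safe": safe}
    items = requests.get(ENDPOINT, params=params, timeout=10).json().get("items", [])
    return any(domain in item.get("link", "") for item in items)

if __name__ == "__main__":
    query, domain = "example query", "example.com"   # placeholders
    filtered = domain_in_results(query, domain, safe="active")
    unfiltered = domain_in_results(query, domain, safe="off")
    if unfiltered and not filtered:
        print("Visible only with SafeSearch off -- the filter may be the culprit.")
    else:
        print("No SafeSearch difference detected for this query.")
```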
-
Hey,
That would be impossible. Google does not censor.
Related Questions
-
Duplicate content in sidebar
Hi guys. I have a few sentences (about 50 words) of duplicate content across all pages of my website - it's repeated text in the sidebar. Each page contains about 1,300 words of unique content in total, plus those 50 words of duplicate sidebar content. Does having duplicate content of this length in the sidebar affect the rankings of my website in any way? Thank you so much for your replies.
On-Page Optimization | AslanBarselinov
-
Harms of hidden categories on SEO
On our website we have some invisible/hidden categories. Can anyone advise whether these are harmful in terms of SEO?
On-Page Optimization | CostumeD
-
How to remove subdomains in a clean way?
Hello, I have a main domain, example.com, where I have my main content, and I created three subdomains: one.example.com, two.example.com and three.example.com. I think the low ranking of my subdomains is affecting the ranking of my main domain, which is the one I care about most, so I decided to get rid of the subdomains. The thing is that only for one.example.com could I transfer the content to my main domain and create 301 redirects. For the other two subdomains I cannot integrate the content into my main domain, as it doesn't make sense there. What's the cleanest way to make them disappear - just 301 redirect them to my main domain even though the content is not the same, or set the robots meta to "noindex" and serve a 404 page on the index of each subdomain? I want to use the approach that will harm performance with Google the least. Regards!
On-Page Optimization | Gaolga
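For the subdomain question above, the "redirect everything to the main domain" option is normally configured at the web server or CDN level rather than in application code, but as a rough sketch of the idea, here is a minimal Python/Flask catch-all that 301-redirects any request on a retired subdomain to the main site. The domain name and the everything-to-the-homepage choice are assumptions for illustration, not the asker's actual setup.

```python
from flask import Flask, redirect

app = Flask(__name__)

# Assumption for illustration: every retired URL points at the main homepage.
MAIN_SITE = "https://example.com/"

@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def retire(path):
    # 301 tells crawlers the move is permanent, so the old subdomain
    # URLs can be dropped from the index over time.
    return redirect(MAIN_SITE, code=301)

if __name__ == "__main__":
    app.run()
```

Note that bulk-redirecting unrelated pages to a homepage can be treated by Google as a soft 404, so where no equivalent page exists, a noindex or a plain 404/410, as the asker suggests, may be the cleaner signal.
-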
Using Escaped Fragments with SEO
Our e-commerce platform is in the process of changing to what we call app-based stores (essentially running in the browser as a single-page web app). These new stores are built in HTML5 and use escaped fragments. Currently merchants usually run two stores until we launch the app site at 100%. My questions really concern the app stores, which right now sit on a subdomain but will eventually take over the primary domain. Here is an example: app.tikimater.com and app.sportsworld.com. Since I am not a developer, I'm really having a hard time understanding escaped fragments. I'm using this: https://developers.google.com/webmasters/ajax-crawling/docs/getting-started, but I'm not sure what my actual URLs should look like or what the canonical should be set to. Right now they have been removed, but previously they had http:app.tikimaster.com#!v=1. Also, how should I be setting up my meta information for Google so that 1) pages are indexed in a timely manner and 2) pages are indexed with the correct information? I am still setting the meta titles and descriptions, but in some instances Google uses other info. With the new platform we are moving away from on-page content (written paragraphs), but category pages would have related products embedded. Should I still be pushing to have some type of intro text, since it would be solely for SEO and not the shopper's experience? All product pages have content (product descriptions etc.). Thank you for any advice.
On-Page Optimization | marketing_zoovy.com
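Since the escaped-fragment question above is mostly about what the URLs should look like, here is a small illustrative sketch of the URL mapping from Google's AJAX crawling scheme (the getting-started doc linked in the question). The example.com URL is a placeholder, the exact character-escaping rules are in that doc, and the scheme itself has since been retired by Google in favour of rendering JavaScript directly.

```python
def escaped_fragment_url(pretty_url: str) -> str:
    """Rough illustration of the AJAX crawling scheme's URL mapping:
    the crawler takes the user-facing hash-bang URL and fetches an
    '_escaped_fragment_' variant, for which the server is expected to
    return a pre-rendered HTML snapshot of the page.

        http://app.example.com/#!v=1
        -> http://app.example.com/?_escaped_fragment_=v=1
    """
    if "#!" not in pretty_url:
        return pretty_url                    # nothing to translate
    base, fragment = pretty_url.split("#!", 1)
    joiner = "&" if "?" in base else "?"     # keep any existing query string
    return f"{base}{joiner}_escaped_fragment_={fragment}"

print(escaped_fragment_url("http://app.example.com/#!v=1"))
# -> http://app.example.com/?_escaped_fragment_=v=1
```
-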
Duplicate Content - Blog Rewriting
I have a client who has requested a rewrite of 250 blog articles for his IT company. The blogs are dispersed across a variety of platforms: his own website's blog, a business innovation website, and an IT website. He wants each article optimised with keyword phrases and then posted to his new website three times a week. All of this is in an effort to attract potential customers to his new site and to establish his company as a leader in its field. To what extent would I need to rewrite each article to avoid duplicating the content? Would there even be an issue if I did not rewrite the articles and merely optimised them with keywords? Would the articles need to be completely taken down by all current publishers? Any advice would be greatly appreciated.
On-Page Optimization | StoryScout
-
What is the most SEO friendly Shopping Cart?
What is the most SEO-friendly shopping cart? I have been using Zen Cart for 6 years, and it seems Google doesn't like it as much as other carts. I started a new site about 6 months ago using Magento; when I build links to this site, the terms move. The terms are very similar, so I would imagine the competition is the same. I am curious whether anybody has tried different carts and found any to be better than the others. Also, the new site has about one tenth the number of products but has a lot more pages indexed.
On-Page Optimization | kicksetc
-
SEO for Japan
Google and Yahoo are the two major search engines in Japan. You can search using Western characters, and you often see English-language results with Japanese (Chinese) characters next to them. As I don't speak Japanese, how do I approach SEO for my Japanese-language site? I would appreciate any experiences and educational sources on the topic.
On-Page Optimization | KnutDSvendsen
-
Percentage of duplicate content allowable
Can you have ANY duplicate content on a page, or will the page get penalized by Google? For example, if you used a paragraph of Wikipedia content for a definition/description of a medical term, but wrapped it in unique content, is that OK, or will that land you in the Google/Panda doghouse? If some level of duplicate content is allowable, is there a general rule-of-thumb ratio of unique to duplicate content? Thanks!
On-Page Optimization | sportstvjobs