My website (non-adult) is not appearing in Google search results when SafeSearch is on. How can I fix this?
-
Hi,
I have an issue where my website does not appear in Google search results when SafeSearch is on. If I turn SafeSearch off, my site appears no problem. I'm guessing Google is categorizing my website as adult, which it definitely is not. Has anyone had this issue before, or does anyone know how to resolve it? Any help would be much appreciated.
Thanks
-
Let us know how it goes, either with a response here, or a case study on YouMoz. Best of luck!
-
Yeah, I also noticed that some pages are indexed but the homepage and some other pages on the website don't show up. We have a network of 35 sites; 8 of them are experiencing this same problem, and the rest are fine. I'm not exactly sure when this issue started, but I believe it happened only recently.
Thanks for the help. I will submit the URLs for reconsideration and see how we go from there.
-
And here's the place to ask that your site be reconsidered:
-
I enabled SafeSearch, then did a site:blackcupid.com search on Google. There are a lot of pages that show up as indexed, though I didn't see the home page right off. If I search for Black Cupid, I do see pages from that domain, but not the home page.
I took a snippet from the help page at http://www.blackcupid.com/help/helpcategory.cfm and searched for it in Google with SafeSearch on. Google is showing dozens of results from similar sites, such as Brazil Cupid and Japan Cupid (then I went to the home page and saw that all of those sites are indeed related).
How are the other sites performing? Do they have the same problem? Have you always had this problem, or is it new?
Any messages from Google in GWT?
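One check you can script across the eight affected sites: Google honors an adult `rating` meta tag (and the RTA label) as a SafeSearch signal, and a stray tag in a shared template would produce exactly this behavior. Below is a minimal sketch of a tag checker; the exact set of values Google honors, and the shared-template theory itself, are assumptions to verify against Google's current documentation before acting on the results.

```python
from html.parser import HTMLParser

# Values commonly associated with adult labelling: the simple "adult" rating
# and the RTA (Restricted To Adults) label. Treat this exact list as an
# assumption to confirm against current Google documentation.
ADULT_RATINGS = {"adult", "rta-5042-1996-1400-1577-rta"}


class _RatingMetaFinder(HTMLParser):
    """Flags any adult-content `rating` meta tag found while parsing a page."""

    def __init__(self):
        super().__init__()
        self.adult_rated = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        # HTMLParser lowercases attribute names; values keep their case.
        attr_map = {k: (v or "") for k, v in attrs}
        if (attr_map.get("name", "").lower() == "rating"
                and attr_map.get("content", "").strip().lower() in ADULT_RATINGS):
            self.adult_rated = True


def has_adult_rating(html: str) -> bool:
    """Return True if the page's HTML declares an adult `rating` meta tag."""
    finder = _RatingMetaFinder()
    finder.feed(html)
    return finder.adult_rated
```

You would fetch each site's homepage and template pages and run `has_adult_rating` over the responses; a hit on the eight filtered sites but not the other 27 would point straight at the cause.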
-
Thanks for the quick response Keri - http://www.blackcupid.com/
-
Could you include your URL? That could help us out.
I've had some filters block access to strikemodels.com presumably because of the "models" in the URL (it's actually model warships), but haven't had a problem with Google filtering it out.
Related Questions
-
Google Mobile site crawl returns poorer results on 100% responsive site
Has anyone experienced an issue where Google Mobile site crawl returns poorer results than their Desktop site crawl on a 100% responsive website that passes all Google Mobile tests?
Intermediate & Advanced SEO | MFCommunications
-
My website is my name. Overnight it went from being the number one Google result to not showing up at all when you google my name. Why would this happen?
I built my website via Squarespace. It is my name. If you googled my name, it was the number one hit. Suddenly, two weeks ago, it stopped showing up AT ALL. I went through Squarespace's SEO checklist, secured my site, etc. It still doesn't show up. Why would this happen all of a sudden, and what can I do? Thank you!
Intermediate & Advanced SEO | Jbark
-
Google Search Operators Acting Strange
Hi Mozers, I'm using search operators to get a count of how many pages have been indexed for each section of the site. I was able to download the first 1,000 pages from Google Search Console, but there are more than 1,000 pages indexed, so I'm using operators for a count (even if I can't get the complete list of indexed URLs). [Although, if there is a better way, PLEASE let me know!] Anyway, in terms of search operators: from my understanding, the more general the URL, the more results should come up. However, when I put in the domain site:www.XXX it gives me FEWER results than when I put in site:www.XXX/. When I add the trailing slash to the end of the domain, it gives me MORE results. And when I put in site:www.AAA/BBB/CC it gives me MORE results than when I put in site:www.AAA/BBB. What's with this? Yael
Intermediate & Advanced SEO | yaelslater
-
What to try when Google excludes your URL only from high-traffic search terms and results?
We have a high-authority blog post (high PA) that used to rank for several high-traffic terms. Right now the post continues to rank high for variations of the high-traffic terms (e.g. keyword + " free", keyword + " discussion"), but the URL has been completely excluded from the money terms, with alternative URLs of the domain ranking in positions 50+. There is no manual penalty in place or a DMCA exclusion. What are some of the things people would try here? Some of the things I can think of:
- Remove keyword terms in the article
- Change the URL and do a 301 redirect
- Duplicate the post under a new URL, 302 redirect from the old blog post, and repoint links as much as you have control over
- Refresh content, including timestamps
- Remove potentially bad-neighborhood links, etc.

Has anyone seen the behavior above for their articles? Are there any recommendations? /PP
Intermediate & Advanced SEO | ppseo80
-
News Errors In Google Search Console
Years ago, a site I'm working on was publishing news as one form of content. It has since stopped publishing news, but it still has a Q&A forum, blogs, articles... all kinds of stuff. Now, it triggers "News Errors" in GWT under crawl errors. These errors are:
- "Article disproportionately short"
- "Article fragmented" on some Q&A forum pages
- "Article too long" on some longer Q&A forum pages
- "No sentences found"

Since there are thousands of these forum pages and the problem seems to be a news critique, I'm wondering what I should do about it. It seems to be holding these non-news pages to a news standard: https://support.google.com/news/publisher/answer/40787?hl=en For instance, is there a way, and would it be a good idea, to get the hell out of Google News, since we don't publish news anymore? Would there be possible negatives worth considering? What's baffling is that these are not designated news URLs. The ones we used to have were /news/title-of-the-story per... https://support.google.com/news/publisher/answer/2481373?hl=en&ref_topic=2481296 Or does this really not matter, and I should just blow it off as a problem? The weird thing is that we recently went from http to https, and the Google News interface still has us as http and gives the option to add https, which I am reluctant to do since we aren't really in the news business anymore. What do you think I should do? Thanks!
Intermediate & Advanced SEO | 94501
-
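On the question above about getting out of Google News without touching web search: Google's News publisher documentation describes a crawler-specific meta tag that blocks only Googlebot-News. A hedged sketch; confirm the exact tag name against the current docs before rolling it out site-wide:

```html
<!-- Keeps this page out of Google News only; regular Googlebot
     indexing for web search is unaffected -->
<meta name="Googlebot-News" content="noindex">
```

Placed in the `<head>` of the forum pages, this would address the news-standard critique without the risks of removing the publication outright.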
What can Google tell about a website from Chrome or its toolbar?
Besides crawling with a bot, what kind of info about site construction, HTML, etc. does Google get from users via alternate methods? Thanks... Darcy
Intermediate & Advanced SEO | 94501
-
To index or de-index internal search results pages?
Hi there. My client uses a CMS/e-commerce platform that is automatically set up to index every single internal search results page on search engines. This was supposedly built as an "SEO friendly" feature in the sense that it creates hundreds of new indexed pages to send to search engines that reflect various terminology used by existing visitors of the site. In many cases, these pages have proven to outperform our optimized static pages, but there are multiple issues with them:
- The CMS does not allow us to add any static content to these pages, including titles, headers, metas, or copy on the page.
- The query typed in by the site visitor always becomes part of the title tag / meta description on Google. If the customer's internal search query contains any less-than-ideal terminology that we wouldn't want other users to see, their phrasing is out there for the whole world to see, causing lots and lots of ugly terminology floating around on Google that we can't affect.

I am scared to do a blanket de-indexation of all /search/ results pages because we would lose the majority of our rankings and traffic in the short term while trying to improve the ranks of our optimized static pages. The ideal is to really move up our static pages in Google's index and, when their performance is strong enough, to de-index all of the internal search results pages - but for some reason Google keeps choosing the internal search results page as the "better" page to rank for our targeted keywords. Can anyone advise? Has anyone been in a similar situation? Thanks!
Intermediate & Advanced SEO | FPD_NYC
-
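For the internal-search question above: if header-level control ever becomes available (the CMS may not expose it), one staged approach is to serve `noindex, follow` on the search URLs via an `X-Robots-Tag` response header rather than blocking them in robots.txt, so Google can still crawl them and follow their links while gradually dropping them from the index. A sketch assuming an Apache server config with mod_headers enabled and a `/search/` URL prefix; both are assumptions to adjust to the actual stack:

```apache
# Assumed: internal search results live under /search/ (adjust to the CMS).
# noindex drops the pages from the index; follow keeps their links crawlable.
<LocationMatch "^/search/">
    Header set X-Robots-Tag "noindex, follow"
</LocationMatch>
```

Rolling this out section by section, rather than site-wide at once, would soften the short-term traffic loss the poster is worried about.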
De-indexing search results noindex, follow or noindex, nofollow
If search results were not originally blocked with robots.txt, and need to be de-indexed, is it better to use noindex, nofollow or noindex, follow?
Intermediate & Advanced SEO | nicole.healthline
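For reference, the two variants under discussion are a single robots meta tag in the `<head>` of each search results page (shown here illustratively, not as a specific recommendation for this site):

```html
<!-- Removes the page from the index but lets crawlers keep following its
     links, so internal link signals continue to flow -->
<meta name="robots" content="noindex, follow">

<!-- Removes the page from the index and stops crawlers from following
     any of its links -->
<meta name="robots" content="noindex, nofollow">
```

Either way, the tag only works while the pages remain crawlable: if the URLs are later blocked in robots.txt, crawlers can no longer fetch the page to see the directive.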