[Very Urgent] More than 100 "/search/adult-site-keywords" Crawl Errors under Search Console
-
I just opened Google Search Console and was shocked to see more than 150 Not Found errors under Crawl Errors. Mine is a WordPress site, and it's consistently updated too.
Here's how they show up:
Example 1:
- URL: www.example.com/search/adult-site-keyword/page2.html/feed/rss2
- Linked From: http://an-adult-image-hosting.com/search/adult-site-keyword/page2.html
Example 2 (this surprised me the most when I looked at the linked from data):
- URL: www.example.com/search/adult-site-keyword-2.html/page/3/
- Linked From:
  - www.example.com/search/adult-site-keyword-2.html/page/2/ (this is showing as if it's from our own site)
  - http://a-spammy-adult-site.com/search/adult-site-keyword-2.html
Example 3:
- URL: www.example.com/search/adult-site-keyword-3.html
- Linked From: http://an-adult-image-hosting.com/search/adult-site-keyword-3.html
How do I address this issue?
-
Here is what I would do:
- Disavow the domain(s) that are linking to you from the adult site(s).
- The fact that Google Search Console shows one of your own internal pages linking this way makes me want to know: a) have you always owned this domain, and could a previous owner have created internal links like this, or b) have you been, or are you currently, hacked?
In the case of b), this can be really tricky. I once had a site whose crawl report showed sitewide links to various external sites we should not have been linking to. When I looked at the internal pages in my browser, I couldn't see any such link, even though it showed up in the crawler report.
Here was the trick: the hacker had set up a script to show the link only when a bot was viewing the page. On top of that, we were running mirrored servers and they had hacked only one of them, so the links only showed up when you spidered that specific mirrored instance as a bot.
So thanks to the hack, not only were we showing bad links to bad sites, we were doing it through cloaking. Two strikes against us. Luckily we caught it quickly and fixed it immediately.
Use a spidering program or browser tool that lets you send a Googlebot user agent, and go visit the pages that are supposedly linking internally. You might be surprised.
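If it helps, here is a minimal sketch of that check, assuming Python with the requests library and a placeholder URL standing in for one of your own pages: it fetches the same page with a Googlebot user agent and with a normal browser user agent, then compares the two responses. A difference is a strong hint of user-agent cloaking.

```python
import requests

# Hypothetical page to test -- replace with one of the internal "Linked From" URLs on your own site
URL = "https://www.example.com/some-internal-page/"

BROWSER_UA = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 "
              "(KHTML, like Gecko) Chrome/120.0 Safari/537.36")
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def fetch(user_agent: str) -> str:
    # Fetch the page while presenting the given user agent
    response = requests.get(URL, headers={"User-Agent": user_agent}, timeout=30)
    return response.text

browser_html = fetch(BROWSER_UA)
googlebot_html = fetch(GOOGLEBOT_UA)

if browser_html != googlebot_html:
    print("Googlebot and a browser received different HTML -- diff them and look for injected links.")
else:
    print("Both user agents received identical HTML.")
```

Keep in mind that more sophisticated hacks verify Googlebot by IP or reverse DNS as well as by user agent, so an identical response here doesn't guarantee the site is clean.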
Summary
Googlebot has a very long memory. This may be an old issue that was fixed long ago. If that is the case, just serve 404s for the pages that do not exist, disavow the bad domain, and move on. But make sure you have not been hacked, as that would also explain what you are seeing.
Regardless, the fact that Google found these URLs at some point means you need to make sure you resolve it. Pull all the URLs into a spreadsheet and run Screaming Frog in list mode against them to confirm every one of them is handled.
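If you don't have Screaming Frog handy, a rough sketch of the same list-mode check, assuming Python with the requests library and a hypothetical plain-text export of the error URLs from Search Console, could look like this:

```python
import requests

# Hypothetical export of the "Not Found" URLs from Search Console, one URL per line
with open("crawl_error_urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    try:
        # Don't follow redirects, so we see exactly what a crawler would get first
        response = requests.get(url, timeout=30, allow_redirects=False)
        status = str(response.status_code)
    except requests.RequestException as exc:
        status = f"error: {exc}"
    print(f"{status}\t{url}")
```

Any URL in that list returning 200 instead of 404 (or 410) means your site is actually resolving the spam URL, which is worth fixing before worrying about the external links pointing at it.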
-
Yep. Still looking to see if anyone can help with this.
-
Oh yeah, I missed that. That's very strange; not sure how to explain that one!
-
Thanks for the response, Logan. What you're saying definitely makes sense, but it makes me wonder why I see something like Example 2 under Crawl Errors. Why does Google Search Console show two "Linked From" URLs: one from the spammy site and one from my own website? How is that even possible?
-
I've seen similar situations, but never in bulk and not with adult sites. Basically, what's happening is that one or more domains are linking to your site with URLs that don't exist. When bots crawling those sites follow the links pointing to yours, they hit a 404 page, which triggers the error in Search Console.
Unfortunately, there's not too much you can do about this, as people (or automated spam programs) can create a link to any site at any time. You could disavow links from those sites, which might help from an SEO perspective, but it won't prevent the errors from showing up in your Crawl Error report.
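For reference, a disavow file is just a plain-text list submitted through Google's Disavow Links tool. A minimal sketch, using the hostnames from the examples above as stand-ins for the actual spammy domains, looks like this:

```text
# Spammy domains generating fake inbound links (example entries)
domain:an-adult-image-hosting.com
domain:a-spammy-adult-site.com

# Individual URLs can also be listed, but domain-level entries are usually safer for sitewide spam
http://a-spammy-adult-site.com/search/adult-site-keyword-2.html
```

As noted above, this may help on the link-quality side, but it won't make the crawl errors themselves disappear.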