Googlebot and other spiders are crawling odd links on our website. Trying to understand why, and what to do about it.
-
I recently began work on an existing WordPress website that was revamped about 3 months ago: https://thedoctorwithin.com. I'm a bit new to WordPress, so I thought I should reach out to some of the experts in the community.

Checking ‘Not found’ crawl errors in Google Search Console, I notice many irrelevant links that are not present in the website, nor in the database, as near as I can tell. When I check the source of these irrelevant links, they’re all reported as coming from various pages in the site, as well as from non-existent pages that allegedly belong to the site, even though those pages have never existed.
For instance:
- https://thedoctorwithin.com/category/seminars/newsletters/page/7/newsletters/page/3/feedback-and-testimonials/ allegedly linked from:
- https://thedoctorwithin.com/category/seminars/newsletters/page/7/newsletters/page/3/ (doesn’t exist)
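One common cause of crawl errors like these (an assumption on my part, since I can't see the site's templates) is a relative link that's missing its leading slash: the crawler resolves it against the current "directory" and the path grows one level deeper on every crawl. A quick sketch with Python's `urllib` reproduces the exact URL from the example above:

```python
from urllib.parse import urljoin

# The page the crawler is currently on (trailing slash: the "directory" is /page/7/)
base = "https://thedoctorwithin.com/category/seminars/newsletters/page/7/"

# A relative href without a leading slash resolves against that directory...
phantom = urljoin(base, "newsletters/page/3/")
print(phantom)
# ...yielding the phantom URL reported in Search Console
```

If that's what is happening, a template somewhere is emitting `href="newsletters/page/3/"` instead of `href="/newsletters/page/3/"`, and each crawl step compounds the nesting.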
In other cases, these goofy URLs are even linked from the sitemap. BTW - all the URLs in the sitemap are valid URLs.
Currently, the site has a flat structure. Nearly all the content is merely URL/content/ without further breakdown (or subdirectories). Previous site versions had a more varied page organization, but what I'm seeing doesn't seem to reflect the current page organization, nor the previous page organization.
Had a similar issue, due to use of Divi's search feature. Ended up with some pretty deep non-existent links branching off of /search/, such as:
- https://thedoctorwithin.com/search/newsletters/page/2/feedback-and-testimonials/feedback-and-testimonials/online-continuing-education/consultations/ allegedly linked from:
- https://thedoctorwithin.com/search/newsletters/page/2/feedback-and-testimonials/feedback-and-testimonials/online-continuing-education/ (doesn't exist).
I blocked the /search/ branches via robots.txt. No real loss, since neither /search/ nor any of its subdirectories are valid.
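For reference, the block was along these lines (a sketch; the exact file on the live site may differ):

```
User-agent: *
Disallow: /search/
```

Note that robots.txt only stops compliant crawlers from fetching those paths; it doesn't remove already-discovered URLs from an index.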
There are numerous pre-existing categories and tags on the site, but they aren't used as pages. I suspect Google (and other engines) might be creating arbitrary paths from these. Looking through the site’s 404 errors, I’m seeing the same behavior from Bing, Moz, and other spiders as well.
I suppose I could use Search Console to remove URL/category/ and URL/tag/, and do the same for other legitimate spiders / search engines. Perhaps it would be better to use mod_rewrite to lead spiders to pages that actually do exist.
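If mod_rewrite turns out to be the way to go, a hypothetical .htaccess sketch might collapse the runaway nesting back to the first valid segment. The pattern below is an assumption built from the example URLs earlier in this post, not something tested against the live site:

```
RewriteEngine On
# Collapse anything appended after a /page/N/ segment,
# e.g. /category/seminars/newsletters/page/7/newsletters/...
#   -> /category/seminars/newsletters/page/7/
RewriteCond %{REQUEST_URI} ^(/category/.+?/page/\d+/).+
RewriteRule .* %1 [R=301,L]
```

The `%1` backreference picks up the group captured in the preceding RewriteCond, so the 301 points spiders at the real paginated page instead of a 404.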
- Looking forward to suggestions about the best way to deal with these errant requests.
- Also curious to learn why these are occurring.
Thank you.
-
Thanks, Kevin.
Glad I'm not the only one.
Disabling tags and categories isn't an option, in my case. Guess I need to look at more of the potential upside. It seems tags and categories, if handled correctly, could provide a new way to engage visitors and search engines.
I've heard people refer to 'spidering budgets', or whatnot. Guess that's an entirely new topic of discussion... whether limiting the spurious spider crawling (from good spiders) means that said spiders will spend more time on the conventional pathways of a site.
-
Thanks, Vjay.
Did a lot of work fixing links in the database.
The issue was occurring even before implementation of WP super cache, and before the link fixing.
Being new-ish to WP, it seems strange that it's so willing to provide access via directories that don't really exist: categories, tags, even search (if using a theme-provided site search).
I'm getting better at .htaccess, so I'm able to handle a lot of the old incoming links fairly well. In the case of these weird 'in the mind of the spiders' links, I'll try to address those as well.
Thanks for your advice about 404 and 301 plugins. Time to look around and see what other useful tools are out there.
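For anyone curious, the .htaccess handling of old incoming links can be as simple as plain Redirect directives. These paths are invented purely for illustration, not the site's actual rules:

```
# Hypothetical examples: map retired URLs to their current equivalents
Redirect 301 /old-newsletter-archive.html /newsletters/
RedirectMatch 301 ^/search/.* /
```

Redirect handles exact prefixes; RedirectMatch takes a regex when a whole branch needs to go somewhere sensible.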
-
I have the same issue; I've stopped using tags because of all the irrelevant links they cause. Looking forward to reading the comments on this thread.
KJr
-
Hi There,
Your website is built on WordPress, and it looks like there might be spurious entries in the DB, which might also not be getting deleted due to the WP Super Cache plugin. You may try emptying your cache and installing an 'all 404 redirect' plugin along with a 301-management plugin.
I hope this helps.
Regards,
Vijay