GSC is reporting a lot of chopped URLs
-
In the last two weeks I've started seeing a lot of odd 404 errors in GSC for my site. Upon investigation, the URLs are for fairly new articles, and they are chopped in various places: anywhere from one missing character at the end to about ten missing characters. (A similar older issue is that GSC reports duplicate content on weird subdomains we've never used, like 'smtp', 'ww1', or even random ones like 'bobo'.)
GSC doesn't report any 'linked from' source for those odd URLs, and I know for sure these links aren't on the site itself. They're definitely not errors in the CMS.
The site is long established (it started in 1997-1998) and we've been subject to a lot of negative SEO. I recently had to disavow about 1,000 .ru domains linking to us, with some domains containing over a million links each.
Could these chopped links be a new tactic of negative SEO? How do I find these seemingly intentionally broken links to us?
-
Thanks for the question. It isn't uncommon to see strange 404 errors in Search Console with little or misleading information attached. Google is working hard to improve this, but I wouldn't take everything you see there as set in stone.
This doesn't sound like a negative SEO tactic. I would just mark them all as fixed and see if they appear again in about a week. If they do, I'd make sure they are actually served with a 4xx status and not worry too much about it. If you want to do more digging...
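Checking the served status in bulk is easy to script. Here is a minimal sketch using only the Python standard library; the URL in the usage comment is a placeholder, not one of the actual reported URLs:

```python
import urllib.request
import urllib.error

def fetch_status(url: str) -> int:
    """Return the HTTP status code a URL is actually served with."""
    req = urllib.request.Request(url, method="HEAD",
                                 headers={"User-Agent": "status-check/1.0"})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        # 4xx/5xx responses raise, but the code is still available.
        return e.code

def is_properly_broken(status: int) -> bool:
    """True if a chopped URL returns a 4xx, which is the behavior you want."""
    return 400 <= status < 500

# Usage (uncomment and paste in the URLs GSC reported):
# for url in ["https://www.example.com/my-article-chopp"]:
#     print(url, fetch_status(url), is_properly_broken(fetch_status(url)))
```

If every chopped URL comes back as a 404 (or 410), Google will drop them on its own and there is nothing more to do.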
Some ideas of where you could look further:
- Logs, logs, logs. This will be the ultimate source of truth: you will be able to see whether or not Googlebot is actually hitting those URLs.
- It could be something weird happening with one of your plugins that generates those URLs (particularly on WordPress).
- Perhaps you have a filtering system set up that generates these URLs?
- If you have a search function on the site, sometimes weird URLs can be generated through that.
- Do the URLs come up when you crawl the site at all?
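On the logs point: if your server writes the common "combined" access-log format (an assumption; adjust the regex and the log path for your setup), a short script can pull out every Googlebot request that was answered with a 404, along with its referer:

```python
import re
from collections import Counter

# Matches the Apache/nginx "combined" log format.
LINE_RE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)

def googlebot_404s(lines):
    """Yield (path, referer) for Googlebot requests answered with a 404."""
    for line in lines:
        m = LINE_RE.match(line)
        if m and m["status"] == "404" and "Googlebot" in m["agent"]:
            yield m["path"], m["referer"]

# Usage (adjust the path for your server):
# with open("/var/log/nginx/access.log") as f:
#     hits = Counter(path for path, _ in googlebot_404s(f))
# for path, count in hits.most_common(20):
#     print(count, path)
```

If the chopped URLs show up here with a referer, that referer tells you where the broken links live; if the referer is "-", Google most likely found them somewhere it won't disclose, which again points away from anything on your own site.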
Just a few ideas!