Could this URL issue be affecting our rankings?
-
Hi everyone,
I have been building links to a site for a while now and we're struggling to get page 1 results for their desired keywords. We're wondering if a web development / URL structure issue could be what's holding it back.
The way the site's been built means that there's a 'false' 1st level in the URL structure. We're building deep links to the following page:
www.example.com/blue-widgets/blue-widget-overview
However, if you chop off the 2nd level, you're not given a category page; it's a 404:
www.example.com/blue-widgets/ - [Brings up a 404]
I'm assuming the web developer built the site and URL structure this way just for the purposes of getting additional keywords in the URL. What's worse is that there is very little consistency across other products/services. Other pages/URLs include:
www.example.com/green-widgets/widgets-in-green
www.example.com/red-widgets/red-widget-intro-page
www.example.com/yellow-widgets/yellow-widgets
I'm wondering if Google is aware of these 'false' pages* and if so, if we should advise the client to change the URLs and therefore the URL structure of the website.
*This is bearing in mind that these pages haven't been linked to (because they don't exist) and therefore aren't being indexed by Google. I'm just wondering if Google can determine good/bad URL etiquette based on other parts of the URL, i.e. the fact that the middle bit doesn't exist.
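For reference, a quick way to confirm exactly what each of those truncated first levels returns (a hard 404, a soft 404 that answers 200, or a redirect) is a small check along these lines - a minimal sketch assuming Python with the requests library, using the example.com placeholder URLs from above:

```python
# Minimal sketch: strip the last path segment from each deep link and see
# what the 'phantom' first level actually returns. The URLs are the
# example.com placeholders from the question, not the real site.
import requests

DEEP_URLS = [
    "https://www.example.com/blue-widgets/blue-widget-overview",
    "https://www.example.com/green-widgets/widgets-in-green",
    "https://www.example.com/red-widgets/red-widget-intro-page",
    "https://www.example.com/yellow-widgets/yellow-widgets",
]

for url in DEEP_URLS:
    # e.g. /blue-widgets/blue-widget-overview -> /blue-widgets/
    parent = url.rsplit("/", 1)[0] + "/"
    try:
        resp = requests.get(parent, allow_redirects=False, timeout=10)
        print(f"{parent} -> HTTP {resp.status_code}")
    except requests.RequestException as exc:
        print(f"{parent} -> request failed: {exc}")
```

A hard 404 is at least unambiguous; a soft 404 (a 200 with no real content) or an unexpected redirect would be worth knowing about before deciding anything.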
As a matter of fact, my colleague Steve asked this question on a blog post that Dr. Pete had written. Here's a link to Steve's comment - there are 2 replies below, one of which argues that this has no implication whatsoever. However, 5 months on, it's still an issue for us, so it has me wondering...
Many thanks!
-
It's hard to address in blog comments, but these things can be very situational. In a perfect world, I don't like those phantom folder levels for 2 reasons:
(1) Someone will eventually try to link to or access one, including possibly Google, and that may lead to odd behavior. I've seen claims Google will extrapolate URLs, but have never seen clear proof.
(2) It just makes for long URLs that, in this case, look a bit spammy.
Practically, is it making a difference? They aren't being indexed, so that's certainly a positive sign - it indicates no weird extrapolation by Google and no inbound links to those levels. At the same time, as discussed in my post, revamping your entire URL structure does carry risk.
So, it's not ideal (IMO), but I'm not sure I'd mess with it unless you're changing URLs for other reasons (then, do it all at once).
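If you do end up changing the URLs for other reasons, the "all at once" approach is basically one old-to-new map plus a check that every old URL answers with a single 301 hop. A rough sketch (Python with the requests library; the new /widgets/... structure below is purely hypothetical):

```python
# Rough sketch: one old-URL -> new-URL map, verified as clean single 301s
# after the migration. Both the old and new URLs here are hypothetical.
import requests

REDIRECT_MAP = {
    "https://www.example.com/blue-widgets/blue-widget-overview": "https://www.example.com/widgets/blue",
    "https://www.example.com/green-widgets/widgets-in-green": "https://www.example.com/widgets/green",
}

for old_url, new_url in REDIRECT_MAP.items():
    resp = requests.get(old_url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    clean = resp.status_code == 301 and location == new_url
    flag = "OK  " if clean else "FAIL"
    print(f"{flag} {old_url} -> {resp.status_code} {location}")
```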
-
URLs - headache! We have a terrible URL structure because of the ways we have to pull data, so this is something that I have checked into, too. Now, I will say there are lots of differing opinions on this. I will share with you what someone from Google said last week at SMX West: they just want you to know about bad links; they don't penalize you for them.
I'm not saying that's the end-all-be-all answer, but she knows there's a perception that it can 'ding' you, when the reality (according to her) is that they drop 404 pages from their index because they don't serve up bad pages. If you have lots of bad pages, you have less linking ability and fewer pages that can rank, and you can lose online visibility. There's a difference, though, between losing visibility because your overall content offering is reduced by bad links and those pages never having existed in the first place.
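If you want to see whether Google is even requesting those phantom category levels (and eating 404s on them), the server logs will tell you. A rough sketch, assuming Python and a standard combined-format access log at a hypothetical path:

```python
# Rough sketch: count Googlebot requests to first-level 'category' paths
# (e.g. /blue-widgets/) and the status codes they got back. The log path
# and regex assume the common Apache/Nginx combined log format.
import re

LOG_PATH = "/var/log/nginx/access.log"  # hypothetical path - use your own
LINE_RE = re.compile(r'"(?:GET|HEAD) (?P<path>\S+) [^"]*" (?P<status>\d{3})')

hits = {}
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        if "Googlebot" not in line:
            continue
        match = LINE_RE.search(line)
        if not match:
            continue
        path = match.group("path")
        # keep only first-level folders like /blue-widgets/
        if path.endswith("/") and path.count("/") == 2:
            key = (path, match.group("status"))
            hits[key] = hits.get(key, 0) + 1

for (path, status), count in sorted(hits.items()):
    print(f"{count:5d}  {status}  {path}")
```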
There's a good chance there's something else going on - one of the things I adore about this forum is that people here have crazy skills, and I have witnessed them uncover an issue the original poster didn't even know they had.
-
Related Questions
-
Google Only Indexing Canonical Root URL Instead of Specified URL Parameters
We just launched a website about 1 month ago and noticed that Google was indexing, but not displaying, URLs with "?location=" parameters such as: http://www.castlemap.com/local-house-values/?location=great-falls-virginia and http://www.castlemap.com/local-house-values/?location=mclean-virginia. Instead, Google has only been displaying our root URL http://www.castlemap.com/local-house-values/ in its search results -- which we don't want as the URLs with specific locations are more important and each has its own unique list of houses for sale.
We have Yoast setup with all of these ?location values added in our sitemap that has successfully been submitted to Google's Sitemaps: http://www.castlemap.com/buy-location-sitemap.xml
I also tried going into the old Google Search Console and setting the "location" URL Parameter to Crawl Every URL with the Specifies Effect enabled... and I even see the two URLs I mentioned above in Google's list of Parameter Samples... but the pages are still not being added to Google. Even after Requesting Indexing again after making all of these changes a few days ago, these URLs are still displaying as Allowing Indexing, but Not On Google in the Search Console and not showing up on Google when I manually search for the entire URL.
Why are these pages not showing up on Google and how can we get them to display? Only solution I can think of would be to set our main /local-house-values/ page to noindex in order to have Google favor all of our other URL parameter versions... but I'm guessing that's probably not a good solution for multiple reasons.
Intermediate & Advanced SEO | Nitruc0 -
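For the parameter-indexing question above, one thing worth ruling out first is the rel=canonical on those ?location= pages: Yoast will often strip unrecognised query strings from the canonical, and a parameter page that canonicalises to the bare /local-house-values/ URL could explain Google consolidating everything onto the root. A minimal sketch (Python with the requests library) to see what each parameter URL actually declares:

```python
# Minimal sketch: fetch each ?location= URL and report the rel=canonical it
# declares, to see whether the parameter pages point Google back at the bare
# /local-house-values/ URL. Naive regex; fine for typical Yoast output where
# rel comes before href.
import re
import requests

PARAM_URLS = [
    "http://www.castlemap.com/local-house-values/?location=great-falls-virginia",
    "http://www.castlemap.com/local-house-values/?location=mclean-virginia",
]

CANONICAL_RE = re.compile(
    r'<link[^>]*rel=["\']canonical["\'][^>]*href=["\']([^"\']+)["\']',
    re.IGNORECASE,
)

for url in PARAM_URLS:
    resp = requests.get(url, timeout=10)
    match = CANONICAL_RE.search(resp.text)
    canonical = match.group(1) if match else "(no canonical tag found)"
    print(f"{url}\n  canonical -> {canonical}\n")
```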
If I block a URL via the robots.txt - how long will it take for Google to stop indexing that URL?
Intermediate & Advanced SEO | Gabriele_Layoutweb0 -
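For the robots.txt question above: how fast the URL drops out varies, but it's worth double-checking that the rule actually matches the URL for Googlebot (and remembering that Disallow stops crawling, not indexing, so an already-indexed URL can linger for a while). A minimal sketch using Python's standard library; the domain and path are placeholders:

```python
# Minimal sketch: confirm the robots.txt rule really blocks the URL you care
# about, for Googlebot and for all agents. Domain and path are placeholders.
from urllib.robotparser import RobotFileParser

robots = RobotFileParser("https://www.example.com/robots.txt")
robots.read()

blocked_url = "https://www.example.com/some-private-page/"
print("Googlebot allowed:", robots.can_fetch("Googlebot", blocked_url))
print("All agents allowed:", robots.can_fetch("*", blocked_url))
```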
Google ranking 301 redirected vanity urls
We use vanity URLs for offline marketing. An example vanity URL would be www.clientsite.com/promotion; this URL 301 redirects to a page on the site with a tracking parameter, e.g. www.clientsite.com/mainpage?utm_source=source&utm_medium=print&utm_campaign=xyz. We are running into issues with Google ignoring the 301 redirect and ranking these vanity URLs instead of the actual page on the website. Any suggestions on how to resolve this?
Intermediate & Advanced SEO | digitalhound0 -
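For the vanity-URL question above, one thing worth ruling out is a 302 or a redirect chain somewhere in the hop, since anything other than a clean single 301 gives Google less reason to consolidate on the destination URL. A minimal sketch (Python with the requests library; the URLs are the hypothetical ones from the question, with the scheme assumed):

```python
# Minimal sketch: follow each vanity URL and print the redirect chain, then
# flag whether it is a clean single 301 to the expected tagged destination.
# URLs are the hypothetical examples from the question (https assumed).
import requests

VANITY_MAP = {
    "https://www.clientsite.com/promotion":
        "https://www.clientsite.com/mainpage?utm_source=source&utm_medium=print&utm_campaign=xyz",
}

for vanity_url, expected in VANITY_MAP.items():
    resp = requests.get(vanity_url, timeout=10)  # follows redirects by default
    hops = [(r.status_code, r.headers.get("Location", "")) for r in resp.history]
    print(vanity_url)
    for status, location in hops:
        print(f"  {status} -> {location}")
    clean = len(hops) == 1 and hops[0][0] == 301 and hops[0][1] == expected
    print("  clean single 301 to expected target:", clean)
```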
Tagged URL ranking organically
I've noticed that one of our GA-tagged URLs is ranking organically and is therefore skewing the referral data. The campaign that we were tracking is no longer active, but the link still works and goes to an old landing page. I asked our developers if we could redirect it, but they said that it didn't work. Does anyone have some advice or a solution for this? Thanks!
Intermediate & Advanced SEO | Elihn0 -
HTTPS Certificate Expired. Website with https urls now still in index issue.
Hi Guys
This week the security certificate of our website expired and basically we now have to wait until next Tuesday for it to be reinstated. So obviously our website is now indexed with the https URLs, and we had to drop the https from our site so that people will not be faced with a security risk screen, which most browsers give you to ask if you are sure you want to visit the site, because it's seeing it as an untrusted one. So now we are basically sitting with the site URLs only being www...
My question is: what should we do in order to prevent Google from penalizing us, since obviously if Googlebot comes to crawl these https URLs, there will be nothing there? I did, however, re-submit the site to Google to crawl, but I guess it's going to take time before Google picks up that we now only want the www URLs in the index.
Can somebody please give me some advice on this.
Thanks
Dave
Intermediate & Advanced SEO | daveza0 -
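For the expired-certificate question above, aside from waiting for the re-issue, it's worth putting a simple expiry check in place so the certificate doesn't lapse again unnoticed. A small sketch using Python's standard library (the hostname is a placeholder, and the default context requires the certificate to be currently valid):

```python
# Small sketch: report when a site's TLS certificate expires, so renewal can
# be scheduled ahead of time. Hostname is a placeholder; this uses default
# verification, so it assumes the certificate is currently valid.
import socket
import ssl
from datetime import datetime, timezone

HOSTNAME = "www.example.com"  # placeholder

context = ssl.create_default_context()
with socket.create_connection((HOSTNAME, 443), timeout=10) as sock:
    with context.wrap_socket(sock, server_hostname=HOSTNAME) as tls:
        cert = tls.getpeercert()

expires = datetime.strptime(cert["notAfter"], "%b %d %H:%M:%S %Y %Z")
expires = expires.replace(tzinfo=timezone.utc)
days_left = (expires - datetime.now(timezone.utc)).days
print(f"{HOSTNAME} certificate expires {expires:%Y-%m-%d} ({days_left} days left)")
```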
Rankings Dropped
Hello, I wonder if anyone can point us in the right direction. Our main domain name has a good SEO profile: the domain is 12 years old and we have some good reputable links. On the 7th of Jan our website dropped in the rankings from 3rd to 10th for one of our main keywords, and since then some of our other keywords have dropped too. We have a lot of landing pages that target specific keywords which look like a template page but just with the content changed. Can anyone pinpoint what could have caused this problem, or has anyone experienced this before and knows how to fix it? I personally think we have been hit by Panda. Thanks, Scott
Intermediate & Advanced SEO | ScottBaxterWW0 -
Will swear words present on my pages affect my rankings?
Hi There, I am in the process of formulating a listing policy for my site and I'm not sure whether I should add something in there for swear words. My site is an adult site and swear words come with the territory, unfortunately. Will user-generated content with swear words affect my ranking? Thank you
Intermediate & Advanced SEO | Mulith0 -
Google Places - How do we rank
So, Google Places showing up in search results is a great feature... but how can we get our results to the top? I mean, I can see some terrible websites appearing at the top of Google Places with their Places page having no activity whatsoever. Is there a trick to this at all? What can we do to increase our ranking on Google Places? Our old GOOD rankings are now appearing BELOW the map results. Cheers
Intermediate & Advanced SEO | kayweb0