Set base-href to subfolders - problems?
-
A customer is using the <base> tag in an odd way:
<base href="http://domain.com/1.0.0/1/1/">
My theory is that the subfolders are set as the root because of the client's revision control system.
CSS, images and internal links are all referenced with relative paths that resolve against this base.
I ran a test with Xenu Link Sleuth and found many broken links on the site, but I can't say whether they are caused by the base tag.
I have read that the base tag can cause problems in some browsers, but is this usage also bad from an SEO perspective? I have a lot of problems with this customer's site and I want to know whether the base tag is part of them.
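To make the behaviour concrete, here is how a browser resolves references against that base. This is just a sketch using Python's standard urljoin; the file names are illustrative, not taken from the client's actual markup:

    from urllib.parse import urljoin

    # The client's base URL, taken from the <base> tag above.
    base = "http://domain.com/1.0.0/1/1/"

    # A browser resolves every relative reference on the page against it.
    # The paths below are illustrative examples, not real files on the site.
    print(urljoin(base, "css/style.css"))  # http://domain.com/1.0.0/1/1/css/style.css
    print(urljoin(base, "../page.html"))   # http://domain.com/1.0.0/1/page.html
    print(urljoin(base, "/contact/"))      # http://domain.com/contact/

So every relative link on the page quietly inherits that versioned subfolder.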
-
Hi Highland!
I know that relative URLs are anything but good, especially when you also use URL rewriting.
The only question is: how will Google react to this?
Thanks for your answer!
-
Hi Cyrus and thanks for your answer!
The client is using the base tag on every page of the site, but with different URLs. For example:

Root page: <base href="http://domain.com/1.0.1.0/2/1/">
Subpage: <base href="http://domain.com/1.0.1.0/5/1/"> or <base href="http://domain.com/1.0.1.0/13/1/">
Product page: <base href="http://domain.com/1.0.1.0/14/1/">

As you can see, they are using a lot of different base locations, and unfortunately we are unable to change the base URL to test.
We have problems with both broken links and rankings. Whenever a new version of the system is released, all of the base URLs change, which means links that still point at the old version break.
What do you think, Cyrus: can this hurt us from an SEO perspective? All of these changing base URLs must be confusing for Google.
I think the best option would be to rebuild the structure and remove the base tag!
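A quick sketch of why each release breaks the old URLs (the 1.0.2.0 path is a made-up next version, and the image path is illustrative):

    from urllib.parse import urljoin

    link = "images/logo.png"  # illustrative relative reference

    old_base = "http://domain.com/1.0.1.0/5/1/"
    new_base = "http://domain.com/1.0.2.0/5/1/"  # hypothetical next release

    print(urljoin(old_base, link))  # http://domain.com/1.0.1.0/5/1/images/logo.png
    print(urljoin(new_base, link))  # http://domain.com/1.0.2.0/5/1/images/logo.png

    # Every URL from the old release that anyone has linked to or
    # bookmarked now points at a version that no longer exists.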
-
Most of the time you don't need to specify a base URL; the browser already knows the page's location. In some situations defining a base is helpful, such as on mirrored sites where the URL a visitor uses is not the URL needed to resolve files.
Is your client using a universal base tag that is the same across the entire site? I can't tell from the question, but that is a common setup that can cause problems.
There's nothing inherently wrong with using a base tag. Most of the time, if you use it, you simply want to set it to the URL of the current page. That said, to avoid complications, the only time you really want to use the base tag is when relative URLs wouldn't work without it.
You might want to test how the links on your site resolve and see if removing or modifying the base tag helps clear up your broken links.
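One way to run that test is to compare how every link on a page resolves with and without the base tag. Here is a minimal sketch using only Python's standard library; the page URL is hypothetical and a real audit would crawl the whole site:

    from html.parser import HTMLParser
    from urllib.parse import urljoin
    from urllib.request import urlopen

    class BaseTagAudit(HTMLParser):
        """Collects the page's <base href> and every <a href>."""
        def __init__(self):
            super().__init__()
            self.base = None
            self.hrefs = []

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "base" and "href" in attrs:
                self.base = attrs["href"]
            elif tag == "a" and "href" in attrs:
                self.hrefs.append(attrs["href"])

    page_url = "http://domain.com/1.0.1.0/2/1/"  # hypothetical page to audit
    audit = BaseTagAudit()
    audit.feed(urlopen(page_url).read().decode("utf-8", errors="replace"))

    # Flag links whose target changes because of the base tag; these are
    # the ones that will move whenever the base URL changes.
    for href in audit.hrefs:
        with_base = urljoin(audit.base or page_url, href)
        without_base = urljoin(page_url, href)
        if with_base != without_base:
            print(f"{href!r}: {without_base} -> {with_base}")

Anything the script flags will move with the next version bump, which would line up with the broken links Xenu is finding.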
-
Those are some sloppy URLs. I advise people to avoid relative paths in ANY URL, and yes, <base> probably isn't helping.
Links starting with / are fine; that's the root of your site. Anything using "../" should be nixed in favor of a fixed path. And never, ever use "./".
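Here's a quick sketch of how the three styles resolve against one of the base URLs from this thread (the /products/ path is made up):

    from urllib.parse import urljoin

    base = "http://domain.com/1.0.1.0/5/1/"  # one of the client's base URLs

    print(urljoin(base, "/products/"))    # http://domain.com/products/  (root-relative: stable)
    print(urljoin(base, "../products/"))  # http://domain.com/1.0.1.0/5/products/  (moves with the base)
    print(urljoin(base, "./products/"))   # http://domain.com/1.0.1.0/5/1/products/  (moves with the base)

Only the root-relative form stays put when the base changes, which is why it's the safe choice here.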