Set base-href to subfolders - problems?
-
A customer is using the <base>-tag in an odd way:
<base href="http://domain.com/1.0.0/1/1/">
My own theory is that the subfolders are added as the root because of revision control.
CSS, images and internal links are all referenced with relative URLs that resolve against this base (a hypothetical illustration is sketched below).
I ran a test with Xenu Link Sleuth and found many broken links on the site, but I can't say whether they are caused by the base tag.
I have read that the base tag may cause problems in some browsers, but is this use of the base tag bad from an SEO perspective? I have a lot of problems with this customer's site and I want to know whether the base tag is part of them.
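To show what the base tag does here, below is a minimal, hypothetical sketch (made-up paths, not the customer's actual markup) of how relative URLs stop resolving against the page's own location once a base like this is declared:

<!-- Hypothetical page served at http://domain.com/about/ -->
<base href="http://domain.com/1.0.0/1/1/">
<!-- Resolves to http://domain.com/1.0.0/1/1/css/site.css, not http://domain.com/about/css/site.css -->
<link rel="stylesheet" href="css/site.css">
<!-- Resolves to http://domain.com/1.0.0/1/1/contact/ -->
<a href="contact/">Contact</a>
<!-- A root-relative URL takes only the scheme and host from the base, so this still resolves to http://domain.com/images/logo.png -->
<img src="/images/logo.png" alt="Logo">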
-
Hi Highland!
I know that relative URLs are anything but good, especially when you also use URL rewriting.
The only question is how Google will react to this.
Thanks for your answer!
-
Hi Cyrus and thanks for your answer!
The client is using the base tag on all pages on the site, but with different URLs. For example:
Root page: <base href="http://domain.com/1.0.1.0/2/1/">
Subpage: <base href="http://domain.com/1.0.1.0/5/1/"> or <base href="http://domain.com/1.0.1.0/13/1/">
Product page: <base href="http://domain.com/1.0.1.0/14/1/">
As you can see, they are using a lot of different base locations, and unfortunately we are unable to change the base URLs to test.
We have problems with both broken links and rankings. Whenever a new version of the system is released, all of the base URLs change, which can leave old links pointing at locations that no longer exist and therefore broken.
What do you think, Cyrus, can this hurt us from an SEO perspective? It must be confusing for Google to see all these strange base URLs.
I think the best option would be to rebuild the structure and remove the base tag!
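To make the broken-link problem concrete, here is a hedged sketch (the 1.0.2.0 version number is invented for illustration) of how identical markup resolves to a different URL after a version bump:

<!-- Before the release, the base points at version 1.0.1.0 -->
<base href="http://domain.com/1.0.1.0/5/1/">
<!-- "datasheet.pdf" resolves to http://domain.com/1.0.1.0/5/1/datasheet.pdf -->
<a href="datasheet.pdf">Datasheet</a>

<!-- After the release, the base points at a new version -->
<base href="http://domain.com/1.0.2.0/5/1/">
<!-- The same markup now resolves to http://domain.com/1.0.2.0/5/1/datasheet.pdf; any external link or bookmark still pointing at the old 1.0.1.0 URL breaks unless it is redirected -->
<a href="datasheet.pdf">Datasheet</a>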
-
Most of the time you don't need to specify a base URL; the browser already knows the page's location. In some situations defining a base is helpful, such as on mirrored sites where the URL a visitor uses isn't the same URL needed to resolve files.
Is your client using a universal base tag that is the same across the entire site? I can't tell from the question, but that is a common setup that can cause problems.
There's nothing inherently wrong with using a base tag. Most of the time, if you use it, you simply want to set it to the URL of the current page. That said, to avoid complications, the only time you really want to use the base tag is when relative URLs wouldn't work without it.
You might want to test how the links on your site resolve and see if removing or modifying the base tag helps clear up your broken links.
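As a minimal sketch of the "set it to the URL of the current page" pattern (hypothetical URL, assuming the page lives at http://domain.com/products/widget/), relative links then behave exactly as they would without the tag:

<!-- Hypothetical page served at http://domain.com/products/widget/ -->
<base href="http://domain.com/products/widget/">
<!-- Resolves to http://domain.com/css/site.css, the same place it would resolve to with no base tag at all -->
<link rel="stylesheet" href="../../css/site.css">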
-
Those are some sloppy URLs. I especially advise people to avoid the problems of relative paths in ANY URL. And, yes, <base> probably isn't helping.
Links starting with / are fine. That's the root of your site. Anything using "../" should be nixed in favor of a fixed path. And never, ever use "./".
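As a hedged illustration of that advice (made-up paths, assuming a page at http://domain.com/shop/shirts/ with no conflicting base tag), here is the same stylesheet referenced three ways:

<!-- Root-relative: resolves to http://domain.com/css/site.css no matter where the page lives -->
<link rel="stylesheet" href="/css/site.css">
<!-- Parent-relative: resolves to http://domain.com/shop/css/site.css here, but to something else entirely if the page moves or a base tag is added -->
<link rel="stylesheet" href="../css/site.css">
<!-- The "./" form is just as fragile as a bare relative path and adds nothing; avoid it -->
<link rel="stylesheet" href="./css/site.css">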