Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Set base-href to subfolders - problems?
-
A customer is using the <base>-tag in an odd way:
<base href="http://domain.com/1.0.0/1/1/">
My own theory is that the subfolders are added as the root because of revision control.
CSS, images and internal links are used like this:
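Roughly along these lines, with hypothetical file names; each relative reference resolves against the base rather than against the page's own URL:
<link rel="stylesheet" href="css/style.css">  <!-- resolves to http://domain.com/1.0.0/1/1/css/style.css -->
<img src="images/logo.png" alt="logo">  <!-- resolves to http://domain.com/1.0.0/1/1/images/logo.png -->
<a href="products/widget-a/">Widget A</a>  <!-- resolves to http://domain.com/1.0.0/1/1/products/widget-a/ -->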
I ran a test with Xenu Link Sleuth and found many broken links on the site, but I can't say whether they are due to the base tag.
I have read that the base tag may cause problems in some browsers, but is this use of the base tag bad from an SEO perspective? I have a lot of problems with this customer and I want to know whether the base tag is part of them.
-
Hi Highland!
I know that relative URLs are anything but good, especially when you also use URL rewriting.
The only question is: how will Google react to this?
Thanks for your answer!
-
Hi Cyrus and thanks for your answer!
The client is using the base tag on all pages on the site, but with different URLs. For example:
Root page: <base href="http://domain.com/1.0.1.0/2/1/">
Subpage:
<base href="http://domain.com/1.0.1.0/5/1/"> OR
<base href="http://domain.com/1.0.1.0/13/1/">
Product page:
<base href="http://domain.com/1.0.1.0/14/1/">
As you can see, they are using a lot of different base locations, and unfortunately we are unable to change the base URL and test.
We have problems with both broken links and rankings. Whenever a new version of the system is created, all the base URLs change, which can mean that old links still exist but are now broken.
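A quick sketch of how that breaks things (hypothetical relative path and folder numbers):
<!-- Old release -->
<base href="http://domain.com/1.0.0/14/1/">
<a href="images/product.jpg">Photo</a>  <!-- resolves to http://domain.com/1.0.0/14/1/images/product.jpg -->
<!-- New release: only the base changes -->
<base href="http://domain.com/1.0.1.0/14/1/">
<a href="images/product.jpg">Photo</a>  <!-- now resolves to http://domain.com/1.0.1.0/14/1/images/product.jpg -->
<!-- Anything still pointing at the old 1.0.0 path (old inbound links, indexed URLs) returns a 404 once that folder is gone -->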
What do you think, Cyrus: can this hurt us from an SEO perspective? It must be confusing for Google with all these strange base URLs?
I think the best option would be to rebuild the structure and remove the base tag!
-
Most of the time you don't need to specify a base URL; the browser already knows this location. In some situations defining a base is helpful, such as on mirrored sites, where the URL being viewed is not the same URL needed to resolve files.
Is your client using a universal base tag that is the same across the entire site? I can't tell from the question, but this is a common setup that could potentially cause problems.
There's nothing inherently wrong with using a base tag. Most of the time, if you use it, you simply want to set it to the URL of the current page. That said, to avoid complications, the only time you really want to use the base tag is when relative URLs wouldn't work without it.
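If you do use it, the safe pattern is to point it at the page's own URL. For a page living at a hypothetical http://domain.com/products/widgets/, that would look like:
<head>
  <base href="http://domain.com/products/widgets/">
  <!-- relative references now resolve exactly as they would with no base tag at all -->
  <link rel="stylesheet" href="css/style.css">  <!-- http://domain.com/products/widgets/css/style.css -->
</head>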
You might want to test how the links on your site resolve and see if removing or modifying the base tag helps clear up your broken links.
-
Those are some sloppy URLs. I always advise people to avoid the problems that relative paths cause in ANY URL. And, yes, <base> probably isn't helping.
Links starting with / are fine; that's the root of your site. Anything using "../" should be nixed and replaced with a fixed path. And never, ever use "./".
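A rough illustration, using a hypothetical /about/ page:
<a href="/about/">About</a>  <!-- root-relative: resolves from the site root, safe -->
<a href="http://domain.com/about/">About</a>  <!-- absolute: unambiguous -->
<a href="../about/">About</a>  <!-- avoid: depends entirely on the current (or base) URL -->
<a href="./about/">About</a>  <!-- avoid: same problem -->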
Related Questions
-
Should we set up redirects for all deleted TAGS?
We recently found our site had 65,000 tags (yes, 65K). In an effort to consolidate these we've started deleting them. Moz is now reporting a heap of 404 errors for tag pages. These tag pages should not have links to them, so I'm not sure how they're being crawled. Any suggestions from experience in this area would be useful.
Technical SEO | wearehappymedia
-
Sitemap use for very large forum-based community site
I work on a very large site with two main types of content, static landing pages for products, and a forum & blogs (user created) under each product. Site has maybe 500k - 1 million pages. We do not have a sitemap at this time.
Currently our SEO discoverability in general is good, Google is indexing new forum threads within 1-5 days roughly. Some of the "static" landing pages for our smaller, less visited products however do not have great SEO.
Question is, could our SEO be improved by creating a sitemap, and if so, how could it be implemented? I see a few ways to go about it:
1. Sitemap includes "static" product category landing pages only - i.e., the product home pages, the forum landing pages, and blog list pages. This would probably end up being 100-200 URLs.
2. Sitemap contains the above but is also dynamically updated with new threads & blog posts.
Option 2 seems like it would mean the sitemap is unmanageably long (hundreds of thousands of forum URLs). Would a crawler even parse something that size? Or with Option 1, could it cause our organically ranked pages to change ranking due to Google re-prioritizing the pages within the sitemap?
Not a lot of information out there on this topic, appreciate any input. Thanks in advance.
Technical SEO | CommManager
-
Problems with WooCommerce Product Attribute Filter URL's
I am running a WordPress/WooCommerce site for a client, and Moz is picking up some issues with URLs generated from WooCommerce product attribute filters. For example: ..co.uk/womens-prescription-glasses/?filter_gender=mens&filter_style=full-rim&filter_shape=oval
How do I get Google to ignore these filters?
I am running Yoast Premium, but not sure if this can solve the issue? Product categories are canonicalised to the root category URL. Any suggestions very gratefully appreciated. Thanks, Bob
Technical SEO | SushiUK
-
Problem with Yoast not seeing any of this website's text/content
Hi, My client has a new WordPress site http://www.londonavsolutions.co.uk/ and they have installed the Yoast Premium SEO plug-in. They are having issues with getting the lights to go green, and the main problem is that on most pages Yoast does not see any words/content, although there are plenty of words on the pages. Other tools can see the words, however Yoast is struggling to find any and gives the following messages:
Bad SEO score. The text contains 0 words. This is far below the recommended minimum of 300 words. Add more content that is relevant for the topic.
Readability - You have far too little content. Please add some content to enable a good analysis.
They have contacted the website developer, who says that there is nothing wrong, but they are frustrated that they cannot use the Yoast tools themselves because of this issue, plus Yoast are offering no support with it. I hope that one of you guys has seen this problem before, or can spot a problem with the way the site has been built and can perhaps shed some light on it. I didn't build the site myself so won't be offended if you spot problems with it. Thanks in advance, Ben
Technical SEO | bendyman
-
Www2 vs www problem
Hi, I have a website that has an old version and a new version. The content is not duplicated across the two versions.
The point is that the old version uses www. and non-www before the domain, and the new one uses www2. My question is: is that a problem, and what should be done? Thank you in advance!
Technical SEO | TihomirPetrov
-
Would using javascript onclick functions to override href target be ok?
Hi all, I am currently working on a new search facility for my ecommerce site... it has very quickly dawned on me that this new facility is far better than my standard product pages - from a user point of view - i.e. lots of product attributes for customers to find what they need faster, the ability to compare products, etc... All in all just better. BUT NO SEO VALUE!!! I want to use this search facility instead of my category/product pages... however, as they are search pages, I have robots noindexed them and don't think it's wise to change that... I have spoken to the developers of this software and they suggested I could use some JavaScript in the navigation to change the onclick function to take the user to the search equivalent of the page... They said this way my normal pages are the ones that are still indexed by Google etc., but the user has the benefit of using the improved search pages... This sounds perfect, however it also sounds a little deceptive... and I know Google has loads of rules about these kinds of things; the last thing I want is to get any kind of penalty or any negative reaction from an SEO point of view... I am only considering this as it will improve the user experience on my website... Can anyone advise if this is OK, or a "no no"... P.S. for those wondering, I use an "off the shelf" cart system and it would cost me an arm and a leg to have these features built into my actual category/product pages.
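As a rough sketch of what the developers seem to be describing (hypothetical URLs and markup, not their actual code):
<!-- The crawlable href still points at the normal, indexable product page -->
<a href="/category/widgets/" onclick="window.location.href='/search/?category=widgets'; return false;">Widgets</a>
<!-- A user who clicks is sent to the noindexed search version instead; return false stops the normal navigation -->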
Technical SEO | isntworkdull
-
Trailing Slash Problems
Link juice is being split between the trailing-slash and non-trailing-slash versions, i.e. ldnwicklesscandles.com/scentsy-uk and ldnwicklesscandles.com/scentsy-uk/. I initially asked in here and was told to do a rewrite in the htaccess file. I don't have access to this with Squarespace, nor can I add canonical tags on a page-by-page basis. A 301 redirect from scentsy-uk to scentsy-uk/ didn't work either... the browser showed an error message saying the redirect wasn't completing. Squarespace hasn't been very helpful at all. My question is... is there another way to fix this? Or should I just call it a day with Squarespace and move to WordPress?
Technical SEO | cmjolley
-
Should we use Google's crawl delay setting?
We've been noticing a huge uptick in Google's spidering lately, and along with it a notable worsening of render times. Yesterday, for example, Google spidered our site at a rate of 30:1 (Google spider vs. organic traffic). In other words, for every organic page request, Google hits the site 30 times. Our render times have lengthened to an average of 2 seconds (and up to 2.5 seconds). Before this renewed interest Google has taken in us, we were seeing closer to one-second average render times, and often half of that. A year ago, the ratio of spider to organic was between 6:1 and 10:1. Is requesting a crawl delay from Googlebot a viable option? Our goal would be only to reduce Googlebot traffic, and hopefully improve render times and organic traffic. Thanks, Trisha
Technical SEO | lzhao