Base HREF set without HTTP. Will this cause search issues?
-
The base href has been set in the following format:
<base href="//www.example.com/">
I am working on a project where many on the programming team don't believe that SEO has an impact on a website, so we often see some strange things. Recently, they rolled out an update to the website template that includes the base href listed above. I found out about it when some of our tools, such as the Xenu link checker, suddenly stopped working.
Google appears to be indexing the pages fine and following the links without any issue, but I wonder if there are any long-term SEO considerations to building the internal links in this manner?
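To illustrate what that tag does to our internal links (a sketch with placeholder paths):

<base href="//www.example.com/">
<!-- A relative internal link like this one... -->
<a href="category/widgets">Widgets</a>
<!-- ...now resolves against the base: on an https page it becomes
     https://www.example.com/category/widgets, and on a plain http
     page it becomes http://www.example.com/category/widgets. -->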
Thanks!
-
Thanks for the comment. I was able to get them to make the changes, but I think I have made some new enemies. Oh well, I will move on in a few months anyhow.
Thanks again,
Joe
-
The W3C standards do technically allow omitting the protocol - a URL starting with "//" is a scheme-relative (protocol-relative) URL, and it inherits whatever protocol the current page was loaded over. So it's not invalid, strictly speaking, but it's widely considered an anti-pattern: it buys you nothing on an HTTPS site, and it can cause issues in older browsers and in tools that expect an explicit protocol (as you've seen with Xenu).
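To make that concrete (a sketch; example.com stands in for the real hostname):

<!-- The same scheme-relative URL resolves differently depending on
     how the current page was loaded: -->
<a href="//www.example.com/contact">Contact</a>
<!-- On http://www.example.com/page  ->  http://www.example.com/contact
     On https://www.example.com/page ->  https://www.example.com/contact -->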
Does it matter for SEO? Well, that's a bit trickier. Google generally resolves relative URLs on its own and tends to ignore the base href except where a relative URL would otherwise be ambiguous - a relative canonical tag with no clear base is the classic example. Practically speaking, it's probably not a huge problem, but it's the kind of thing that can cause issues down the road.
Either way, if it's in a sitewide template, it's a five-minute job, and what they have is already breaking things. I'm not one to knock devs (I've been a dev and I've managed devs), but they need to stop arguing and just fix it.
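If it helps the conversation, the fix is literally one attribute value (assuming the site should be served over https; use the real canonical hostname):

<!-- Before: scheme-relative base, protocol depends on how the page was loaded -->
<base href="//www.example.com/">
<!-- After: explicit protocol, unambiguous for browsers, crawlers, and tools -->
<base href="https://www.example.com/">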
-
They have put it on every page. The programming manager is quick to point out that, according to the W3C, neither http nor https is required for a valid link. I have just never seen anyone purposely make all internal links begin with a double slash (//). It certainly makes Xenu die, but I am not sure if there is any downside other than Xenu and a few other tools not working.
Thanks!
-
Is that code on every page or just the homepage?