Relative paths vs absolute paths for links - is there a difference?
-
I almost always use absolute URLs. Dana has a point about efficiency, and a lot of developers HATE writing anything but relative URLs, but I think most modern servers can handle the load just fine, and developers can be bribed with cookies.
The advantage of absolute URLs is that they're less likely to break with various CMSs (content management systems) and on-page elements. I've seen JavaScript do some crazy things to relative URLs, causing thousands of broken, uncrawlable links.
And when your content is scraped, either for black hat reasons or perfectly legitimate reasons like embedded RSS feeds, then you get full credit for the link. But to be fair, and to recognize Gamer07's point, Google likely devalues most of these links anyway once they detect the duplicate content.
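To illustrate why relative links are fragile, here's a minimal sketch using Python's standard `urllib.parse` (the page paths and domains are made up for the example): a relative href resolves to a different URL depending on which page it sits on, while an absolute URL survives unchanged.

```python
from urllib.parse import urljoin

# A relative link only has meaning relative to the page it sits on.
# The same href resolves to different URLs on different pages:
print(urljoin("https://example.com/blog/post-1", "images/chart.png"))
# -> https://example.com/blog/images/chart.png
print(urljoin("https://example.com/blog/2020/post-1", "images/chart.png"))
# -> https://example.com/blog/2020/images/chart.png

# If a scraper republishes the page on its own domain, the relative
# link now points at the scraper's site:
print(urljoin("https://scraper.example.net/copied-post", "images/chart.png"))
# -> https://scraper.example.net/images/chart.png

# An absolute URL resolves to itself wherever the markup ends up,
# so the scraped copy still links back to the original site:
print(urljoin("https://scraper.example.net/copied-post",
              "https://example.com/blog/images/chart.png"))
# -> https://example.com/blog/images/chart.png
```

This is the same resolution browsers and crawlers perform, which is why a CMS or script that changes the page's path (or base URL) can silently break every relative link on it.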
-
Well, if we look at it that way, wouldn't it also involve a risk of negative SEO? A site that rips content from another is more likely to be considered a spam site (and it's now linking to your site).
-
However, if you use absolute URLs and your content gets ripped off, the copies contain links back to your own content, and people can tell it's yours. Absolute URLs can also help you force http or https.
-
Thank you for the invaluable information
-
Hi Cenk,
Yes, it is better to use relative paths when linking on your site to your internal pages. This reduces the number of server calls on a page and can increase your page speed and efficiency. Now, does that specifically factor into the algorithm? I don't know. But page speed is part of the algorithm, so I suppose you could say that indirectly it does have an effect on how your site potentially ranks.
One caveat: Use absolute URLs for your canonical tags. Search engines have problems interpreting relative URLs when they are in a canonical tag. I learned this the hard way!
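A quick sketch of why a relative canonical is ambiguous (using Python's standard `urllib.parse`; the URLs are hypothetical): a relative canonical only becomes a concrete URL once it's resolved against the page serving it, and with duplicate content that base differs for every copy.

```python
from urllib.parse import urljoin

rel_canonical = "/widgets/blue-widget"

# The same relative canonical resolves differently on each host/scheme
# that serves the duplicated page, so it can't pick a single winner:
print(urljoin("https://www.example.com/widgets/blue-widget?ref=1", rel_canonical))
# -> https://www.example.com/widgets/blue-widget
print(urljoin("http://example.com/widgets/blue-widget?ref=1", rel_canonical))
# -> http://example.com/widgets/blue-widget

# An absolute canonical names exactly one URL no matter where it appears:
abs_canonical = "https://www.example.com/widgets/blue-widget"
print(f'<link rel="canonical" href="{abs_canonical}">')
```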
Hope that helps!
Dana