Is Having Broken Outbound Links on Old Blog Posts an Issue?
-
Please note that these old posts hardly get any traffic. I've heard both sides on this.
thanks,
Chris
-
Great advice, love it. Thx!
-
This is on a small business website in one city.
I assume that the real issue with broken links is when they occur on one of your main navigational links that drives most of your traffic?
I just view this as a minor issue, whereas broken links on main pages with a lot of traffic are much more alarming or pressing.
First, it sounds like you are making excuses for not fixing them.
If these posts get hardly any traffic then maybe they are not very good posts. If they are worthless then dump them. If they have potential then improve them.
No matter where these broken links are, they are a sign of neglect: of a website that is not tended, of a website that Google might view as low quality.
Today, more than ever, the game on the web is about quality. The most important thing that you can do is buy into that.
-
Thanks for the info, appreciate it and totally get it. It just seems like a very low priority when these links are on old posts (2013/14) that very few people visit. Additionally, I read that the Googlebot simply moves on. I just view this as a minor issue, whereas broken links on main pages with a lot of traffic are much more alarming or pressing. Thx
-
My view is that you should fix all broken links, and also investigate why they broke.
The primary reason is the Googlebot. In short, Mr Google Bot hates broken links: they mean he cannot crawl your site properly, which could lead to a decrease in rankings. When Mr Bot stumbles into a dead end (a broken link), he thinks the rest of your site has closed as well. Hasta la vista...
It is simply not worth the risk.
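If you want to audit a page yourself before reaching for a crawler, the check is straightforward: collect the outbound links, request each one, and flag anything that returns a 4xx/5xx status. Here is a minimal sketch using only the Python standard library; the page URL and the injected `fetch_status` function are illustrative, and a real audit would also need to handle timeouts, redirects, and rate limiting.

```python
# Sketch: find broken outbound links on a single page (stdlib only).
# fetch_status is injected so the logic can be tested without network access.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse


class LinkExtractor(HTMLParser):
    """Collect href values from <a> tags."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def outbound_links(html, page_url):
    """Return absolute URLs on the page that point to a different host."""
    parser = LinkExtractor()
    parser.feed(html)
    page_host = urlparse(page_url).netloc
    absolute = (urljoin(page_url, href) for href in parser.links)
    return [url for url in absolute if urlparse(url).netloc != page_host]


def broken_links(html, page_url, fetch_status):
    """fetch_status(url) -> HTTP status code; 400+ counts as broken."""
    return [url for url in outbound_links(html, page_url)
            if fetch_status(url) >= 400]
```

In practice `fetch_status` could wrap `urllib.request.urlopen`, or you could skip the script entirely and use a desktop crawler; the point is that the check is cheap enough that "too many old posts" isn't a reason to skip it.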
-
Hi there
I would fix them. Otherwise, if the articles aren't getting any traffic or rankings, or are out of date, I would see what I can do to improve those old posts or remove them. There may be opportunities to update, you never know.
It may be a good time to perform a content audit.
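That audit logic is simple enough to sketch: posts with real traffic get their links fixed, low-traffic posts with potential get updated, and the rest get removed or redirected. The sketch below assumes a made-up record schema (`url`, `monthly_visits`, `has_potential`) and an arbitrary traffic threshold; neither comes from this thread, so treat them as placeholders for whatever your analytics export actually contains.

```python
# Sketch: keep/improve/remove triage for a content audit.
# The schema and threshold are illustrative assumptions, not a real API.
def triage(posts, traffic_threshold=10):
    """Map each post URL to a recommended action.

    posts: list of dicts with 'url', 'monthly_visits', and
    'has_potential' keys (hypothetical schema).
    """
    actions = {}
    for post in posts:
        if post["monthly_visits"] >= traffic_threshold:
            # The post earns its keep: just repair the links.
            actions[post["url"]] = "fix broken links"
        elif post["has_potential"]:
            # Thin but promising: refresh the content too.
            actions[post["url"]] = "update content and fix links"
        else:
            # Worthless per the advice above: dump it.
            actions[post["url"]] = "remove or redirect"
    return actions
```

This mirrors the earlier advice in the thread: if a post is worthless, dump it; if it has potential, improve it; either way, don't leave broken links behind.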
Broken outbound links are frustrating for users, especially if it's a link to an article backing up your points or data. Always make sure they work. Remember - at the end of the day you want to have a great user experience, and outbound links are a part of it.
Plus, a lot of broken outbound links may show crawlers that you're not paying attention to your site, or the site isn't being kept up to date.
Hope this helps! Good luck!