301 vs Changing Link href
-
We have changed our company name and want to 301 the old domain to the new domain in order to transfer the benefit of our backlinks (DA: 50, 115 linking root domains). I have the ability to modify around 50% of the backlinks. So my question is:
Instead of redirecting all the links, should I update that 50% to point directly at the new domain rather than relying on redirects? Could that trip an algorithmic filter and devalue those links? Or should I just set up the 301s and not worry about modifying the links?
-
Alan is exactly right. Direct links are better, but Google will discount them if you switch too many too quickly.
-
301s leak some link juice, so changing the links would be better, but you have a point about getting too many new links too quickly. I would 301 them all for now and slowly change a few links each week.
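Not from the thread, but as a quick illustration of that "301 them for now" step: a minimal Python sketch for spot-checking that the old-domain URLs your backlinks point at return a single 301 hop to the new domain while you slowly update links. The domains and paths are placeholders, and it assumes the third-party requests package.

```python
# Hypothetical spot-check that old-domain URLs 301 straight to the new domain.
# "requests" is a third-party package; domains and paths below are placeholders.
import requests

OLD_URLS = [
    "http://old-domain.example/",
    "http://old-domain.example/services/",
]
NEW_DOMAIN = "new-domain.example"

for url in OLD_URLS:
    # Don't follow the redirect; we want to inspect the first response directly.
    resp = requests.head(url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    ok = resp.status_code == 301 and NEW_DOMAIN in location
    print(f"{url} -> {resp.status_code} {location or '(no Location header)'} "
          f"[{'ok' if ok else 'check this one'}]")
```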
Related Questions
-
Changing sitemaps in console
Hi there, Does anyone have any experience submitting a completely new sitemap structure - including URLs - to Google Search Console? We've changed our sitemap plugin, so rather than /sitemap-index.xml, our main sitemap index is now /sitemap.xml (as an example). Is it better to 410 the old ones or 301 redirect them to the new sitemaps? If we 301, what do we do about sitemaps that don't correlate exactly - what was divided into item1.xml and item2.xml is now split by date, so items-from-2015.xml, items-from-2016.xml and so on? On a related note, am I right in thinking that there's no longer a "delete/remove sitemap" option in Search Console? In which case, what happens to the old ones, which will now 404? Thanks anyone for any insight you may have 🙂
Intermediate & Advanced SEO | Fubra
-
Content Internal Linking?
Should we internally link new content to old content, using anchor text (keywords) relevant to those pages, from every new blog post? Or should we rotate, linking from some blog posts and not from others? What ratio should we maintain? Right now I keep a maximum of 2 links in a 300-word post, or 3 in a 500-word post. Would linking from every new blog post be a good idea?
Intermediate & Advanced SEO | welcomecure
-
Is a rebranding that calls for a domain change a good time to sneak in a change to HTTPS?
Assumed: the material around good migration/redesign practices recommends, logically enough, changing as few things as possible in any given step, thus giving search engines as little trouble as possible identifying and reindexing changes. So if someone is making significant changes to content, including URI changes, plus a rebranding that requires a domain migration, they are generally better off doing one, then the other. 1) Beyond immediate testing and checking that correct crawl health is re-established after the first change, any thoughts on rules of thumb for when to do the second change? Do you do it as soon as you see your rankings/traffic turn the corner and confirm an upward trend after the drop, or wait until you have it all back (or at least hit a plateau)? In the absence of data or best practice I'm thinking of just letting a third to two thirds come back. 2) Is a change to HTTPS small enough/similar enough from the search engine's perspective that it makes more sense to do it at the same time as the rebrand-driven domain change? Does this create any special risks or considerations beyond those that arise from the individual components of the change?
Intermediate & Advanced SEO | JFA
-
Links to www vs non-www
I was having speed issues when I ran a test with Google's PageSpeed test and, as a result, switched to using Google PageSpeed Service. This meant I had to switch my site from the non-www to the www version. Since the switch my pages are loading faster, but my ranking has dropped. What I'm trying to find out is whether the drop is due to all of my previous links pointing to the non-www version, or whether the site is being treated as new and this is more of a temporary issue. If it is a link issue, I will contact everyone I can to see who will update the link to the new address. Thanks everyone!
Intermediate & Advanced SEO | toddmatthewca
-
Ranking EMDs and 301s for branding: is it better to leave them as-is or 301 them?
We have a client about to enroll with us for SEO. The client has about 50 EMD sites, of which 9 are ranking. An EMD naturally attracts exact-match anchor text, and the sites in question are all EMDs; their link profiles show it. The client wants to 301 the EMDs to a brand page, so we would want to 301 the 9 ranking EMD sites to the new site. Here is the thing: if a site's domain is an exact match for its anchor text profile, will that link profile matter when we 301 the page to www.brand.com/EMD? One of the EMDs is on page one, spot 2. If we make this change, will Google see the new brand page (www.brand.com/EMD) as having an unnatural link profile?
Intermediate & Advanced SEO | Bryan_Loconto
-
Undo a 301 redirect
Hi there, 4 months ago I set up a redirect from one domain to another. Now, after about 120 days, only a few results from the old domain are still indexed. The problem is that I believe the old domain name had a really big impact on rankings, as it contained the main keyword. I'm wondering whether I could restore the old domain simply by removing the 301 instruction, and how search engines would react. Do you have any studies on that? Would it be possible? Matt Cutts himself did it with his own domain, but he doesn't talk specifically about the effect on rankings: http://www.thedotcomblog.com/seo/redirects-after-change-in-domain-name Thanks in advance for any help,
Intermediate & Advanced SEO | SandraMoZ
-
301 Redirect question
What is the best way to set up the 301 redirect on my home page, from http://horsebuggy.com to http://www.horsebuggy.com? Or does it make a difference? Boodreaux
Intermediate & Advanced SEO | Boodreaux
-
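For what it's worth, here's a rough sketch of what that non-www to www 301 amounts to, written as a tiny Python handler. It's illustrative only; in practice this redirect is normally configured in the web server or CDN rather than in application code, and the port number here is arbitrary.

```python
# Illustrative only: answer any request to the bare domain with a 301
# pointing at the same path on the www host.
from http.server import BaseHTTPRequestHandler, HTTPServer

CANONICAL_HOST = "www.horsebuggy.com"  # the www version, as in the question above

class RedirectToWWW(BaseHTTPRequestHandler):
    def do_GET(self):
        # 301 = permanent redirect; keep the requested path intact.
        self.send_response(301)
        self.send_header("Location", f"http://{CANONICAL_HOST}{self.path}")
        self.end_headers()

    # HEAD requests get the same redirect without a body.
    do_HEAD = do_GET

if __name__ == "__main__":
    HTTPServer(("", 8080), RedirectToWWW).serve_forever()
```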
Robots.txt: Link Juice vs. Crawl Budget vs. Content 'Depth'
I run a quality vertical search engine. About 6 months ago we had a problem with our sitemaps, which resulted in most of our pages getting tossed out of Google's index. As part of the response, we put a number of robots.txt restrictions in place on our search results to prevent Google from crawling pagination links and other parameter-based variants of our results (sort order, etc.). The idea was to preserve crawl budget and so speed up the rate at which Google could get our millions of pages back into the index, by focusing attention/resources on the right pages. The pages are back in the index now (and have been for a while), and the restrictions have stayed in place since that time. But, in doing a little SEOmoz reading this morning, I came to wonder whether that approach may now be harming us: http://www.seomoz.org/blog/restricting-robot-access-for-improved-seo and http://www.seomoz.org/blog/serious-robotstxt-misuse-high-impact-solutions. Specifically, I'm concerned that a) we're blocking the flow of link juice, and b) by preventing Google from crawling the full depth of our search results (i.e. pages >1), we may be making our site wrongfully look 'thin'. With respect to b), we've been hit by Panda and have been implementing plenty of changes to improve engagement, eliminate inadvertently low-quality pages, etc., but we have yet to find 'the fix'. Thoughts? Kurus
Intermediate & Advanced SEO | kurus
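As an aside, the kind of robots.txt restriction described above can be sanity-checked offline with the Python standard library. A minimal sketch: the rules, user agent behaviour, and URLs below are made up for illustration, and note that the standard-library parser only does simple prefix matching, not Google-style wildcard patterns.

```python
# Hypothetical robots.txt rules blocking paginated / parameter-based result URLs,
# checked with the standard library's parser (prefix matching only, no wildcards).
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: *
Disallow: /search/page/
Disallow: /search/sort/
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

for url in [
    "https://example.com/search/widgets",         # first page of results
    "https://example.com/search/page/2/widgets",  # paginated results
    "https://example.com/search/sort/price",      # sort-order variant
]:
    allowed = rp.can_fetch("Googlebot", url)
    print(f"{'allowed' if allowed else 'blocked'}: {url}")
```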