Do old backlinks still help a new URL after a 301 redirect? Also, I added the www; how does this affect it all?
-
I changed my URL from exampledetailing.com to exampleautodetailing.com. It is redirected with a 301. The site is on Squarespace, and I opted to add the www. Will the old backlinks to exampledetailing.com still help the new URL exampleautodetailing.com, or do I need to try to update all the links? Also, for future links, do I need to include the www, just the root domain exampleautodetailing.com, or the whole https://www.exampleautodetailing.com? I believe the www is considered a subdomain and a new entity on Google, so I am not sure how that works. Thank you!
-
So, about updating links to the new domain: I see that you said I should use https://www.domain.com at all times.
Then you said that the www is considered a new entity, but you also mentioned that it was OK not to use the www in my domain.
Are you talking about the backlinks I build from other sites, or about links on the actual website itself?
Last thing: is the rule that a backlink should always match exactly how the URL shows in the address bar? So, for example, https://www.domain.com should always be https://www.domain.com, and never https://domain.com or just domain.com?
-
Hi there,
You will still get value from your backlinks if you do the 301 redirect at the server level. That said, a 301 redirect is generally thought to pass somewhat less than 100% of link equity (estimates of the loss run as high as 30%), so it is better to update your most important links to point directly at the new site. For new links, use the full canonical URL, https://www.domain.com, at all times. You are correct that www (usually set up as a CNAME record) is technically a subdomain, and Google can treat the www and non-www hosts as separate entities; that is why you should pick one version as canonical, redirect the other to it, and link to the canonical form consistently.
Ross
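Since the advice above is to always link to the full canonical https://www.domain.com form, a small normalization helper can catch stray host variants before they go into new links. This is a minimal sketch, assuming the placeholder hostnames from the question:

```python
from urllib.parse import urlparse, urlunparse

# Placeholder hosts from the question; swap in the real canonical host.
CANONICAL_HOST = "www.exampleautodetailing.com"
OLD_HOSTS = {
    "exampledetailing.com",
    "www.exampledetailing.com",
    "exampleautodetailing.com",  # bare (non-www) variant of the new domain
}

def canonicalize(url: str) -> str:
    """Rewrite any known host variant to the canonical https://www form."""
    # Bare domains like "example.com/page" parse as paths, so prefix "//".
    parsed = urlparse(url if "//" in url else "//" + url, scheme="https")
    host = parsed.netloc.lower()
    if host in OLD_HOSTS or host == CANONICAL_HOST:
        parsed = parsed._replace(scheme="https", netloc=CANONICAL_HOST)
    return urlunparse(parsed)

print(canonicalize("http://exampledetailing.com/services"))
# -> https://www.exampleautodetailing.com/services
```

Hosts not in the list pass through untouched, so the helper only rewrites the variants you explicitly claim as your own.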
Related Questions
-
An attorney left my client's firm and we still rank well for her name
We've taken down the attorney's official page. Should we redirect her old page to the home page? Do a custom 404? I'm sure there's a best practice here, but I'm blanking.
-
Do duplicate street addresses on 2 websites affect SEO?
Hi, we have 2 websites built for one client that runs 2 companies from the same physical location. Would having the same address listed on both websites affect their SEO rankings? The 2 websites are linked below: http://anastasiablinds.ca/ http://www.greenfoxwindows.ca/ Thanks for your help!
-
Landing page, or redirect? Looking for feedback.
If we have a section of our site that we have branded separately from the rest of the site, does it make sense to provide a landing page on our current, high-authority site that has content and links off to the separate site, or would a domain.com/keyword redirect to that site be a better route? Does it matter? I have an idea, but I'd like to get feedback on this. We are a newspaper, http://billingsgazette.com, and we have an auto-branded site called http://montanawheelsforyou.com. The URL and branding are fubar. We're wondering if we can improve rankings by replacing the redirect at http://billingsgazette.com/autos (which currently points to http://montanawheelsforyou.com) with a landing page that has content and a link to http://montanawheelsforyou.com.
-
All metrics appear to be better than our local competitors', yet our ranking doesn't reflect it. Help?
Hi, I work for a marquee company and have recently been working hard to improve our SEO through good content, link building, and social media, especially Google+. Yet a rival (www.camelotmarquees.com) that performs worse than us on the majority of the Moz metrics still ranks better than us in both organic search and Google Places. The clear and obvious factor they beat us on is internal links, currently over 15,000, which seems ridiculous for the size of their site, compared to about 120 on our site. Would this have that much of an effect on the rankings, and how on earth have they got so many? Also, are there any tips or advice to help us leapfrog them? We feel we're producing regular, useful content and have optimised our site as best we can. Website: www.oakleafmarquees.co.uk. Keywords: marquee hire dorset, marquee dorset, dorset marquee hire, wedding marquee hire.
-
Image URLs changed 3 times after using a CDN - How to Handle for SEO?
Hi Mozzers,
Hoping for your advice on how to handle the SEO effects of an image URL change that happened 3 times during the course of setting up a CDN over a month, as follows:
(URL 1) - Original image URL before CDN: www.mydomain.com/images/abc.jpg
(URL 2) - First CDN URL (without CNAME alias - using WPEngine & their own CDN): username.net-dns.com/images/abc.jpg
(URL 3) - Second CDN URL (with CNAME alias - applied 3 weeks later): cdn.mydomain.com/images/abc.jpg
When we changed to URL 2, our image rankings in the Moz Pro Rankings tool (the one with the little photo icons) dropped from 80% to 5%. So my questions for recovery are:
1. Do I need to add a 301 redirect/canonical tag from the old image URLs 1 & 2 to URL 3, or something else?
2. Do I need to change my image sitemap to use cdn.mydomain.com/images/abc.jpg instead of www.?
Thanks in advance for your advice.
-
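For an image-sitemap update like the one described in the CDN question above, the mechanical part is just pointing every image location at the CDN host. A minimal sketch, assuming the placeholder hosts from the question (real sitemaps would usually be rewritten by the sitemap generator itself):

```python
# Hypothetical hosts taken from the question; adjust to the real domains.
OLD_PREFIX = "http://www.mydomain.com/images/"
CDN_PREFIX = "http://cdn.mydomain.com/images/"

def rewrite_image_locs(sitemap_xml: str) -> str:
    """Point every image URL in the sitemap at the CDN host instead of www."""
    return sitemap_xml.replace(OLD_PREFIX, CDN_PREFIX)

snippet = "<image:loc>http://www.mydomain.com/images/abc.jpg</image:loc>"
print(rewrite_image_locs(snippet))
# -> <image:loc>http://cdn.mydomain.com/images/abc.jpg</image:loc>
```

The 301s from the old image URLs to URL 3 would still be set at the server or CDN level; this only keeps the sitemap consistent with wherever the redirects end up.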
Local SEO HELP for Franchise SAB Business
This all began when I was asked to develop experiment parameters for our content protocol & strategy. It should be simple, right? I've reviewed A/B testing tips for days now, from Moz and other sources. I'm totally amped and ready to begin testing in Google Analytics.

Say we have a restoration service franchise with over 40 franchises we perform SEO for, all over the US. Every franchise has its own local website, e.g. restorationcompanylosangeles.com. Every franchise purchases territories in which it wants to rank; some service over 100 cities. Most franchises also have PPC campaigns, and as part of our strategy we use location reach data from AdWords to focus on their high-reach locations first. We have 'power pages' covering 5 high-reach branch-preference locations (areas the owners prefer to target) and 5 non-branch-preference high-reach locations. We are working heavily on our national brand presence, working with PR and local news companies to build relationships for natural backlinks, and developing a social media strategy for national and local outlets. We use major aggregators to distribute local citations for our branch offices and make sure all NAP is consistent across all citations. We are Google partners, so we work with them on new branches to create their Google listings (My Business & G+). We use local business schema markup on all pages. Our content protocol encompasses all the needed on-site optimization tactics: meta, titles, schema, keyword placement, semantic Q&A, internal linking strategies, etc. Our leads are calls and form submissions; we use several call-tracking services to monitor calls and callers' locations, and we are testing CallRail to start tying leads to landing pages and keywords.

Parts that I want to change: Some of the local sites have over 100 pages targeted at 'water damage + city', aka what Moz would call "doorway pages." These pages have 600-1000 words all talking about the services we provide. Our writers (4 of them) vary them so that they aren't duplicate pages, but they only add about 100 words about the city location; that is the only unique variable. We pump out about 10 new local pages a month per site, so yes, over 300 local pages a month. Traffic to the local sites is very scarce. The content protocol/strategy is tested based only on ranking. We have a tool that monitors ranking on all domains, which does not account for mobile, local, or user-based personalized searching like Google Now. My team is deeply attached to basing our metrics solely on ranking. The logic is that if no local city page exists for a targeted location, there is less likelihood of ranking for that location, and if you are not seen, you will not get traffic or leads. Ranking for power locations is poor, while less competitive low-reach locations rank OK. We update the content protocol by tweaking small things (multiple variants at a time), then check ranking every day for about a week to determine whether an experiment was a success.

What I need: An internal duplicate content analyzer, to prove that writing over 400 pages a month about 'water damage + city' IS duplicate content. Unique content for 'power pages': I know from dozens of chats here in the community and in Moz blogs that we can only truly create quality content for 5-10 pages, meaning we need to narrow down which locations are most important to us and beef them up. Blog content for non-'power' locations. A new experiment protocol based on metrics like traffic, impressions, bounce rate, landing page analysis, domain authority, etc. Deeper digging into call metrics and their sources.

Now I am at a roadblock, because I cannot develop valid content-experiment parameters based on ranking. I know that A/B testing requires two pages that are the same except for one variable; we'd either noindex one or canonicalize, and both options rule out testing ranking for the same term. Questions: Are all these local pages duplicate content? Is there such a thing as content experiments based solely on ranking? Any other suggestions for this scenario?
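The "internal duplicate content analyzer" the question asks for can be approximated with word-shingle Jaccard similarity: pages that share almost all of their 5-word phrases are near-duplicates. A minimal sketch (the sample page text is invented to mirror the 'water damage + city' pattern described above):

```python
def shingles(text: str, k: int = 5) -> set:
    """Return the set of k-word shingles of the text, case-folded."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a: str, b: str, k: int = 5) -> float:
    """Jaccard similarity of two pages' shingle sets, from 0.0 to 1.0."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not (sa or sb):
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Two hypothetical city pages sharing the same service boilerplate.
boilerplate = ("We provide 24/7 water damage restoration, flood cleanup, "
               "and mold remediation services for homes and businesses in")
page_a = boilerplate + " Los Angeles and surrounding areas."
page_b = boilerplate + " Pasadena and surrounding areas."
print(round(similarity(page_a, page_b), 2))  # high score flags near-duplicates
```

Running this over every pair of city pages on a site (or a sample of pairs) gives a defensible number for "how duplicate" the doorway pages really are; a threshold like 0.8 is a common starting point, not a standard.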
-
How slow can a website be but still be OK for visitors and SEO?
Hello all, my site http://www.allspecialtybuildings.com is a barn construction site, and our visitors are usually local. I am worried about page speed. I have been using Google PageSpeed Insights and GTmetrix. Although I cannot figure out leveraging browser caching, I have a 79/93 Google score and a 98/87 GTmetrix score. Load times vary between 2.13 and 2.54 seconds. What is acceptable? I want to make sure I get Google love for decent page speed, and to me these times seem great; bad times are like 7 seconds and higher. I have thought about a CDN, yet I have read horror stories too, and I have ZERO idea of how to use a CDN or whether I need one. I just want a fast site that is both user- and Google-friendly. So my question is: what counts as a slow website? Is under 3 seconds OK, or bad for SEO? Any advice is greatly appreciated.
-
HELP, my site gets more than 40k visits per day and the server is down; I do not want all these visits...
Hello... I have a website for a local spa in Ecuador. The website has a blog with some health tips, and suddenly one of the articles went viral on South American Facebook profiles. Now I am receiving 40k visits per day from other countries that are of no interest to me, because my site is for a local business in Ecuador. I already blocked some countries by IP, but I'm still receiving visits from other South American countries. For this reason my hosting company took my website down, and I cannot put it back online because these thousands of visits use more than 25% of the server's CPU, so the hosting company takes it down again. I really need to know what to do. I do not want to pay for an expensive dedicated server, because all these visits from other countries are not of interest to me and, as I said before, my business is local.
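Country blocking of the kind described above boils down to checking each client IP against a list of CIDR ranges, ideally before the request reaches any expensive application code (a CDN or firewall rule is cheaper still). A minimal Python sketch using the stdlib `ipaddress` module; the CIDR ranges here are documentation-reserved placeholders, since real per-country ranges would come from a GeoIP database such as MaxMind:

```python
import ipaddress

# Placeholder ranges for illustration only; substitute real GeoIP-derived
# CIDR blocks for the countries you want to reject.
BLOCKED_NETWORKS = [
    ipaddress.ip_network("203.0.113.0/24"),
    ipaddress.ip_network("198.51.100.0/24"),
]

def is_blocked(ip: str) -> bool:
    """True if the client IP falls inside any blocked range."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in BLOCKED_NETWORKS)

print(is_blocked("203.0.113.45"))  # True  -> reject early, before heavy work
print(is_blocked("190.152.0.10"))  # False -> serve normally
```

Rejecting blocked IPs with a cheap 403 at the top of the request handler keeps the CPU cost per unwanted visit close to zero, which is the real problem the hosting company is reacting to.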