Can you nofollow a URL?
-
Hey Moz Community,
My question sounds pretty simple but unfortunately, it isn't. I have a domain name (we'll use example.com for this): http://example.com 301 redirects to http://www.example.com. http://example.com has bad links pointing to it and http://www.example.com does not. Essentially, I want to stop negative influence from http://example.com being passed on to http://www.example.com. A 302 redirect sounds like it would work in theory, but is this the best way to go about it?
Just so you know, we completed a reconsideration request a long time ago, but I think the bad links are still negatively affecting the website, as it does not rank for its own name, which is bizarre.
Actual Question:
How do I redirect http://example.com to http://www.example.com without passing on the negative SEO attached to http://example.com?
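For reference, a non-www to www 301 like the one described is typically set up along these lines. This is a minimal sketch, assuming an Apache server with mod_rewrite enabled; the hostname is the placeholder example.com from the question:

```apache
# Sketch: permanently redirect the non-www hostname to the www hostname
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

Swapping `R=301` for `R=302` is the change being contemplated in this question.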
Thanks in advance!
-
Looks like a lot of good information from folks here so I'll be brief.
Technically, there's no practical way to redirect the page without redirecting the links. Unless your page serves a 404 or 410 response code, those links will be associated with your domain.
The only way to disassociate yourself from these links is through use of the Disavow Tool.
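For anyone unfamiliar with it, the Disavow Tool accepts a plain-text file, one entry per line, with `#` comment lines and a `domain:` prefix for whole-domain entries. A minimal sketch, using made-up hostnames for illustration:

```text
# Lines starting with "#" are comments and are ignored
# Disavow every link from an entire domain
domain:spammy-links.example
# Disavow a single linking page
http://another-site.example/bad-links-page.html
```

The file is uploaded through Google's disavow links tool for the affected property.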
-
It's a website for a mobile app, and the references to it around the web (on iTunes, for instance) all rank on the first page; it has a unique name. The link profile has been fine since the penalty was lifted. A few links still need to be cleaned up, but they are all in the disavow file that is uploaded to Google. It's weird because we aren't even in the top 10 pages.
If the website had been hit again, wouldn't a notice have come through in Google Webmaster Tools by now?
-
Hm, you're in a tough spot.
Is your domain a unique name, like "Moz," or is it a description of what you are, like "SEO Info"? I ask because it's possible that your brand name is really a competitive keyword that you're just not strong enough to compete for, which could explain why you don't rank for it.
Also, have you looked through your link profile since 2013? It's possible that you've been hit by a spam penalty again. Even if you haven't purchased more links, someone may have hit you with some negative SEO, or you may have picked up a few low-quality links without high-quality links to balance things out.
-
Hey Kristina, thanks for your reply, see my answers to each bullet point below:
- The www version does not rank for the brand name. Initially, the non-www version was the main website, but we changed this to the www version in November 2014.
- The reconsideration request was submitted on 18/10/2013 and Google responded on 24/10/2013 stating that the manual spam action had been revoked.
- Not a lot of value right now, but changing the domain name is not an option.
-
Thanks Ryan, I guess you're right, but we're trying to minimize the negative impact; a new domain name is not possible.
-
Thanks for the reply Monica. Unfortunately, a new domain is not possible.
-
To second Ryan's point: Google definitely sees http://example.com as a separate page from http://www.example.com, but I'd be surprised if you could distance yourself from bad links pointing to http://example.com by focusing on http://www.example.com. Google's pretty smart; it knows that those two pages are usually one and the same.
To your 302 redirect point: Google has seen enough improperly used 302 redirects (both accidental and for SEO reasons, like this) to start treating 302 redirects as 301s if they stay in place over time, according to a test Geoff Kenyon worked on. A 302 redirect may work for a little while, but it's not a long-term solution.
To dig into this a bit deeper:
- Does the www version of your site rank for your brand name? Is it just the non-www version that's been hit?
- When did you submit a reconsideration request, and what was Google's response? Did they say that your penalty has been lifted?
- What is the value of your current domain name? I've heard of companies with small enough brand awareness that it made better business sense to just restart from the ground up with a new domain. Is that where you are, or do you have a pretty solid business built up?
-
I think your problem may be one of differentiation. While example.com and www.example.com are technically two different hostnames, they're not substantially different enough when it comes to creating a new site on the ashes of a negative one (canonically redirecting between the www and non-www versions of the same domain is common practice for websites that aren't trying to change their image). I can't speak specifically to your situation, but you might consider an entirely new domain if you think example.com is that tainted.
-
Usually, you wouldn't 301 redirect this; you would use a canonical tag. If the URL carries a ton of negative link juice, is there any reason you can't 404 the page and start fresh on a new URL? That would be my advice. Even if you redirect the link, these are technically the same page, and the negative link juice will be passed through. I would cut my losses, get rid of the bad pages, and start fresh.
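To sketch the canonical approach described above (with example.com standing in for the real domain): pages served on the non-www hostname would declare the www version as the canonical URL, so Google consolidates signals there without a redirect.

```html
<!-- Placed in the <head> of pages served at http://example.com/ -->
<link rel="canonical" href="http://www.example.com/">
```

Note that rel=canonical is a hint rather than a directive; Google can choose to ignore it, which is why several answers here lean toward a 404/410 or a clean break instead.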