Can I redirect a link even if the link is still on the site?
-
Hi Folks,
I've got a client with a duplicate content problem: they deliberately create duplicate content, storing the same piece of content in two different places.
When they generate this duplicate content, it also creates a second link on the site pointing to the duplicate page. They want that second link to always redirect to the first, but for architectural reasons they can't remove the second link from the site navigation.
We can't use rel=canonical because they don't want visitors going to that second page.
Here is my question: Are there any adverse SEO implications to maintaining a link on a site that always redirects to a different page?
I've already gone down the road of "don't deliberately create duplicate content" with the client. They've heard me, but won't change.
So, what are your thoughts?
Thanks!
-
Are you using a CMS, or an in-house solution? If it's a CMS, in many cases you should be able to configure it so that the two links are generated but the page itself isn't generated twice.
Another option, if two pages must exist, would be to set a canonical on both pages pointing to the one main location for the content, while using pushState on the URL to steer the browser onto the main path. Although the more I think about it, that may not be a 100% viable option.
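As a rough sketch of that idea (the URLs here are hypothetical, not from the thread), the duplicate page would declare the main location as canonical and then rewrite the address bar without a reload:

```html
<!-- Served on the duplicate page, e.g. /category2/product1 (hypothetical) -->
<link rel="canonical" href="https://www.example.com/category1/product1">

<script>
  // Swap the visible URL for the main location without reloading.
  // Caveat: this only changes what the browser displays - the duplicate
  // page is still what was served, and crawlers won't run this script,
  // which is likely why the approach isn't 100% viable.
  history.replaceState(null, "", "/category1/product1");
</script>
```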
-
I agree - but as with many things, there's politics involved... I'll leave it at that.
-
Although, depending on Craig's site structure, it could be a simple, one-time .htaccess setup so that all of the second links 301 to the first links.
For example, if creating website.com/category1/product1 also creates a duplicate page at /category2/product1, he could use a regex so that every product under /category2/ redirects to its /category1/ product URL.
You're right that it's still not the most elegant of solutions, but it's a simple enough way to make sure users are where you want them to be without requiring any effort every time you create a new page - and it shouldn't upset Googlebot.
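For reference, a minimal .htaccess sketch of that regex redirect, assuming the hypothetical /category1/ and /category2/ paths from the example above:

```apache
# 301 every URL under /category2/ to its /category1/ counterpart.
# Requires mod_rewrite; adjust the paths to the real site structure.
RewriteEngine On
RewriteRule ^category2/(.*)$ /category1/$1 [R=301,L]
```

Because the rule is pattern-based, new products that appear under /category2/ redirect automatically, with no per-page effort.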
-
Yes, you absolutely can redirect this link. However, I think your time would be better spent on a solution that prevents this from happening long term; you will continually have to redirect new content as long as the site keeps working this way.
-
Redirecting the second link would probably be the best option, in my opinion. If the second link is an integral part of the site structure and navigation, but you don't want users (or Google) to access the duplicate page, I don't see how you could do it any other way if your client insists that the second page has to be created.
Related Questions
-
Removed Subdomain Sites Still in Google Index
Hey guys, I've got kind of a strange situation going on and I can't seem to find it addressed anywhere. I have a site that at one point had several development sites set up at subdomains. Those sites have since launched on their own domains, but the subdomain sites are still showing up in the Google index. However, if you look at the cached version of pages on these non-existent subdomains, it lists the NEW url, not the dev one in the little blurb that says "This is Google's cached version of www.correcturl.com." Clearly Google recognizes that the content resides at the new location, so how come the old pages are still in the index? Attempting to visit one of them gives a "Server Not Found" error, so they are definitely gone. This is happening to a couple of sites, one that was launched over a year ago so it doesn't appear to be a "wait and see" solution. Any suggestions would be a huge help. Thanks!!
Technical SEO | SarahLK
Another client copies everything to blogspot. Is that what keeps her site from ranking? Or what? Appears to be a penalty somewhere but can't find it.
This client has a brand new site: http://www.susannoyesandersonpoems.com Her previous site was really bad for SEO, yet at one time she actually ranked on the first page for "LDS poems." She came to me because she lost rank. I checked things out and found some shoddy SEO work by a very popular WordPress web host that I will leave unnamed. If you do a backlink analysis you can see the articles and backlinks they created. But there are so few, so I'm not sure if that was it, or whether it was simply that her site was so poorly optimized, Google made a change, and down she fell. Here's the only page she had on the LDS poems topic in her old site: https://web.archive.org/web/20130820161529/http://susannoyesandersonpoems.com/category/lds-poetry/ Even the links in the nav were bad, as they were all images. And that ranked in position 2, I think she said. Even with her new site, she continues to decline. In fact, she is nowhere to be found for her main keywords, making me think there is a penalty. To try and build rank for categories, I'm allowing Google to index the category landing pages and had her write category descriptions that include keywords. We are also listing the categories on the left and linking to those category pages. Maybe those pages are watered down by the poem excerpts? Here's an example of a page we want to rank: http://susannoyesandersonpoems.com/category/lds-poetry/ Any help from the peanut gallery?
Technical SEO | katandmouse
Our stage site got crawled and we got an unnatural inbound links warning. What now?
live site: www.mybarnwoodframes.com stage site: www.methodseo.net We recently finished a redesign of our site to improve our navigation. Our developer insisted on hosting the stage site on her own server with a separate domain while she worked on it. However, somebody left the site turned on one day and Google crawled the entire thing. Now we have 4,320 pages of 100% identical duplicate content with this other site. We were upset, but didn't think it would have any serious repercussions until we got two orders from customers through the stage site one day. It turns out the stage site was ranking pretty decently for a duplicate site with 0 links; however, the worst was yet to come. During the 3 months of the redesign our rankings on our live site dropped and we suffered a 60% drop in organic search traffic. On May 22, 2013, the day of the Penguin 2.0 release, we received an unnatural inbound links warning. Google Webmaster Tools shows 4,320 of our 8,000 links coming from the stage site domain to our live site, and we figure that was the cause of the warning. We finished the redesign around May 14th and took down the stage site, but it is still showing up in the search results and the 4,320 links are still showing in our Webmaster Tools. 1. Are we correct to assume that it was the stage site that caused the unnatural links warning? 2. Do you think that it was the stage site that caused the drop in traffic? After doing a link audit I can't find any large amount of horrendously bad links coming to the site. 3. Now that the stage site has been taken down, how do we get it out of Google's indexes? Will it drop out over time, or do we need to do something on our end for it to be delisted? 4. Once it's delisted, the links coming from it should go away; in the meantime, however, should we disavow all of the links from the stage site? Do we need to file a reconsideration request, or should we just be patient and let them go away naturally? 5. Do you think that our rankings will ever recover?
Technical SEO | gallreddy
What steps can you take to help a site that does not change
Hi, I am working on a product and services website, www.clairehegarty.co.uk, but the problem I have is that the site does not really change. The home page stays the same, and the only time it changes is when a new course is advertised. The most important page on the website is http://www.clairehegarty.co.uk/virtual-gastric-band-with-hypnotherapy, but we have seen the site drop in rankings because the page is not being updated. This page has all the information you could want on weight loss, but we have seen it drop from number one in Google to number four. I would like to know what steps we should take to increase our rankings in Google and would be grateful for your suggestions. If I put articles on the site, with a section where we add a new article every week, would that get Google to visit the whole site more and move our pages back up the rankings, or should we be looking at doing other things?
Technical SEO | ClaireH-184886
Can 404 results from external links hurt site ranking?
Hello, I'm helping a university transition to a brand new website. In some cases the URLs will change between the old site and the new site. They will put 301 redirects in place to make sure that people who have old URLs get redirected properly to the new ones. However, they also have a bunch of old pages that they aren't using anymore. They don't really care if people still try to get to them (because they don't think many will), but they do care about the overall search engine rankings. I know that if a site has internal 404 links, that could hurt rankings. However, can external links that return a 404 hurt rankings? Ryan
Technical SEO | GreenHatWeb
Internal links - is a div onclick still not followed by Google?
Hi Mozzers, Does anyone know if div onclick links are still not followed by Google? From a UX perspective, making a container div clickable will work well, but I don't want this link to absorb any link juice, as the text within the div would make much better anchor text, so I would rather that link received the juice. Is the above the best approach to this issue of UX vs SEO? Many thanks, Justin
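One common way to square that UX-vs-SEO trade-off (a sketch, not something from the thread - the markup and paths are hypothetical) is to keep a normal crawlable <a> inside the container so it carries the anchor text, and let script make the whole div clickable for users:

```html
<!-- Hypothetical markup: the <a> is the link Google follows and the
     anchor text it sees; the div's click handler is purely for UX. -->
<div class="card" data-href="/some-page">
  <h3><a href="/some-page">Descriptive anchor text</a></h3>
  <p>Teaser copy for the card…</p>
</div>

<script>
  document.querySelectorAll("div[data-href]").forEach(function (card) {
    card.addEventListener("click", function () {
      // JavaScript navigation - not an <a href>, so it is generally
      // not treated as a followable link and absorbs no link equity.
      window.location.href = card.dataset.href;
    });
  });
</script>
```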
Technical SEO | JustinTaylor88
Can name="author" register as a link?
Hi all, We're seeing a very strange result in Google Webmaster Tools. In "Links to your site", there is a site which we had nothing to do with (i.e. we didn't design or build it) showing over 1,600 links to our site! I've checked the site several times now, and the only reference to us is in the rel="author" tag. Clearly the agency that did their design / SEO nicked our meta, forgetting to delete or change the author tag!! There are literally no other references to us on this site, and there hasn't ever been (to our knowledge, at least), so I'm very puzzled as to why Google thinks there are 1,600+ links pointing to us. The only thing I can think of is that Google will recognise name="author" content as a link... seems strange, though. Plus the content="" only contains our company name, not our URL. Can anybody shed any light on this for me? Thanks guys!
Technical SEO | RiceMedia
We have a ton of legacy links that include /?ref=tracking-goes-here. We need to reconcile this; can the canonical tag be used to fix this? How?
www.firehost.com/?ref=pressrelease example - http://cl.ly/2O1d1x2m3b1b3K1K0h2J
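Assuming the goal is to consolidate every ?ref= variant onto the clean URL, a minimal sketch using the firehost.com example from the question (the tag goes in the page's <head>, which is served regardless of the tracking parameter):

```html
<!-- Served on www.firehost.com/?ref=pressrelease and every other
     ?ref= variant: points search engines at the clean URL so the
     tracked duplicates consolidate there. -->
<link rel="canonical" href="https://www.firehost.com/">
```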
Technical SEO | FirePowered