Best way to consolidate link juice
-
I've got a conundrum that I would appreciate your thoughts on.
I have a main container page listing a group of products, linking out to individual product pages.
The problem is that all the product pages target exactly the same keywords as the main container page listing all the products.
Initially all my product pages were ranking much higher than the container page, since there was little individual text on the container page, and I believe it was being hit with a duplicate content penalty.
To get around this, I have incorporated a chunk of text from each product listed on the container page.
However, that now means most of the content on an individual product page also appears on the container page, so I am worried that the product pages will get hit with a duplicate content penalty, since the same content (or most of it) is on the container page.
Effectively I want to consolidate the link juice of the product pages back to the container page, but I am not sure how best to do this.
Would it be wise to rel=canonical all the product pages back to the container page? To rel=nofollow all the links to the product pages? Or possibly some other method?
Thanks
-
Ok, it sounds like there is more going on here than just SEO. You have to consider your brand, what's best for your customers, and the overall branding of your website.
However, this is an SEO question, so I will give an SEO answer. If I were you, I would target the product pages. On a dedicated page you have many more on-page optimization options. You can use title tags, proper URLs, H1 tags... you get the point. If you try to use one page to target too many products, you are going to dilute your on-page optimization.
If your home page has more authority, use it to target more competitive keywords, and maybe just one or two products, and then link out to your other products.
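As a quick illustration of those on-page options, here is a minimal sketch of what a dedicated product page's markup might look like (the URL and product name are hypothetical, not from the site in question):

```html
<!-- Hypothetical product page: one page, one product, one keyword target -->
<head>
  <title>Blue Widget Pro | Example Store</title>
  <!-- A self-referencing canonical tells Google this URL is the preferred version -->
  <link rel="canonical" href="https://www.example.com/products/blue-widget-pro" />
</head>
<body>
  <h1>Blue Widget Pro</h1>
  <!-- Unique product description here, not duplicated word-for-word on the container page -->
</body>
```

Each product page gets its own title, URL, and H1 targeting that product's keywords, which is exactly the on-page signal a single container page cannot provide for dozens of products at once.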
Related Questions
-
Best way to permanently remove URLs from the Google index?
We have several subdomains we use for testing applications. Even if we block them with robots.txt, these subdomains still appear to get indexed (though they show as blocked by robots.txt). I've claimed these subdomains and requested permanent removal, but it appears that after a certain time period (6 months?) Google will re-index them (and mark them as blocked by robots.txt). What is the best way to permanently remove these from the index? We can't use a login to block access because our clients want to be able to view these applications without needing to log in. What is the next best solution?
Intermediate & Advanced SEO | nicole.healthline
-
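One widely used approach for this situation (sketched here for Apache with mod_headers, assuming a hypothetical test subdomain) is to stop blocking the subdomain in robots.txt and instead serve an X-Robots-Tag noindex header. Google can only honor a noindex directive on pages it is allowed to crawl, which is why robots.txt-blocked URLs keep reappearing as "blocked" instead of dropping out of the index:

```apache
# Hypothetical .htaccess at the root of a test subdomain (e.g. test.example.com).
# Important: remove the robots.txt Disallow for these URLs first -- Googlebot
# must be able to crawl the pages to see this header.
<IfModule mod_headers.c>
  Header set X-Robots-Tag "noindex, nofollow"
</IfModule>
```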
Does link juice pass along the URL or the folders? 10yr old PR 6 site
We have a website that is ~10 years old and a PR 6. It has a bunch of legitimate links from .edu and .gov sites. Until now the owner has never blogged or added much content to the site. We have suggested that to grow his traffic organically he should add a WordPress blog and get aggressive with his content. The IT guy is concerned about putting a WordPress blog on the same server as the main site because of security issues with WP. They have a bunch of credit card info on file. So, would it be better to just put the blog on a subdomain like blog.mysite.com, OR host the blog on another server but have the URL structure be mysite.com/blog? I want to pass as much juice as possible. Any ideas?
Intermediate & Advanced SEO | jasonsixtwo
-
Panda Recovery - What is the best way to shrink your index and make Google aware?
We have been hit significantly by Panda and assume the reason is our large index, with some pages holding thin/duplicate content. We have reduced our index size by 95% and have done significant content development on the remaining 5% of pages. For the old, removed pages, we have installed 410 responses (page no longer exists) and made sure that they are removed from the sitemap submitted to Google; however, after over a month we still see Google's spider returning to the same pages, and Webmaster Tools shows no indication that Google is shrinking our index size. Are there more effective or automated ways to make Google aware of a smaller index size in the hope of Panda recovery? Potentially using the robots.txt file, the GWT URL removal tool, etc.? Thanks /sp80
Intermediate & Advanced SEO | sp80
-
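For reference, a minimal sketch of the 410 setup this question describes (assuming Apache with mod_rewrite and a hypothetical retired directory); the [G] flag returns "410 Gone", which Google treats as a stronger removal signal than a 404:

```apache
# Hypothetical example: every URL under /old-products/ returns 410 Gone
RewriteEngine On
RewriteRule ^old-products/ - [G,L]
```

One caveat relevant to the robots.txt idea: blocking the same URLs in robots.txt prevents Googlebot from ever seeing the 410, so the two approaches work against each other.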
What is the best way to link between all my portals?
Hi, I own 12 different gambling portals; they all more or less work and feel like this one, Casinotopplisten. What is the best way for me to link between all of them? Since there is a lot going on in Google these days I haven't linked between the sites at all, but I feel that is somewhat of a waste. So here are my ideas so far, in ranked order:
1. Add a menu at the top right of the site, or in the footer, that links to the 10 different sites in the other languages. The link text would then only be "Norwegian, Swedish, English, etc."
2. Basically the same as above, but in addition linking to the "same page" in the other languages. Since all the sites share the same article set to start with, this can be done.
3. Don't do any linking between the sites and only link to them separately from our company blog/site.
4. Don't link at all.
I should add that all of these sites are on different IPs with different domains, and all in different languages. Hope someone can add their 2c on this one. Thanks!
Intermediate & Advanced SEO | MortenBratli
-
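For the idea of linking each page to its equivalent page in the other languages, the standard mechanism Google supports is hreflang annotations; a minimal sketch, assuming hypothetical domains for the different language versions:

```html
<!-- Placed in the <head> of each language version; every version lists the full set -->
<link rel="alternate" hreflang="no" href="https://www.example.no/some-page" />
<link rel="alternate" hreflang="sv" href="https://www.example.se/some-page" />
<link rel="alternate" hreflang="en" href="https://www.example.com/some-page" />
```

hreflang tells Google the pages are translations of each other, which is a separate question from whether (and how visibly) to cross-link the sites for users.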
What is the best way to hide duplicate, image embedded links from search engines?
Hello! Hoping to get the community's advice on a technical SEO challenge we are currently facing. [My apologies in advance for the long-ish post. I tried my best to condense the issue, but it is complicated and I wanted to make sure I also provided enough detail.] Context: I manage a human anatomy educational website that helps students learn about the various parts of the human body. We have been around for a while now, and recently launched a completely new version of our site using 3D CAD images. While we tried our best to design our new site with SEO best practices in mind, our daily visitors dropped by ~15% soon after we flipped the switch, despite drastic improvements in our user interaction metrics. SEOmoz's Website Crawler helped us uncover that we may now have too many links on our pages and that this could be at least part of the reason behind the lower traffic, i.e. we are not making optimal use of links and are potentially 'leaking' link juice. Since students learn about human anatomy in different ways, most of our anatomy pages contain two sets of links:
1. Clickable links embedded via JavaScript in our images. This allows users to explore parts of the body by clicking on whatever object interests them. For example, if you are viewing a page on muscles of the arm and hand and you want to zoom in on the biceps, you can click on the biceps and go to our detailed biceps page.
2. Anatomy terms lists (to the left of the image) that list all the different parts of the body shown in the image. This is for users who might not know where on the arm the biceps actually are. Such a user can simply click on the term "Biceps" and get to our biceps page that way.
Since many sections of the body have hundreds of smaller parts, many of our pages have 150 links or more each. And to make matters worse, in most cases the links in the images and in the terms lists go to the exact same page.
My question: Is there any way we could hide one set of links (preferably the anchor-text-less image-based links) from search engines, such that only one set of links would be visible? I have read conflicting accounts of different methods, from using JavaScript to embedding links in HTML5 tags. And we definitely do not want to do anything that could be considered black hat. Thanks in advance for your thoughts! Eric
Intermediate & Advanced SEO | Eric_R
-
Fading Text Links Look Like Spammy Hidden Links to a g-bot?
Ah, hello Mozzers, it's been a while since I was here. Wanted to run something by you... I'm looking to incorporate some fading text using JavaScript onto a site homepage, using the method described here: http://blog.thomascsherman.com/2009/08/text-slideshow-or-any-content-with-fades/ So, my question is: does anyone think that Google might see this text as a possible black-hat SEO anchor text manipulation (similar to hidden links)? The text will contain various links (4 or 5) that will cycle through one another, fading in and out, but to a bot the text may appear initially invisible, like so: style="display: none;"><a href="">Link Here</a> All links will be internal. My gut instinct is that I'm just being stupid here, but I wanted to stay on the side of caution with this one! Thanks for your time 🙂
Intermediate & Advanced SEO | PeterAlexLeigh
-
Link Juice - Lots of Pages
I have a site, PricesPrices.com, where I'm steadily building inbound links and PageRank. I have about 4,600 pages on the site, most of which are baby products in the baby gear sector. There are many outdated items that aren't really my focus, but they do pop up in long-tail search queries from time to time. My question is a pretty basic one. Theoretically, if a site has, say, 28/100 link juice, then as you go deeper and deeper into the site, the link juice is divided more and more. My question: is this really true, or just a concept? My thought is to hide many of the products that I don't really need to focus on, thereby passing more link juice to the products that remain, but I don't want to do that if it won't necessarily make the remaining pages rank higher or have more link juice. I also have to keep in mind the merchandising aspect of the site and providing a good user experience. If I only have 300 products on the site, there will be a ton of unhappy people who can't find the products they are looking for. Any thoughts and/or pointers in the direction of funneling that PageRank down into my site would be much appreciated. Thanks!
Intermediate & Advanced SEO | modparent
-
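The dilution this question describes can be sketched with a toy model. This is a drastic simplification of PageRank (real ranking involves far more than this one factor), purely to illustrate why trimming outdated pages concentrates juice on the survivors:

```python
def juice_per_link(page_juice: float, num_links: int, damping: float = 0.85) -> float:
    """Toy model: the juice each outlink receives from a page.

    A simplification of the PageRank formula -- illustrative only,
    not how Google actually computes rankings.
    """
    if num_links == 0:
        return 0.0
    return page_juice * damping / num_links

# Using the question's numbers: the same container juice spread over
# ~4600 product links vs. a trimmed catalog of 300.
print(juice_per_link(28, 4600))  # tiny amount per link
print(juice_per_link(28, 300))   # roughly 15x more per remaining link
```

Under this toy model, pruning or hiding low-value pages does raise the per-link share for the rest, but as the question notes, that has to be weighed against merchandising and user experience.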
We are a web hosting company and some of our best links are from our own customers, on the same IP, but different Class C blocks.
We are a web hosting company and some of our best links are from our own customers, on the same IP but in different Class C blocks. How do search engines treat the unique scenario of web hosting companies and linking?
Intermediate & Advanced SEO | FirePowered