Best strategy for "product blocks" linking to sister site? Penguin Penalty?
-
Here is the scenario -- we own several different tennis-based websites and want to maximize traffic between them. Ideally we would have them ALL in one site/domain, but two of the three are a partnership we own 50% of, which is why they are kept on separate domains. The big question is: how do we link the "products" from the two other websites without looking spammy? Here is the breakdown of the sites:
Site1: Tennis Retail website --> about 1200 tennis products
Site2: Tennis team and league management site --> about 60k unique visitors/month
Site3: Tennis coaching tip website --> about 10k unique visitors/month
The interesting thing was that right after we launched the retail store website (site1), Google was cranking up and sending upwards of 25k search impressions/day within the first 45 days. Orders kept trickling in and it was doing well overall for a first launch. Google impressions peaked at about 60 days post-launch and then started trickling down farther and farther; we're now at about 3k-5k impressions/day. Many keyword phrases that were originally on page 1 (positions 6-10) are now on pages 3-8 instead.
The next step was to start putting "product links" (3 products per page) on site2 and site3 -- about 10k pages in total, with about 6 links per page pointing off to the product pages (1 per product and 1 per category). We divided up about 100 different products to be displayed, so this works out to roughly 2k links per product, depending on the page.
FYI, those original 10k pages from site2 and site3 already rank very well in Google and have been indexed for the past 2+ years. The most common word on those sites is "tennis," so the topics are closely related.
Our rationale was "all the websites are tennis related," and we figured that links to the latest and greatest products would be good for our audience. Pre-Penguin, we also figured this strategy would help us rank for these products when users search for them.
Since traffic has gone down and down from its peak 45 days ago, we're thinking Penguin doesn't like all these links -- so what do we do now?
How do we fix it and make the Penguin happy? Here are a couple of my thoughts:
1. Remove the "category link" from our "product grouping," which would cut the link count by a third.
2. Place a "nofollow" on all of the remaining "product links." This would still let us get the user clicks while visitors are on those pages (see the rough sketch after this list).
3. On the homepages of site2 & site3, place 3 core products that rotate frequently (weekly) and showcase the latest products/deals. The thought is to NOT use "nofollow" on these links since it is the homepage and only about 5 links overall.
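For option #2, here is a rough sketch (in Python, just for illustration -- the domain name, markup, and helper are placeholders I made up for this post, not our actual setup) of how we could programmatically add rel="nofollow" to only the cross-site product links in a rendered page:

```python
# Hypothetical sketch: mark only the cross-site product links as nofollow.
# The retail domain below is a placeholder, not our real site1 domain.
from urllib.parse import urlparse
from bs4 import BeautifulSoup

RETAIL_DOMAIN = "site1-tennis-retail.example.com"  # placeholder for site1

def nofollow_cross_site_links(html: str) -> str:
    """Return page HTML with links to the retail site marked rel="nofollow"."""
    soup = BeautifulSoup(html, "html.parser")
    for a in soup.find_all("a", href=True):
        if urlparse(a["href"]).netloc.endswith(RETAIL_DOMAIN):
            rel = set(a.get("rel") or [])  # keep any existing rel values
            rel.add("nofollow")
            a["rel"] = sorted(rel)
    return str(soup)

# Example: passing '<a href="https://site1-tennis-retail.example.com/product/wilson/prostaff/">Pro Staff</a>'
# returns the same link with rel="nofollow" added.
```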
Heck, part of me has debated taking our top 1,000 pages (of the 10k) and putting the links ONLY on those, distributing about 500 products across them, so there would be only 2 links per product -- though it would still mean about 4k links going over. Still, I'm thinking #2 above could be better?
Any other thoughts would be great!
Thanks,
Jeremy
-
If you are saying you have 75k links cross-linking from your domains, then I would agree. Product websites have been under attack for a while now. Tough one.
-
Thomas,
Thanks for your thoughts and good questions...
All 3 of the websites are hosted on different dedicated Amazon AWS instances, each with its own independent IP address. Yeah, it is possible Google could be picking up on something here, but I'm not sure.
Content is decent overall -- it does need more, though. With that many products, some of the content is the "default" manufacturer text, so we will need to beef it up and make the text more unique. As for product uniqueness, all the online tennis retailers sell the same products, so the key will be unique text, since the names/prices are all set by the manufacturers.
All of our product URLs are "search engine safe" URLs following this methodology: /product/[mfg-name]/[product-name]/ -- I am thinking we are pretty good on the URL front.
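Just to illustrate that pattern (this helper is hypothetical, written for this post, not our actual code), the slugs are built roughly like this:

```python
# Hypothetical helper illustrating the /product/[mfg-name]/[product-name]/ pattern.
import re

def slugify(text: str) -> str:
    """Lowercase a name and collapse anything non-alphanumeric into hyphens."""
    return re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")

def product_url(mfg_name: str, product_name: str) -> str:
    """Build the search-engine-safe product URL path."""
    return f"/product/{slugify(mfg_name)}/{slugify(product_name)}/"

# Example:
# product_url("Wilson", "ProStaff Six.One Tennis Racquet")
# -> "/product/wilson/prostaff-six-one-tennis-racquet/"
```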
Unfortunately, I think Google is making product pages harder to rank now, since there normally isn't a lot of "content" to write about a Wilson ProStaff Six.One Tennis Racquet. My guess is this is also why they are charging for Google Merchant.
I am still thinking we should reduce the total number of followed links in our cross-linking, since right now, with the various product links from site2 and site3, the count is about 75k. Plus, we'll also keep working on more unique content.
-
In my opinion, linking from sister sites can be done if the links are useful. Seomoz.org links to Mozcast.com. I link from my offsite blog to my main site. I believe that when a link is relevant, you can link it over. If the links are in the footer, or placed by a program that links every keyword mention to the other site, then it looks bad. The best place for links is in the content, where they are relevant. Diversify the anchor text. Make it more natural. Your hosting on these sites could also be affecting your linking. Are all the links from the same IP address? Are all of the links coming from within your 3 sites?
There are many other reasons your rank could be dropping: duplicate content, thin content, bad links... For most product websites, it seems the product descriptions are either too thin or duplicated from all the other product descriptions for that item online. How unique is your product content?
Whenever I approach a poorly performing website, I try to evaluate all the on-site elements first. After I've fixed all of the on-site elements, then I will look at linking problems.