External resources page (AKA a satellite site) - is it a good idea?
-
So the general view on satellite sites is that they're not worth it because of their low authority and the amount of link juice they provide.
However, I have an idea that differs slightly from the standard satellite site model. My client's website is in a particular niche, but many of the websites I have identified for potential links are not interested in linking because the client is a private commercial company; many will only link to charities or simple resource pages. I created a resource section on the client's website, but many are still unwilling to link to it because it is part of a commercial site. The website is performing well and is banging on the door of page one for some really competitive keywords, so a few more links would make a massive difference.
One idea I have is to create a standalone resource website that links to the client's website. It would be easy to get links to this resource from sites that would flat-out refuse to link to the main website. This would increase the authority of the resource site and pass more link juice to the primary website.
Now, I know the link juice from this resource site would not be as valuable as links pointing directly to the primary website, but would it still be a good idea? Or would my time be better spent trying to earn a handful of links directly to the client's website? Alternatively, I could host the resource on a sub-domain, but I'm not sure that would be as successful.
-
Sort of. It's per page, not per website.
-
Thanks Michael, you raise some good points.
I just want to raise one point though. I was under the impression that the amount of link juice a link passes depends on the total number of outbound links on the linking site (I'm sure I recall Matt Cutts saying something along these lines). Therefore, a link from a local newspaper would be good, but watered down by the total number of outbound links. A site with lower authority but fewer outbound followed links (or one, in this case) would still be of value. Is this correct, or am I talking nonsense?
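To make the "per page, not per website" point above concrete, here is a minimal sketch of the classic PageRank damping model. This is an illustration only: the page scores, link counts, and damping factor are hypothetical assumptions, and Google's real algorithm is far more complex than this. In the simplified model, the equity a link passes is the linking page's score divided by that page's outbound link count.

```python
# Simplified PageRank-style model (assumption: real ranking is far more
# complex). Equity passed by one link = damping * linking page's score
# divided by the number of outbound links on that page.

DAMPING = 0.85  # the damping factor from the original PageRank paper

def equity_passed(page_score: float, outbound_links: int) -> float:
    """Equity one outbound link passes under the simplified model."""
    return DAMPING * page_score / outbound_links

# Hypothetical numbers: a strong newspaper page with many outbound links
# versus a weak satellite page whose only followed link is to the client.
newspaper_link = equity_passed(page_score=8.0, outbound_links=40)
satellite_link = equity_passed(page_score=1.0, outbound_links=1)

print(f"newspaper link passes: {newspaper_link:.3f}")  # 0.170
print(f"satellite link passes: {satellite_link:.3f}")  # 0.850
```

Under these made-up numbers, the weaker page with a single outbound link passes more equity per link, which is the intuition behind the question, though in practice the strong page's score would typically dwarf the satellite's.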
-
I don't think you'd run afoul of Google by doing this, as long as you kept it to one (or a small handful, say, fewer than 10) satellite sites. And if those sites can rank on their own for terms and "feed" leads (not links) to the commercial site, then that might be a good strategy.
But the question really is, I think: is it worth the effort for the link juice that eventually trickles down to the commercial site? Think of it this way: how much would your rankings change if your commercial site got one link from the city newspaper's site? It'd probably make a difference, sure...but probably not a massive one (unless that was the only decent, strong link you had). Now...how much content marketing and link building would you have to do to get your satellite site to the same domain authority as the city newspaper's site? Ummmm...a lifetime's. Well, probably 10-20 years anyway. So the answer really comes down to this: you can do a TON of work on the satellite site, and you're never going to get as much benefit from that work as you'd get from getting your local newspaper to do a story on your company. Or from getting quoted in it by a reporter (speaking of that...check out HARO, as long as you only respond to real reporters writing for real publications).