Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Site Architecture: Cross Linking vs. Siloing
-
I'm curious to know what other mozzers think about silos...
Can we first all agree that a flat site architecture is the best practice? Relevant pages should be grouped together. Shorter, broader and (usually) therefore higher-volume keywords should be towards the top of each category. Navigation should flow from general to specific. Agreed?
As Google says on page 10 of their SEO Starter Guide, "you should think about how visitors will go from a general page (your root page) to a page containing more specific content." OK, we all agree so far, right? Great!
Enter my question: Bruce Clay (among others) seems to recommend siloing as a best practice, while Richard Baxter (and many others at SEOmoz) seems to view silos as a problem.
Me? I've practiced (relevant) internal cross linking, and have intentionally avoided siloing in almost all cases.
What about you? Is there a time and place to use silos? If so, when and where? If not, how do we rectify the seemingly huge differences of opinion between expert folks such as Baxter and Clay?
-
I am confused. So let's say I have an ecommerce site that has 20 types (books, toys…), 20 categories under each type, 20 subcategories under each category, and thousands of products under each subcategory.
When we say go flat, is it ideal to go all the way, like http://www.website.com/type (20 of these), http://www.website.com/category (400 of these), http://www.website.com/subcategory (8,000 of these), and thousands of product pages, so that no page is more than 1 directory level down? Does this mean flat architecture?
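Worth noting: "flat" is usually about click depth from the home page, not the number of directories in the URL. As a hypothetical sketch (the site and URLs below are made up for illustration), click depth can be measured with a breadth-first search over the internal link graph:

```python
from collections import deque

def click_depths(link_graph, root):
    """Breadth-first search over an internal link graph.

    link_graph: dict mapping each URL to the list of URLs it links to.
    Returns a dict of URL -> minimum number of clicks from the root.
    """
    depths = {root: 0}
    queue = deque([root])
    while queue:
        page = queue.popleft()
        for target in link_graph.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical site: every URL is one directory deep,
# yet the product page is still 3 clicks from home.
site = {
    "/": ["/books", "/toys"],
    "/books": ["/fiction"],
    "/fiction": ["/some-novel"],
}
print(click_depths(site, "/"))
# {'/': 0, '/books': 1, '/toys': 1, '/fiction': 2, '/some-novel': 3}
```

So a site can have short URLs and still not be "flat" in the sense that matters for crawling: what counts is how many clicks it takes to reach a page, which the cross-links (or lack of them) control.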
-
The breadcrumb is one more signal about where you are: in the process, on the site, in the section. Google also likes breadcrumbs and will often show the breadcrumb navigation links right in search results. If you don't have them, Google sometimes tries to emulate breadcrumbs in search results, but since you're not feeding them an actual breadcrumb, theirs can sometimes guess wrong at the keywords shown.
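As an aside, one way to feed search engines an explicit breadcrumb trail is schema.org BreadcrumbList structured data. Here's a minimal hypothetical generator (the example.com URLs are placeholders), built from a list of (name, url) pairs:

```python
import json

def breadcrumb_jsonld(trail):
    """Build schema.org BreadcrumbList JSON-LD from (name, url) pairs."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": [
            {
                "@type": "ListItem",
                "position": i,   # positions are 1-based per the spec
                "name": name,
                "item": url,
            }
            for i, (name, url) in enumerate(trail, start=1)
        ],
    }, indent=2)

trail = [
    ("Home", "https://www.example.com/"),
    ("Hotels", "https://www.example.com/hotels"),
    ("Tallahassee", "https://www.example.com/hotels/tallahassee"),
]
print(breadcrumb_jsonld(trail))
```

The resulting JSON-LD goes in a `script` tag of type `application/ld+json` on the page, alongside the visible breadcrumb, so the labels Google shows are the ones you chose.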
-
Hi Alan...
Is there a case study where a silo is broken down and analyzed that I can use to understand this siloing concept?
My understanding of a silo is this: if, for example, you have a grocery store website, you create a dairy section where all related dairy products are found, a deli department where all the cold cuts go, and so on, with all the pages themed from the top down, while trying to keep the silo within 3 clicks of home.
The breadcrumb: I'm not sure how this comes into play. If I keep the site within 3 clicks of home, then any time someone needs to get back to where they started, they can do it. So how does the breadcrumb help if I'm already trying to keep a 3-click structure for easy navigation and an easy exit back to the beginning?
-
Hey Todd, thanks.
While I definitely agree about having tightly themed categories, I'm not quite sure I am sold on using a silo. Correct me if I'm wrong here please, but isn't a silo when you don't cross link detail pages (within the same category) with each other? I think Alan feels the same way, or perhaps I've misunderstood.
Check this post about the importance of link architecture by Google. Specifically, the last Q&A.
-
I agree with Alan, and would like to add that I believe the silo method can better increase the proximity of closely connected keyword clusters. In other words, in a silo structure, tightly knit keywords naturally support each other, passing theme and relevance value to each other by default when a strong supportive breadcrumb is in place. With a flat site architecture, extra programming often needs to be done to establish those relationships as they relate to internal pages.
-
Anyone have anything else they'd like to toss into the discussion?
Any examples you'd like to share of detail page linking vs. silos?
[edit] Just found this (old) blog post by Google about the importance of (internal) link architecture... I quote:
Q: Let's say my website is about my favorite hobbies: biking and camping. Should I keep my internal linking architecture "themed" and not cross-link between the two?
A: We haven't found a case where a webmaster would benefit by intentionally "theming" their link architecture for search engines. And, keep in mind, if a visitor to one part of your site can't easily reach other parts of your site, that may be a problem for search engines as well.
-
Exactly. "Tags" and "materials" are not exactly top-level category stuff.
-
I found a relatively "ghetto" approach to siloing using WordPress, since I don't have the time or technical skill to implement it perfectly. Using a specific plugin, it compares posts and references a set number of related posts at the bottom, creating a link structure similar to a silo. It's not perfect, but it is easy.
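The idea behind that kind of related-posts plugin can be sketched in a few lines. This is a hypothetical illustration of the approach, not any specific plugin's code; it scores relatedness by tag overlap (Jaccard similarity) and links the top few matches:

```python
def jaccard(a, b):
    """Tag-overlap similarity between two tag sets (0.0 to 1.0)."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def related_posts(post, all_posts, limit=3):
    """Return up to `limit` other post titles, most similar tags first."""
    scored = [
        (jaccard(post["tags"], other["tags"]), other["title"])
        for other in all_posts
        if other["title"] != post["title"]
    ]
    scored.sort(reverse=True)
    return [title for score, title in scored[:limit] if score > 0]

# Made-up posts for illustration.
posts = [
    {"title": "Siloing basics",    "tags": ["seo", "architecture", "silo"]},
    {"title": "Flat architecture", "tags": ["seo", "architecture"]},
    {"title": "Keyword research",  "tags": ["seo", "keywords"]},
    {"title": "Cooking pasta",     "tags": ["recipes"]},
]
print(related_posts(posts[0], posts))
# ['Flat architecture', 'Keyword research']
```

Because tightly tagged posts score highest, the links that land at the bottom of each post naturally stay within the same theme, which is what makes the result "similar to a silo" without any manual structure.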
-
Yeah, you're right. I would imagine those links aren't relevant on store pages, and would definitely distract some people.
On their product pages, though, they use some cross linking to relevant topics. But I'm sure it's at the bottom, out of sight, so as not to distract people. So I would imagine those are mostly there for SEO. Would you agree?
-
Etsy's got a good structure with their category and sub-category sidebar that balances SEO and user experience. Note, though, that when you get deep into the individual Etsy stores, that's gone, because it would dilute the individual store owner's account focus and distract users.
-
I think you're right Alan, that makes great sense. Thanks. Do you think Etsy's sidebar is a good compromise between the two? I'm sure testing each site is the best way to figure out what users prefer on that specific site. But in general, do you think that's a good balance to use in order to keep too many links off the page, yet still keep detail pages within a category linking to each other?
-
Having all listed and linked is ideal for SEO; however, you rapidly cross into usability problems if there are more than a handful. (Would you want 50 or a hundred links in a sidebar nav?) When a site is so big that there are more than a handful that could be linked from that sidebar, it's actually best practice to NOT have any others linked from the sidebar, or else you confuse users even more (by listing only some, but not all). User experience is paramount when making these decisions, even at the expense of SEO in some cases. And if that happens, other tactics need to be employed, like having a separate, dedicated funnel for "featured properties". That requires even more unique content in that funnel, but it at least boosts the ranking value for the properties included.
-
Agreed.
I spent some time working on a hybrid silo structure on my blog, and on proper cross linking in the main area of the site, thanks to the discussion here.
-
Sorry for the confusion Alan, and thank you very much for the discussion.
To help clarify for others reading this discussion (and for myself), are we both agreeing that: in the attached image it is an 'SEO AND usability best practice' for the hotel detail pages inside the Tallahassee category/directory to link to each other?
*Of course, there are always caveats, such as the maximum outbound link limit recommended by Google, etc.
But as a general practice, would you have "Hotel 1", "Hotel 2" and "Hotel 3" (inside the "Tallahassee" category) link to one another?
-
This is a great question and an even better discussion.
Special thanks to Alan for sharing all of the details.
-
Indeed Alan, that's good advice we all should follow. Thanks. I'll follow suit from here on.
-
You're dead-on accurate about the need for, and importance of, "consensus" in helping new people get started. The trick is helping them find enough truly experienced people who have done that testing on a wide enough variety of sites, along with plenty of disclaimers plainly stated in all such discussions. It's why I strive to always say "in my experience"...
-
If I have a category California Hotels, sub-category San Francisco Hotels, then having links in a sub-navigation bar to each (if there's only a handful), each of those links reinforces the strength of the top level Hotels, 2nd level California, and third level San Francisco related phrases. They all support each other.
If, on the other hand, I have a link to "nearby hotels", that implies I'm going from a single hotel details page to a uniquely filtered "geo" category page that shows hotels based on some criteria - it might be all San Francisco, or all within a distance radius, or all within a zip code radius.
Even if it's all other hotels in San Francisco, it's not a link pointing to another (or several) same-level page(s). It's pointing one layer higher.
That's a filter more than a properly constructed category drill-down. And it implies that the page I'm on will NOT be listed on that target of the "nearby" link.
-
Also agreed. However, when new SEOs enter the sphere, they must start somewhere. And clearly there's value in studying others' work to help clarify, expand or even challenge one's own hypotheses and practices. I also avoid implementing a tactic/strategy on a paid client project if reputable SEOs and/or the community as a whole recommend against it. I may try it on my personal site, but not a customer's. Thanks for all your help Alan.
-
Just to clarify regarding my input: my perspective is based on my experience with client sites at all scales, from small and medium to large and mega sites.
To me it's more important to see how things work on our own sites and evolve them over time, as compared to purely taking what others do or say as its own reason for taking action.
-
Respectfully, what's the difference between the nearby hotels example and the cars example?
More specifically, if these 'nearby hotels' links might dilute that article's topical focus, why wouldn't a link to 'Mercedes' from a 'BMW' page do the same?
Thanks Alan.
-
I actually don't, because I've always thought it was a bad idea. But it seems other folks don't think it's so bad under the right circumstances.
I'd be interested in seeing a good example of an effective silo as well....
anyone?
-
Thanks Dave. This is exactly why I posed this discussion.... it seems as if a lot of us are getting something different from these architecture type posts.
I think it has to do with making same-level detail pages link to each other. Especially if you link to them using the anchor text they're trying to rank for.
For example, what I get out of an article like Richard Baxter's post on SEO Architecture, is that detail pages should link to each other, and that Silos should not be used. And the more architecture posts I read on SEOmoz, SEOgadget and Distilled... the more I think it's a 'best practice'.
That said, it seems from these comments that some folks read those articles differently. I think this is a serious discrepancy that we SEOs should address.
-
The slides will be going up at some point in the next few days, and I'll have a follow-up post that includes the notes for each slide. In the meantime, I did an article on Search Marketing Wisdom yesterday directly related to the last slide in that deck.
-
The "nearby hotels to consider" feature is a user thing. It may or may not pass quality PageRank.
In some cases, that extra link could dilute the topical focus/strength of the page it's on.
So if I get to resort X's page and there's a link to "nearby hotels", there's an implied relationship. Good for users. But for SEO, sure, it's related stuff, yet maybe not laser-focus related.
Another example is blog posts that end with a "related articles" box containing three or five links to other articles. Maybe they're highly related, maybe loosely. If they're loosely related, sure, it MIGHT help users. Yet it probably dilutes that article's topical focus.
-
Agreed, absolutely agreed! Thank you very much Alan!
PS. Could you share the slides from your presentation at SMX Advanced please? If not, how about a link to a post of yours?
-
Well it depends. Is there only one BMW or are there several? If there is only one, then yes - cross link all the luxury detail pages. If there are several, then that's the level for cross linking detail pages, even though it's so deep. If that's the case though, you'd better get inbound links pointing to the parent luxury category page.
And in any regard, don't just have a bunch of links on those category pages - have descriptive paragraph content focused on that category's primary topical focus.
-
Great find on that post. It lays it all out. As long as the silos are thin (not more than 2 layers beneath the home page), they can add extra ranking pages with comparatively minimal work. Rand talks about eliminating the bottom layer of the hierarchy to push the content up a level and make the resulting pages extremely stout. The major problem is always going to be the end of the chain. He calls them PageRank sinks.
-
Ultimately, the silo process just takes a bit of time for each new post, making sure it links to another category?
I know there are plugins for WordPress that will do automatic linking based on any word you input, linking a set or random number of times throughout your site.
It could be worth setting up, just including some keyword phrases in the correct articles to get the link process going properly.
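That keyword-linking idea can be sketched roughly like this (a hypothetical illustration, not any particular plugin): replace up to a set number of occurrences of a phrase in a post's body with a link.

```python
import re

def auto_link(text, keyword, url, max_links=1):
    """Replace up to `max_links` occurrences of `keyword` with an HTML link.

    Case-insensitive, whole-word match; the matched text's original
    casing is preserved inside the link.
    """
    pattern = re.compile(r"\b" + re.escape(keyword) + r"\b", re.IGNORECASE)
    return pattern.sub(
        lambda m: f'<a href="{url}">{m.group(0)}</a>',
        text,
        count=max_links,
    )

# Made-up body text and URL for illustration.
body = "Siloing groups related pages. Critics say siloing limits cross links."
print(auto_link(body, "siloing", "/guides/siloing", max_links=1))
# Only the first "Siloing" gets linked; the second stays plain text.
```

Capping `max_links` matters: the caution elsewhere in this thread about diluting a page's topical focus applies just as much to automated links as to hand-placed ones.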
-
Take for example, a resort detail page on oyster.com. They have a section called "nearby hotels to consider", which I believe serves two purposes...
#1) it's likely helpful to users, as most people don't restrict themselves to staying at just one specific resort, and
#2) it helps search engines flow PageRank, and crawl and index other pages in the 'Aruba' category. What I can't figure out is: what benefit would there be to not including these links to nearby hotels? (Except perhaps on checkout process pages, of course.)
What if the 'Radisson' resort, for example, got a ton of inbound links and the 'Westin' resort had only a few? Well, you could cross link them and help the Westin page rank... and simultaneously show your users more relevant options.
-
Yeah, I agree Alan. I don't usually think PageRank sculpting is a good return on invested time either. One could, for example, be building links or generating content with the time/resources instead.
I just re-read what you said, "Individual services details pages should cross-link to each other within that service section at that level though, for usability." To be sure I understand what you're saying.... if your architecture is for example
vehicles -> cars -> luxury -> bmw
vehicles -> cars -> luxury -> mercedes
vehicles -> cars -> luxury -> jaguar
then bmw, mercedes and jaguar would link to each other... correct?
-
If anything, sometimes a silo structure is not the best for user experience, or the drill-down goes too deep, into ever-thinner content, to the point where it's so thin as to have a negative impact on SEO and user experience.
-
I agree with Rand's '09 article in general, however there are some things I think take it a bit too far (such as redirecting PDF documents for link juice). If a PDF is truly the most relevant content on a topic, I believe it should be indexed.
The biggest factor is that if we get completely bogged down in this process just for SEO sake, we lose focus on user experience.
It's right up there with page and link sculpting - to me, it's a waste of time and harms user experience. And the time spent going that far is, in my opinion, in 2011 much better spent on other SEO tactics. Not just because Google has changed how they deal with nofollow links.
-
What are the perceived negative effects, if any, of doing a silo structure?
-
I spent some time using a silo plugin for a WordPress site early on, and also spent some time with a theme that had a silo format, but ended up switching over to a flat site architecture. I just didn't like the WordPress theme that used silos, and the plugin seemed like junk.
I'd love to take a look at a well run silo site if you know of one.
-
Not at all, thanks Alan. I think we're in agreement.
As long as one is not exceeding Google's approximate outbound links per page... and as long as the navigation makes sense to the users... specific detail pages within the same category should be linked to each other. Is that what you're saying as well?
Here's one example of why I think this is best for indexation reasons. I've attached an image of the page where I circled some stuff
What do you think Alan?
-
Oh, wait. I just re-read your question about not wanting detail pages to link to one another...
If I'm at a sub-category, I would not want, nor need, every individual product/event page in that group to link to each other. Individual services details pages should cross-link to each other within that service section at that level though, for usability.
Does that make sense? Or did I just confuse you?
-
Yes - if there are X number of pages within a section, it becomes too many to reasonably link from a sub-nav. X being a subjective value that needs to be determined case by case.
Ideally, it might lead to yet one more sub-level (such as sub-sub-categories), or to pagination (not blocked from search). That itself is challenging to do in the right manner so as to avoid going too deep or too thin.
There's no other reason I can think of though, and no other method I'd consider a best practice.
-
Thank you dignan99. What's your opinion of silos? Do you like to cross link detail pages within a category to each other, or even category pages to each other?
-
I definitely agree EGOL. We like to meticulously plan out sites and SEO/PPC campaigns prior to launch, but over time a site's architecture definitely needs to be revisited. Usually at that time, we try to also implement any more advanced programming knowledge we might have accumulated to help ease the pain as well
Thanks EGOL!
-
Thank you Dave.
I guess it comes down to flowing PageRank within a category vs. restricting PageRank to the pages that have more links. Any idea why someone would prefer the latter?
-
Thanks Alan. You mentioned, "where all the pages in that section have a link to all the other pages in that section".........
Can you think of any reason why you would not want detail pages within a category to link to one another?
-
I really enjoy topics like this, thanks for asking such a great question.
-
The problem that a lot of people have is that their site grows in unexpected directions. So the problem is not so much deciding upon the structure but more a problem of making the most of the expanding beast!
-
Great question. While everyone has their school of thought, both methods have their benefits. I tend to favor flat architecture with targeted cross linking; I guess you could call it a hybrid strategy. I begin with a totally flat architecture and silo where it makes sense for rankings and for user navigation. It's all about logical grouping. And don't forget: the pages must all be link-worthy on their own. If the pages are all strong enough to generate links, the problem tends to take care of itself.
-
There's never one perfect solution; however, here's the bigger issue: some people hear "flat" and take it to the extreme, which is a terrible concept in 2011.
If you go too flat, you muddy up the proper group relationships. This is where siloing comes in.
In my presentation at SMX Advanced this week, one of the many methods I recommend for "sustainable SEO" is to group your content, and reinforce that group relationship in URL structure, then with breadcrumbs, and finally with section-level navigation, where all the pages in that section have a link to all the other pages in that section, but where that specific sub-navigation is replaced or disappears as appropriate when you leave that section.
If you've got more than a handful of pages in a section, you should definitely go deeper.
The trick is knowing how wide, how deep to go. It's an art as much as a process studying site data over time.
Another factor is the competitive landscape for a particular niche market. The more competitive, the more important this concept becomes.
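The section-level navigation rule described above (every page in a section links to all the other pages in that section, and the sub-nav disappears once you leave the section) can be sketched as follows. This is a hypothetical illustration; the section names and URLs are made up:

```python
def section_subnav(page, sections):
    """Return links to every other page in `page`'s section.

    sections: dict mapping section name -> list of page URLs.
    Pages outside any section get an empty sub-nav, mirroring the
    rule that the sub-navigation disappears when you leave a section.
    """
    for pages in sections.values():
        if page in pages:
            return [p for p in pages if p != page]
    return []

sections = {
    "tallahassee-hotels": [
        "/hotels/tallahassee/hotel-1",
        "/hotels/tallahassee/hotel-2",
        "/hotels/tallahassee/hotel-3",
    ],
}
print(section_subnav("/hotels/tallahassee/hotel-2", sections))
# ['/hotels/tallahassee/hotel-1', '/hotels/tallahassee/hotel-3']
print(section_subnav("/about", sections))
# []
```

Note the URL structure itself repeats the group relationship (/hotels/tallahassee/...), and a breadcrumb built from those same path segments would reinforce it a third time, which is exactly the layered approach described in the post above.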