How/why is this page allowed to get away with this?
-
I was doing some research on a competitor's backlinks in Open Site Explorer and I noticed that their most powerful link was coming from this page: http://nytm.org/made-in-nyc. I visited it and found that the page, carrying a PageRank of 7, is just a long list of followed links. That's literally all that's on the entire page - 618 links. Zero nofollow tags. PR7. On top of that, there's a link at the top right corner that says "Want to Join?" which lists the requirements to get your link on that page. One of them is to create a reciprocal link from your site back to theirs.
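For anyone who wants to double-check counts like that themselves rather than trusting a toolbar, a quick script can do it. This is just a minimal sketch, assuming the page is publicly fetchable and that the `requests` and `beautifulsoup4` packages are installed (the numbers will obviously drift as the page changes):

```python
# Minimal sketch: count total links and nofollowed links on a page.
# Assumes `requests` and `beautifulsoup4` are installed; the URL is the
# page discussed above and may have changed since this was written.
import requests
from bs4 import BeautifulSoup

url = "http://nytm.org/made-in-nyc"
html = requests.get(url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

links = soup.find_all("a", href=True)
nofollowed = [a for a in links if "nofollow" in (a.get("rel") or [])]

print(f"Total links:      {len(links)}")
print(f"Nofollowed links: {len(nofollowed)}")
print(f"Followed links:   {len(links) - len(nofollowed)}")
```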
I'm one of those white-hat SEOs who actually listens to Matt Cutts and follows the more recent guidance from Moz. This entire page basically goes against everything I've been reading over the past couple of years about how reciprocal links are bad, and how, if you're going to do them, you should use a nofollow tag. I've read that pages, or directories, like this are being penalized by Google, and possibly the websites linking to the page could be penalized as well. I've read that websites exactly like this have been getting deindexed in droves over the past couple of years.
My real question is: how is this page allowed to get away with this? And how is it rewarded with such a high PageRank? There's zero content aside from 618 links, all followed. Is this just a case of "Google hasn't gotten around to finding and penalizing this site yet," or am I naive for actually listening to and believing everything that comes out of Matt Cutts's videos?
-
I guess you're right, but does that mean Google wouldn't consider this a black-hat technique just because the link juice is divided among so many links? I thought it would actually be the opposite - that having only 5 or 10 links passing juice on a page would be okay, but something like 600 would be considered spam. I don't know, but perhaps Matt Cutts has said something about this specifically.
Regardless, have you, or has anyone here, heard the phrase, "If your intention is to gain rankings in Google, then it's black-hat"? Based on what I've gathered from Matt Cutts, basically anything you do with that intent - such as listing a bunch of links like this without nofollow tags and asking to trade links - is considered black-hat. If I'm wrong, please let me know.
But let's assume that everything you're saying is correct. How can we make the most of this situation? For example, I actually went to Open Site Explorer, looked at followed external links, and sorted them by Page Authority. This was the most powerful link going to the site (I believe I was researching Quirky.com), based on what Moz was telling me. If what you're saying is true, then shouldn't Moz's algorithm be updated to take the number of links on the page into account - perhaps by dividing the Page Authority by the number of links and giving us a new number based on that? That would probably be a much more accurate way of ranking pages by how powerful they are, or how much link juice they actually pass. Maybe there's a way to do that now and I'm just not aware of it. Do you have any strategies for this sort of thing - dividing link juice among the links on a page?
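To put the math I'm describing into concrete terms, here's the naive version of that division. This is not how Google or Moz actually calculate anything - it's just the back-of-the-envelope "authority divided by outbound links" heuristic from the discussion, with illustrative inputs:

```python
# Naive back-of-the-envelope estimate of per-link value, as described above.
# This is NOT how Google or Moz actually compute anything -- it's just the
# simple "authority divided by outbound links" heuristic from the discussion.

def estimated_link_value(page_authority: float, outbound_links: int) -> float:
    """Split a page's authority evenly across its outbound followed links."""
    if outbound_links <= 0:
        return 0.0
    return page_authority / outbound_links

# The page in question: PA ~78 (per Open Site Explorer) spread over ~618 links.
print(estimated_link_value(78, 618))   # ~0.126 "authority units" per link
# Compare to a hypothetical PA 40 page with only 10 outbound links:
print(estimated_link_value(40, 10))    # 4.0 -- far more value per link
```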
-
I would add that, in the best-case scenario, it's a PR7 page divided by 600+ links, so the actual page authority passed would be very small. Then consider that your link would be at the bottom of that list, so you would be getting even less, if anything.
-
Cyto,
Thank you again for another great response. You haven't put me off - quite the contrary. I really enjoy discussions like this because I work alone, as a one-man show, and I don't really get the opportunity to discuss SEO or online marketing with anyone, let alone any experts. So personally, I rely a lot on Matt Cutts, the info I get here at Moz, and other similar sites I subscribe to in my RSS feed. Of course I also have a Pro account here at Moz and use it a lot for all of my clients.
I personally feel like Matt Cutts is the only person who knows what he's talking about, and the only person to trust. However, I have heard an SEO say before, "The things Matt Cutts says are nice and all, but I rely more on the results I actually find rather than just doing what he says blindly." That makes sense, but I feel like that person was referring to doing black-hat stuff until he gets caught. Regardless, my trust is still with Matt Cutts.
You said in your post (and it may have been a typo, I don't know): "My gut feeling is that, Google won't penalise a website who is an internet company made in NYC and listed on a non-profit organization website with a nofollow link. It seems like a natural fit." If all of that were true and they were using nofollow links, this entire discussion would be rather pointless, because simply by using nofollow links on that page instead of 100% followed links, they would be in the clear as far as I'm concerned. I don't think there are any issues with trading links, anchor text, reciprocal links, etc., as long as they are nofollowed. But in this case they are not nofollowed. They are all followed links. And they are asking for anchor-text-optimized followed links. That is the key for me.
Now you may say to me: hey, it's an internal page with a directory, and it's a non-profit .org site - users may actually gain from this. However, Matt Cutts has said that any time you are doing things for the purpose of gaining rankings, it is considered black-hat. They can keep that directory, keep all the links, and still provide this unique benefit to their users. But in 2013, 2014, and beyond, Matt Cutts has said this sort of page should use nofollow links, because nofollow won't change anything at all for the user experience, but it WILL cut down on spam - people are attracted to that page largely because of the linking opportunity. If the links were nofollowed, I doubt nearly as many people would be excited about getting on that page. To me, this page exists primarily for SEO purposes, in that it gains backlinks from the people who want to be listed, and the user experience is actually secondary. What I've gathered from Google is that the user experience should come first, and the way to do that would be to nofollow all the links.
I am also aware that the algorithm doesn't necessarily take individual pages into account, but rather groups of pages with similar issues. For example, a page with a massive number of optimized-anchor-text links from PR1 or PR2 sites will be penalized, as we've seen from past updates. Other signals matter too: the text-to-HTML ratio should be at least 15% from what I've seen, and the number of links per page typically shouldn't be higher than 200. This page goes against all of that. The craziest part is that I would expect this page to be somewhere around PR3. But it's PR7. WHY. That is the question. Are they being penalized and simply overpowering the penalty to get there? Has Google in fact placed this website or page on some sort of "white-list" that isn't included in typical algorithm sweeps?
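For what it's worth, here's a rough sketch of how a text-to-HTML ratio check could be scripted. The 15% figure is just a rule of thumb I've picked up, not anything Google has published, and the script assumes the `requests` and `beautifulsoup4` packages are installed:

```python
# Rough sketch of the "text-to-HTML ratio" check mentioned above. The 15%
# threshold is just a rule of thumb from the discussion, not anything Google
# has published. Assumes `requests` and `beautifulsoup4` are installed.
import requests
from bs4 import BeautifulSoup

def text_to_html_ratio(url: str) -> float:
    """Visible text length divided by total HTML length, as a percentage."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style"]):   # strip non-visible content
        tag.decompose()
    visible_text = soup.get_text(separator=" ", strip=True)
    return 100.0 * len(visible_text) / max(len(html), 1)

ratio = text_to_html_ratio("http://nytm.org/made-in-nyc")
print(f"Text-to-HTML ratio: {ratio:.1f}%  (rule of thumb above: aim for 15%+)")
```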
I'm actually at the point where I think I'm going to send Matt Cutts an e-mail and see if he responds. In the meantime, I would love to keep this discussion going! Cyto, I would love to hear another response, and if anyone else has anything to add - any other thoughts or theories (OR EXPERIENCE WITH THIS EXACT SORT OF THING) - kindly add to this discussion! Thanks!
-
I don't think there ever is an individual who knows the right answers to everything when it comes to SEO. We're all exploring ideas, learning and sharing knowledge of our own findings and research.
Let's step outside the SEO world, throw away our knowledge and look at the website. Would you say it is a website your client should be on? If your client is indeed an internet company made in NYC, shouldn't they be mentioned by NY Tech Meetup? From this perspective, I would say yes.
My gut feeling is that, Google won't penalise a website who is an internet company made in NYC and listed on a non-profit organization website with a nofollow link. It seems like a natural fit.
Second, looking at Open Site Explorer, the page has a Page Authority of 78/100 and 174 linking root domains, including some big powerhouses like The Guardian, Bloomberg and Forbes (I didn't see any nofollow). I definitely think these are helping. I remember working on a client's webpage once: we optimised the page with rich content and clear calls to action and it was ranking on page 2; we got two hyperlinks from the BBC and another high-authority website, and two weeks later, "boom", we were ranking on page 1, position 4.
Now let's explore the "black hat" technique. The core one would be requesting reciprocal links with the anchor text "Made in NYC" hyperlinked.
You are right, that is "black hat" if I saw someone else do it, but in this scenario, I would go "that's fine". It all depends on the situation.
- A non-profit organisation focused on supporting new york technology community
- The webpage is specific to one and one thing only, listing internet companies made in NYC
- Their selection criteria focuses on active sites with 10k+ visits and ones solely made in NYC
You see, if I were to move this whole concept to a real-world scenario, where NYTM was a shop and they had a book listing other shops built in NYC, would you penalize them? "You, sir, should not list such shops, nor should other shops say you have a list of NYC-built shops!"
In all honesty, I don't see what they are doing as a big no-no. I think things should be looked at case by case, rather than clustering everyone into a single group.
You asked why you can't create a .org page similar to this - I say, why not? Note that their directory page isn't the core of what they are. It's just a single page of their entire entity, and I think that plays a big part in their strength, reaffirming their presence on the web.
I realize you are frustrated, and all of us have our own thoughts. My thinking has always been to compare things to a real-life scenario and focus more on creating great content that others will link to, rather than chasing links myself. Sure, they might all use nofollow, but they clicked through wanting to see my page, and I'll let my rich content, site design and clear call to action turn them into a returning visitor.
Don't fret, my friend. In a weird way, this is the perfect board to vent and hear everyone's thoughts and ideas. I hope my thoughts haven't put you off.
-
Cyto,
I like your thinking on this one. This is where I was trying to go with it. But still, you asked many of the same questions that I asked. I realize we won't have a solid answer unless Matt Cutts himself speaks on this specific issue. However, I'm still left with unanswered questions. Here are a few points that are left standing:
- I realize there are billions, if not trillions, of websites and pages in existence. However, there are not billions of pages sitting at a PageRank of 7. You can try to disregard their PageRank and tell me how it's going to be deprecated soon, or it's not accurate, or whatever. But regardless, they got that page to a PR7. If you think that doesn't matter, I'd like to see you try to get your page to PR7 and tell me how long it takes. What I'm saying is that I don't think they magically got to PR7 overnight, and I don't think Google has missed this site. There are only so many PR9s, PR8s, and PR7s out there. What are the chances that Google completely missed AND messed up on the PageRank for this site? The only other explanation I have is that they were white-hat for a long time, and when they got to PR7 they flipped to this black-hat type of page. But I doubt that's the case. They're either still benefiting from black-hat techniques, OR we are misjudging this site and Google actually does think it deserves a PR7.
- Try thinking about it like this: yes, this page is practicing many things that are straight-up black-hat, things that Matt Cutts has publicly and openly said are considered spam - simple things like the text-to-HTML ratio, the number of links per page, asking to trade links, and having a massive number of links without nofollowing any of them. What if Google saw this page, decided it was black-hat, and penalized it? But what if all the sites listed there are linking back to this page, and the link juice from those reciprocal links is so much stronger than the penalty that it effectively overpowers it, bringing them to a net PR7? The question here is: can you overpower Google's penalty with more bad backlinks?
- Looking deeper into the whole .org/non-profit angle - maybe Google does like these types of pages and we're all just wrongly assuming things. In that case, I agree with Cyto: this page could be unique and it does benefit the user. However, isn't this the exact scenario where Matt Cutts has told us to implement a nofollow tag? I believe he has said repeatedly, if you must link to another site and you're not sure about it, just put on a nofollow tag. If you have reciprocal links, no need to get rid of them, just nofollow them. It's this sort of thing that's keeping me from fully accepting that this is a good page and Google likes it. And IF Google does like this page, and the PR7 is deserved, and the followed links are fine, then I SHOULD try to get my client a link on this page. But I suppose there is a risk, because we won't know for sure unless Matt Cutts says so.
- Diving deeper into the "Google may like this sort of page" idea for the reasons you stated, it somewhat contradicts what Matt Cutts has already said. For example, if I put a link in a press release back to my homepage, there is some value in that link to the user, because it makes it easier to visit that page by clicking instead of typing in the URL. Yet press release links have now been nofollowed across the internet. You can use the same "it creates value" excuse for any link, but Google is telling us to nofollow these links. Especially when talking about directories specifically, I have read that Google is shutting these sites down completely. However, we are left wondering if this specific site is on some sort of "white-list". In that case, the first person to create a "directory of white-listed directories of followed links" will surely be quite successful with that page.
- What is stopping me from creating a .org page similar to this? Why can't I build a page up to PR7 and openly exchange links with people? The biggest thing stopping me from even considering it is the assumption that this only worked 5+ years ago. Regardless, I have a client who sells a few unique products, and one of their competitors is Quirky.com, which led me to this page because they have a backlink from it. The problem I'm seeing is that Quirky.com is benefiting from a link on this page, while I'm worried about joining it due to a potential penalty. Quirky doesn't really have to worry about anything, because they have so many links and they're established. But if I want to get the same link as them, I have to worry. This is the sort of thing that makes it hard to compete with the big players. Not that I think this client is on par with them, but I just get the feeling that they're allowed to do more than we are. Perhaps I'm wrong, but it's the feeling I have.
- It's getting harder and harder for me to find white-hat followed link opportunities. It seems like everywhere I go, the link is going to be nofollowed. Other people's websites? They want to nofollow the link. Guest blog posts? They want to nofollow the link. Press releases are all nofollowed now. Either the link is nofollowed, or you risk a penalty on a followed link. This is the corner I feel I'm getting pushed into.
- I learned a while back from an SEO that links are the most powerful form of currency in the SEO world. A link is the number one way to move up in the rankings, because it is basically a signal saying "this site is trustworthy and worthwhile to check out," and Google adds those signals up to decide a site is worthy of higher rankings. That all makes sense to me, and I haven't seen anything to tell me otherwise - if I'm wrong and I missed something, let me know. I mean, it's great to put out unique content and all that, but what is the point of a guest post or a press release if there's no indication that you wrote it or that it has anything to do with your site? What is it worth at that point if there is no link included? I understand the organic side, that some people may literally read it and visit your site off the back of it, but that's an inefficient way of doing things. I'm down with "link earning," but only if I can actually earn a followed link. What's the point of a link-earning process if you don't earn the link, know what I mean? It just seems like everything is going this way: nofollow the links, or worry about a penalty. And before you say it, I am aware that less than 20% of all links are nofollowed, but still, this is the feeling I'm getting. (That number may be higher now that all press release links are nofollowed - not sure.)
- I'm really not trying to do anything black-hat. I'm trying to do white-hat stuff here, but with the purpose of accelerating my client's climb up the rankings. Listen, I'm doing all the other stuff well; it's just that this whole link-building/earning aspect is tough, and it seems like 2014 is going to be much harder than previous years.
What are your thoughts on these points?
-
Cyto, that is one of the best analyses that I have seen in a long time.
Thumbs up!
-
Maybe it all has to do with the site itself and the "human" approach Google is taking.
- The site is a non-profit organisation supporting the New York technology community. Domain is .org
- Let's review the page. In all reality, such a page is useful - it shows internet companies made in NYC, from a non-profit organisation. What other format would you use? Sure, it only has links and looks spammy, but it isn't a spam page. It has a purpose and exists in a format that is acceptable to its users - plus, it isn't asking for money. If you were to look at this with a human eye, you would go "this isn't spam".
- Taking the human approach again: if my friend was looking for a collection of internet companies in NYC and he had a research pile with 10 documents, and one of them was a handwritten version of this page, shouldn't it be there? Or should it be filed under his research pile of 500 documents? Maybe we need to see Google differently; it's more complex than an "if statement".
- Maybe reciprocal links to non-profit organisations are viewed differently. A good Samaritan would go "yes" and maybe Google is taking a human approach and going "let me help you as you are a good non-profit organisation"
- Now for the nofollow aspect. This is a technical element, and I agree, shouldn't it exist? But again, maybe, just maybe, Google is seeing the site and going "I support such organisations", and the boost this gives to other sites isn't so bad.
- Let's take Wikipedia. Would Google punish Wikipedia if they didn't use nofollow? Was the introduction of nofollow done because of Google, or was it a decision Wikipedia made? Maybe it was Wikipedia who noticed users abusing their site and thus introduced the nofollow.
What I'm trying to say is that maybe Google is evolving to be more complex and human-like, applying a kind of moral judgment to its decisions.
On the flip side, maybe it's just software that hasn't yet noticed this site among the millions of websites out there, and in time it will catch them.
-
I agree with Tim. Moz's PA/DA is based purely on links and does not consider spam pages, and Google's PR (the score it shows to the public) is very unreliable.
Google may have missed those links, or maybe simply devalued them; we just don't know. But what we do know is that it's the type of practice Google is trying to stop, so someday they might do a big Penguin update and start penalizing sites with backlinks like that. My own rule would be: if it's easy for humans to spot, then it's either easy for the algorithm to spot, or someday the algorithm will spot it.
The Yahoo directory is different, as you pay to be considered for their list, so it's not a direct payment for a link. But I think because Yahoo is a trusted site they get away with it; I don't think a no-name directory would get away with the same trick. I would also question how good the Yahoo directory really is (some people think it's worth the money, others think it's not).
-
One thing to consider is that just because Open Site Explorer found the link and assigned it a high PA/DA, and the PageRank toolbar shows a high PR score, doesn't mean that Google is actually passing link strength through those links. I wouldn't look at a site's PageRank as a definitive answer to Google's thoughts on the site's quality.
Page Authority and PageRank are essentially just equations that consider the quantity and strength of links pointing to a page, but they don't say anything about the spamminess of a page's content. It's possible that Google has already noticed the hundreds of followed links coming out of this site, marked the page as spam and devalued all those links. All of that could be done behind the scenes without any changes to the site's PageRank and without any public notice in regards to the site's spamminess.
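To make that point concrete, here is the textbook PageRank calculation run on a tiny toy link graph. It is purely a function of who links to whom - there is no notion of "spam" anywhere in the math - which is exactly Tim's point: the score only reflects the link graph it is fed. This is the classic simplified formulation, not Google's production algorithm, and the graph below is entirely made up.

```python
# Textbook PageRank on a toy link graph -- purely a function of who links to
# whom, with no notion of "spam" anywhere in the math. Illustrative only;
# Google's production system layers many other signals on top.

def pagerank(graph: dict, damping: float = 0.85, iterations: int = 50) -> dict:
    """graph maps each page to the list of pages it links out to."""
    pages = list(graph)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in graph.items():
            if not outlinks:
                continue  # dangling page: its rank simply evaporates in this toy
            share = damping * rank[page] / len(outlinks)  # split evenly over outlinks
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

# A directory page linked by many sites, which links back out to all of them.
toy_graph = {
    "directory": ["site_a", "site_b", "site_c"],
    "site_a": ["directory"],
    "site_b": ["directory"],
    "site_c": ["directory"],
}
for page, score in sorted(pagerank(toy_graph).items(), key=lambda kv: -kv[1]):
    print(f"{page:10s} {score:.3f}")
```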
With that said, it's also possible that Google hasn't caught on to the spam tactics of this site and those links do still have value. The internet is a huge place and as we all know, Google is definitely capable of letting spammy sites slip through the cracks. Unfortunately, Google will probably never pull back the curtain for us, so we won't know how they treat cases like this.
As for whether you should get the link for your client: I would lean towards no, but wouldn't completely rule it out. There is some risk involved, but if your client already has a decent link profile, one link from a spammy site probably won't hurt them, and if Google hasn't devalued those links, it could potentially help them.
If you do choose to go after the link, I would first have an open discussion with your client about the potential risks and rewards. If they want to be really aggressive, they might want you to go for it. If they'd rather play it safe, probably better to pass.
Tim
-
So what I've gathered so far is that you're saying this page is indeed considered "black-hat" for the reasons mentioned above, that eventually the site, as well as possibly the sites listed on it, could receive a penalty, and that I should stick to white-hat strategies.
But let's take this a step further. We are simply assuming it is a "black-hat" page based on things we've heard and accepted as fact. But what if Google and Matt Cutts actually see this page differently, perhaps even as "white-hat"? You may then ask, how could they see it as white-hat if it's breaking all these other rules? At that point I would look at something like the Yahoo directory - the one where you pay $300 per year to get one backlink. I feel like these two sites are fairly similar and breaking similar rules.
So for some reason, Google likes the Yahoo directory and lets them do what they want. Perhaps they are also putting this webpage in the same boat as the Yahoo directory. Matt Cutts' excuse may be something like, "Well, this page has been around for a while, and it's actually quite a unique page in that there are no other directories like this on the internet, and therefore is actually providing a benefit to the user in the form of a directory of startups in New York." But then I would ask, why not nofollow all the links, because isn't that the whole point of the nofollow tag?
He also may say something like, they have strict guidelines and rules for allowing people into the directory. After all, it's not open to anyone - only startups in NYC that get 10k+ visitors per month to their site. But still, why not nofollow the links? They are also blatantly asking to trade links. And after all that, they still have a PR7?
The next question is: should I try to get my client on this list too, so that he may benefit from the directory and the PR7 backlink? I'm thinking twice about it because I don't want to wake up one day and be penalized because I have a link in that directory and a reciprocal link going back to that site. If you ask me, this page is a perfect example of what Google doesn't want, and yet Google is rewarding them. So I'm not sure if perhaps I'm the one who is wrong, and Google actually likes this site because it may be unique to some extent.
I would love to hear some sort of official response on this from someone at Moz, as well as from anyone else here who is familiar with this sort of situation, or has had any success with pages like this that we are assuming are black-hat. I would love to have someone from Moz actually visit that page and give me their analysis on whether the page is breaking any rules, and why it has the PageRank it does.
-
Thanks for the laugh, Gagan. That is a really funny quote from Buffett.
I am going to go make a page like this just so my competitors will get their panties in a wad.
-
Hello Trenton,
There are many millions of sites on the web right now, and trillions upon trillions of pages.
If a site is doing the wrong thing and you are doing the right thing, patience pays off. Sooner or later, you will see the rewards come your way.
Warren Buffett has a famous quote (though all his quotes are famous): "No matter how great your efforts in business, some things just take time. You can't produce a baby in one month by making nine women pregnant."
So my advice is to ignore that poor strategy adopted by the competition, or even by that site; sooner or later it will hit a dead end. If you still want it to happen faster,
you may submit an anti-spam report to Google about that web page: https://www.google.com/webmasters/tools/spamreportform?hl=en&spamurl=https%3A%2F%2Fwww.google.co.in%2Fwebhp%3Fsourceid%3Dchrome-instant%26espv%3D210%26ie%3DUTF-8
-
You have started to learn SEO, and you will find frustrating things like this again and again, where people doing things against Google's rules get better results than those doing things legitimately!
But don't get frustrated; stick with what you are doing and learning. Websites that do tricky things are not going to last long.
You will see such websites tanked very, very shortly by Penguin (3 or 3.1 :-))!
Regards
-
I am totally flustered by this example and others like it. On most of my clients' projects I see gray- and black-hat SEO sites leading the SERPs. I try to follow best-practice advice and keep a clean shop. I can't wait to get more info on this.