How/why is this page allowed to get away with this?
-
I was doing some research on a competitor's backlinks in Open Site Explorer and noticed that their most powerful link was coming from this page: http://nytm.org/made-in-nyc. I visited it and found that the page, which carries a PageRank of 7, is just a long list of followed links. That's literally all that's on the entire page - 618 links. Zero nofollow tags. PR7. On top of that, there's a link in the top right corner that says "Want to Join?" which lays out the requirements to get your link on that page. One of these is to create a reciprocal link from your site back to theirs.
I'm one of those white-hat SEOs who actually listens to Matt Cutts and the more recent advice from Moz. This entire page basically goes against everything I've been reading over the past couple of years about how reciprocal links are bad, and how, if you're going to do it, you should use a nofollow tag. I've read that pages, or directories, like this are being penalized by Google, and that the websites linking to the page could possibly be penalized as well. I've read that websites exactly like this have been getting deindexed in bunches over the past couple of years.
My real question is: how is this page allowed to get away with this? And how is it rewarded with such a high PageRank? There's zero content aside from 618 links, all followed. Is this just a case of "Google just hasn't gotten around to finding and penalizing this site yet," or am I naive for actually listening to and believing everything that comes out of Matt Cutts' videos?
-
I guess you're right, but does that mean Google wouldn't consider this a black-hat technique just because the link juice is divided among so many links? I thought it would actually be the opposite: that having only 5 or 10 links passing juice on a page would be okay, but something like 600 would be considered spam. I don't know, but perhaps Matt Cutts has said something about this specifically.
Regardless, have you, or has anyone here, heard the phrase, "If your intention is to gain rankings in Google, then it's black-hat"? Based on what I've gathered from Matt Cutts, basically anything you do for that purpose, such as listing a bunch of followed links like this and asking to trade links, is considered black-hat. If I'm wrong, please let me know.
But let's assume that everything you're saying is correct. How can we make the most of this situation? For example, I went to Open Site Explorer, filtered to followed external links, and sorted them by Page Authority. Based on what Moz was telling me, this was the most powerful link pointing to the site (I believe I was researching Quirky.com). If what you're saying is true, shouldn't Moz's algorithm be updated to take into account the number of links on the linking page, perhaps dividing the Page Authority by that link count to give us a new number? That would probably be a much more accurate way of ranking pages by how much link juice they actually pass. Maybe there's a way to do that now and I'm just not aware of it. Do you have any strategies for this sort of thing, dividing link juice among the number of links on a page?
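Something like this back-of-the-envelope calculation is what I have in mind; a minimal sketch in Python with made-up numbers, not an actual Moz metric or feature:

```python
# Hypothetical link prospects: (url, Page Authority, followed outbound links).
# The figures are illustrative, not pulled from Open Site Explorer.
prospects = [
    ("http://nytm.org/made-in-nyc", 78, 618),
    ("http://example-blog.com/resources", 45, 20),
    ("http://example-news.com/roundup", 60, 75),
]

def per_link_value(page_authority, outbound_links):
    """Naive estimate: split the page's authority evenly across its followed links."""
    return page_authority / max(outbound_links, 1)

# Re-rank prospects by estimated value passed per link instead of raw PA.
for url, pa, links in sorted(prospects, key=lambda p: per_link_value(p[1], p[2]), reverse=True):
    print(f"{url}: PA {pa}, {links} links, ~{per_link_value(pa, links):.2f} per link")
```

Under that math, a hypothetical PA 45 page with 20 links would beat the PA 78 page with 618 links by a wide margin.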
-
I would add that, in the best-case scenario, it's a PR7 page divided among 600+ links, so the actual page authority passed through any one of them would be very small. Then consider that your link would be at the bottom of that list, so you would be getting even less, if any.
-
Cyto,
Thank you again for another great response. You haven't put me off; quite the contrary. I really enjoy discussions like this because I work alone, as a one-man show, and I don't get the opportunity to discuss SEO or online marketing with anyone, let alone any experts. So personally, I rely a lot on Matt Cutts, the info I get here at Moz, and other similar sites I subscribe to in my RSS feed. Of course I also have a Pro account here at Moz and use it a lot for all of my clients.
I personally feel like Matt Cutts is the only person who knows what he's talking about, and the only person to trust. However, I have heard an SEO say, "The things Matt Cutts says are nice and all, but I rely more on the results I actually find rather than just doing what he says blindly." That makes sense, but I feel like that person was referring to doing black-hat stuff until he gets caught. Regardless, my trust is still with Matt Cutts.
You said in your post (and it may have been a typo, I don't know): "My gut feeling is that Google won't penalise a website that is an internet company made in NYC and listed on a non-profit organization's website with a nofollow link. It seems like a natural fit." If all of that were true and they were using nofollow links, this entire discussion would be rather pointless, because simply by using nofollow links on that page instead of 100% followed links, they would be in the clear as far as I'm concerned. I don't think there are any issues with trading links, anchor text, reciprocal links, etc., as long as they are nofollowed. But in this case they are not nofollowed. They are all followed links. And they are asking for anchor-text-optimized followed links. This is the key for me.
Now you may say to me: hey, it's an internal page with a directory, and it's a non-profit .org site, so users may actually gain from this. However, Matt Cutts has said that any time you are doing things for the purpose of gaining rankings, it is considered black-hat. They can keep that directory, keep all the links, and still provide this unique benefit to their users. But in 2013, 2014, and beyond, Matt Cutts has said this sort of page should use nofollow links, because it won't change anything at all for the user experience, but it WILL cut down on spam, since a big part of the attraction of that page is the linking opportunity. If the links were nofollowed, I doubt nearly as many people would be excited about getting on that page. To me, this page exists primarily for SEO purposes, in that the page gains backlinks from the people who want to be listed, and the user experience is actually secondary. What I've gathered from Google is that the user experience should come first, and the way to do that here would be to nofollow all the links.
I am also aware that the algorithm doesn't necessarily take individual pages into account, but rather groups of pages with similar issues. For example, a page with a massive number of links with optimized anchor text from PR1 or PR2 sites will be penalized, as we've seen from past updates. There are other rules of thumb too: the text/HTML ratio should be at least 15% from what I've seen, and the number of links per page typically shouldn't be higher than 200. This page goes against all of that. The craziest part is that I would expect this page to be somewhere around PR3, but it's PR7. WHY. That is the question. Are they being penalized and simply overpowering the penalty to get there? Has Google placed this website or page on some sort of "white-list" that is excluded from the typical algorithm updates?
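Just to show the kind of check I'm applying here; a quick, hypothetical audit script in Python (using requests and BeautifulSoup) against those two rules of thumb, which are community heuristics rather than anything Google has published:

```python
# Hypothetical page audit against two rough heuristics mentioned above:
# a text/HTML ratio of at least ~15% and no more than ~200 links per page.
import requests
from bs4 import BeautifulSoup

def audit_page(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    # Ratio of visible text length to total HTML source length.
    visible_text = soup.get_text(separator=" ", strip=True)
    text_ratio = len(visible_text) / max(len(html), 1)

    # Count anchors and how many of them carry rel="nofollow".
    links = soup.find_all("a", href=True)
    nofollow = [a for a in links if "nofollow" in (a.get("rel") or [])]

    print(f"text/HTML ratio: {text_ratio:.1%} (heuristic floor ~15%)")
    print(f"total links: {len(links)} (heuristic ceiling ~200)")
    print(f"nofollow links: {len(nofollow)}")

audit_page("http://nytm.org/made-in-nyc")
```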
I'm actually to the point where I think I'm going to send Matt Cutts an e-mail and see if he responds. In the meantime, I would love to keep this discussion going! Cyto, I would love to hear another response, and if anyone else has anything to add, or any other thoughts or theories (OR EXPERIENCE WITH THIS EXACT SORT OF THING), kindly add to this discussion! Thanks!
-
I don't think there ever is an individual who knows the right answers to everything when it comes to SEO. We're all exploring ideas, learning and sharing knowledge of our own findings and research.
Let's step outside the SEO world, throw away our knowledge, and look at the website. Would you say it is a website your client should be on? If your client is indeed an internet company made in NYC, shouldn't they be mentioned by NY Tech Meetup? From this perspective, I would say yes.
My gut feeling is that Google won't penalise a website that is an internet company made in NYC and listed on a non-profit organization's website with a nofollow link. It seems like a natural fit.
Second, looking at Open Site Explorer, the page has a Page Authority of 78/100 and 174 linking root domains, including some big powerhouses like The Guardian, Bloomberg, and Forbes (I didn't see any nofollow). I definitely think these are helping. I remember working on a client's webpage once: we optimised the page with rich content and clear calls to action and it was ranking on page 2; we got 2 hyperlinks from the BBC and another high-authority website, and two weeks later, "boom", we were ranking on page 1, position 4.
Now let's explore the "black hat" technique. The core one would be the requesting of reciprocal links with the anchor text "Made in NYC" hyperlinked.
You are right, that is "black hat" if I saw someone else do it, but in this scenario, I would go "that's fine". It all depends on the situation.
- A non-profit organisation focused on supporting the New York technology community
- The webpage is specific to one thing and one thing only: listing internet companies made in NYC
- Their selection criteria focus on active sites with 10k+ visits that are solely made in NYC
You see, if I were to move this whole concept to a real-world scenario, where NYTM was a shop and they had a book listing other shops built in NYC, would you penalize them? "You, sir, should not list such shops, nor should other shops say you have a list of NYC-built shops!"
In all honesty, I don't see what they are doing as a big no-no. I think things should be looked at on a case-by-case basis, rather than clustering everyone into a single group.
You asked why you can't create a .org page similar to this. I say: why not? Note that their directory page isn't the core of what they are. It is just a single page of their entire entity, and I think that plays a big part in the strength of their presence on the web.
I realize you are frustrated, and all of us have our own thoughts. My thinking has always been to compare things to a real-life scenario and focus more on creating great content that others will link to, rather than chase links myself. Sure, they might all use nofollow, but those visitors clicked through wanting to see my page, and I'll let my rich content, site design, and clear calls to action turn them into returning visitors.
Don't fret, my friend. In a weird way, this is the perfect board to vent and hear everyone's thoughts and ideas. I hope my thoughts haven't put you off.
-
Cyto,
I like your thinking on this one. This is where I was trying to go with it. But still, you asked many of the same questions that I asked. I realize we won't have a solid answer unless Matt Cutts himself speaks on this specific issue. However, I'm still left with unanswered questions. Here are a few points that are left standing:
- I realize there are billions, if not trillions, of websites and pages in existence. However, there are not billions of pages at a PageRank of 7. You can try to disregard their PageRank and tell me it's going to be deprecated soon, or that it's not accurate, or whatever. Regardless, they got that page to PR7. If you think that doesn't matter, I'd like to see you try to get your own page to PR7 and tell me how long it took. What I'm saying is, I don't think they magically got to PR7 overnight, and I don't think Google has simply missed this site. There are only so many PR9s, PR8s, and PR7s out there. What are the chances that Google completely missed this site AND messed up its PageRank? The only other explanation I have is that they were white-hat for a long time, and once they got to PR7 they flipped to this black-hat type of page. But I doubt that's the case. They're either still benefiting from black-hat techniques, OR we are misjudging this site and Google actually does think it deserves a PR7.
- Try thinking about it like this: yes, this page is practicing many things that are straight-up black-hat, things that Matt Cutts has publicly and openly said are considered spam: simple things like the text/HTML ratio, the number of links per page, asking to trade links, or carrying a massive number of links without nofollowing any of them. What if Google saw this page, decided it was black-hat, and penalized it? Now assume the page is penalized, but all the sites listed on it are linking back to it. Could the link juice from all those pages pointing back be so much stronger than the penalty that it effectively overpowers it, leaving them at a net PR7? The question here is: can you overpower Google's penalization with more bad backlinks?
- Looking deeper into the whole .org/non-profit angle, maybe Google does like these types of pages and we're all just assuming wrongly. In that case, I agree with Cyto: this page could be unique and it does benefit the user. However, isn't this the exact scenario in which Matt Cutts has told us to implement a nofollow tag? I believe he has said repeatedly: if you must link to another site and you're not sure about it, just put on a nofollow tag. If you have reciprocal links, there's no need to get rid of them, just nofollow them. It's this sort of thing that is keeping me from fully accepting that this is a good page and that Google likes it. And IF Google does like this page, and the PR7 is deserved, and the followed links are fine, then I SHOULD try to get my client a link on this page. But I suppose there is a risk, because we won't know 100% for sure unless Matt Cutts says so.
- Diving deeper into the "Google may like this sort of page" idea for the reasons you stated, it somewhat contradicts what Matt Cutts has already said. For example, if I put a link in a press release back to my homepage, there is some value in that link to the user, because it's easier to click than to type in the URL. Yet press release links have now been nofollowed across the internet. You can use that same "it creates value" argument for any link, but Google is telling us to nofollow these links. With directories specifically, I have read that Google is shutting these sites down completely. So we are left wondering whether this specific site is on some sort of "white-list". In that case, the first person to create a "directory of white-listed directories of followed links" will, I'm sure, be quite successful with that page.
- What is stopping me from creating a .org page similar to this? Why can't I build a page up to PR7 and openly exchange links with people? The biggest thing stopping me from even considering it is the assumption that this only worked 5+ years ago. Regardless, I have a client who sells a few unique products, and one of their competitors is Quirky.com, which is how I found this page in the first place: they have a backlink from it. The problem I'm seeing is that Quirky.com is benefiting from a link on this page, while I'm worried about joining it because of a potential penalty. Quirky doesn't really have to worry, because they have so many links and they're established. But if I wanted to get the same link for my client, I do have to worry. This is the sort of thing that makes it hard to compete with the big players. Not that I think this client is on par with them, but I just get the feeling that they're allowed to do more than we are. Perhaps I'm wrong, but it's the feeling I have.
- It's getting harder and harder for me to find white-hat followed link opportunities. It seems like everywhere I go, the link is going to be nofollowed. Other people's websites? They want to nofollow the link. Guest blog posts? They want to nofollow the link. Press releases are all nofollowed now. Either the link is nofollowed, or you risk a penalty on a followed link. This is the corner I feel I'm getting pushed into.
- I learned a while back from an SEO that links are the most powerful form of currency in the SEO world. A link is the single most powerful way to move up in the rankings, because it is basically a signal saying "this site is trustworthy and worth checking out," and Google aggregates those signals into higher rankings. That all makes sense to me, and I haven't seen anything to tell me otherwise; if I'm wrong and I missed something, let me know. I mean, it's great to put out unique content and all that, but what is the point of a guest post or a press release if there's no indication that you wrote it or that it has anything to do with your site? What is it worth at that point, with no link included? I understand the organic side, that some people may literally read it and visit your site off of that, but that's an inefficient way of doing things. I'm down with "link earning," but only if I can actually earn a followed link. What's the point of a link-earning process if you don't earn the link, you know what I mean? It just seems like everything is going this way: either the links are nofollowed, or you have to worry about a penalty. And before you say it, I am aware that fewer than 20% of all links are nofollowed, but still, this is the feeling I'm getting. (That number may be higher now that all press release links are nofollowed, I'm not sure.)
- I'm really not trying to do anything black-hat. I'm trying to do white-hat stuff here, but with the purpose of accelerating my client's climb up the rankings. Listen, I'm doing all the other stuff well; it's just that this whole link-building/earning aspect is tough, and it seems like 2014 is going to be much harder than previous years.
What are your thoughts on these points?
-
Cyto, that is one of the best analyses that I have seen in a long time.
Thumbs up!
-
Maybe it all has to do with the site itself and the "human" approach Google is taking.
- The site is a non-profit organisation supporting the New York technology community, and the domain is .org.
- Let's review the page. In all reality, such a page is useful: it shows internet companies made in NYC, published by a non-profit organisation. What other format would you use? Sure, it only has links and looks spammy, but it isn't a spam page. It has a purpose and exists in a format that is acceptable to its users; plus, it isn't asking for money. If you looked at this with a human eye, you would go "this isn't spam".
- Taking the human approach again: if my friend was looking for a collection of internet companies in NYC and he had a research pile of 10 documents, and one of them was a handwritten version of this page, shouldn't it be there? Or should it be filed in his research pile of 500 documents? Maybe we need to see Google differently; it's more complex than an "if statement".
- Maybe reciprocal links to non-profit organisations are viewed differently. A good Samaritan would say "yes," and maybe Google is taking a human approach and saying "let me help you, as you are a good non-profit organisation".
- Now the nofollow aspect. This is a technical element, and I agree: shouldn't it be there? But again, maybe, just maybe, Google looks at the site and thinks "I support such organisations," and the boost this can give to other sites isn't so bad.
- Let's take Wikipedia. Would Google punish Wikipedia if they didn't use nofollow? Was the introduction of nofollow driven by Google, or was it a decision Wikipedia made? Maybe it was Wikipedia that noticed users abusing their site and introduced nofollow on their own.
What I'm trying to say is that maybe Google is evolving to be more complex and human-like, applying a kind of moral judgment to its decisions.
On the flip side, maybe it's just software that has so far failed to notice this site amongst the millions of websites out there, and in time it will catch them.
-
I agree with Tim. Moz's PA/DA is based purely on links and does not consider spam pages, and Google's PR (the value it shows to the public) is very unreliable.
Google may have missed those links, or maybe it has simply devalued them; we just don't know. But what we do know is that it's the type of practice Google is trying to stop, so someday they might roll out a big Penguin update and start penalizing sites with backlinks like that. My own rule is: if it's easy for humans to spot, then it's either easy for the algorithm to spot, or someday the algorithm will spot it.
The Yahoo directory is different, as you pay to be considered for their list, so it's not directly paying for a link. But I think because Yahoo is a trusted site, they get away with it; I don't think a no-name directory would get away with the same trick. I would also question how good a Yahoo directory listing really is (some people think it's worth the money, others think it's not).
-
One thing to consider is that just because Open Site Explorer found the link and assigned it a high PA/DA, and the PageRank toolbar shows a high PR score, doesn't mean that Google is actually passing link strength through those links. I wouldn't treat a site's PageRank as a definitive answer on Google's view of the site's quality.
Page Authority and PageRank are essentially just equations that consider the quantity and strength of links pointing to a page, but they say nothing about the spamminess of a page's content. It's possible that Google has already noticed the hundreds of followed links coming out of this site, marked the page as spam, and devalued all those links. All of that could happen behind the scenes, without any change to the site's PageRank and without any public notice regarding the site's spamminess.
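To illustrate that these scores are just link-graph math; here is a toy, simplified PageRank-style iteration in Python on a hypothetical four-page graph (nothing in the calculation knows or cares whether a page is spammy):

```python
# Toy PageRank-style power iteration on a made-up link graph.
damping = 0.85
graph = {  # page -> pages it links to
    "directory": ["site_a", "site_b", "site_c"],
    "site_a": ["directory"],
    "site_b": ["directory"],
    "site_c": ["directory"],
}

ranks = {page: 1.0 / len(graph) for page in graph}
for _ in range(50):  # iterate until the scores settle
    new_ranks = {}
    for page in graph:
        # Sum the share of rank flowing in from every page that links here.
        inbound = sum(ranks[p] / len(graph[p]) for p in graph if page in graph[p])
        new_ranks[page] = (1 - damping) / len(graph) + damping * inbound
    ranks = new_ranks

for page, score in sorted(ranks.items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```

In this toy graph the "directory" page ends up with by far the highest score simply because everything links back to it, which is exactly the reciprocal-link pattern being discussed here.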
With that said, it's also possible that Google hasn't caught on to the spam tactics of this site and those links do still have value. The internet is a huge place and as we all know, Google is definitely capable of letting spammy sites slip through the cracks. Unfortunately, Google will probably never pull back the curtain for us, so we won't know how they treat cases like this.
As for whether you should get the link for your client: I would lean towards no, but I wouldn't completely rule it out. There is some risk involved, but if your client already has a decent link profile, one link from a spammy site probably won't hurt them, and if Google hasn't devalued those links, it could potentially help them.
If you do choose to go after the link, I would first have an open discussion with your client about the potential risks and rewards. If they want to be really aggressive, they might want you to go for it. If they'd rather play it safe, probably better to pass.
Tim
-
So what I've gathered so far is that you're saying this page is indeed considered "black-hat" for the reasons mentioned above, that eventually the site, and possibly the sites listed on it, could all receive a penalty, and that I should stick to white-hat strategies.
But let's take this a step further. We are simply assuming it is a "black-hat" page based on things we've heard and accepted as fact. What if Google and Matt Cutts actually see this page differently, perhaps even as "white-hat"? You might then ask how they could see it as white-hat if it's breaking all these other rules. At that point I would look at something like the Yahoo directory, the one where you pay $300 per year to get one backlink. I feel like these two sites are fairly similar and are breaking similar rules.
So for some reason, Google likes the Yahoo directory and lets them do what they want. Perhaps they are also putting this webpage in the same boat as the Yahoo directory. Matt Cutts' explanation might be something like, "Well, this page has been around for a while, it's quite a unique page in that there are no other directories like it on the internet, and therefore it actually provides a benefit to the user in the form of a directory of startups in New York." But then I would ask: why not nofollow all the links? Isn't that the whole point of the nofollow tag?
He might also say something like: they have strict guidelines and rules for admitting people into the directory. After all, it's not open to just anyone, only startups in NYC with more than 10k visitors per month to their site. But still, why not nofollow the links? They are also blatantly asking to trade links. And after all that, they still have a PR7?
The next step I ask is: should I try to get my client on this list too, so that he may benefit from the directory and the PR7 backlink? I'm thinking twice about it because I don't want to wake up one day penalized because I have a link in that directory and a reciprocal link going back to that site. If you ask me, this page is a perfect example of what Google doesn't want, and yet Google is rewarding them. So I'm not sure whether I'm the one who is wrong and Google actually likes this site because it is unique to some extent.
I would love to hear some sort of official response on this from someone at Moz, as well as from anyone else here who is familiar with this sort of situation, or who has had any success with pages like this that we are assuming are black-hat. I would love to have someone from Moz actually visit that page and give me their analysis of whether the page is breaking any rules, and why it has the PageRank it does.
-
Thanks for the laugh, Gagan. That is a really funny quote from Buffett.
I am going to go make a page like this just so my competitors will get their panties in a wad.
-
Hello Trenton,
There are many millions of sites on the web as of now, and think of how many trillions of pages that adds up to.
If a site is doing the wrong thing and you are doing the right thing, patience pays off. Sooner or later, you will see the rewards come your way.
Warren Buffett has a famous quote (though all his quotes are famous): "No matter how great your efforts in business, some things just take time. You can't produce a baby in one month by making nine women pregnant."
So my advice is to ignore that poor strategy adopted by the competition, or even by that site; sooner or later it will hit a dead end. If you still want it to happen faster, you may submit a spam report to Google about that web page: https://www.google.com/webmasters/tools/spamreportform?hl=en&spamurl=https%3A%2F%2Fwww.google.co.in%2Fwebhp%3Fsourceid%3Dchrome-instant%26espv%3D210%26ie%3DUTF-8
-
You have started to learn SEO, and you will find frustrating things like this again and again, where people do things against Google's rules and get better results than those doing legitimate things!
But don't get frustrated; stick with the things you are doing and learning. Websites that do tricky things are not going to last for long.
You will see such websites tank very shortly with Penguin (3 or 3.1 :-))!
Regards
-
I am totally flustered by this example and others like it.
On most of my clients' projects, I see gray- and black-hat SEO sites leading the SERPs.
I try to follow the best-practice advice and keep a clean shop.
Cannot wait to get more info on this.