NYT article on JC Penney's black hat campaign
-
Saw this article on JC Penney receiving a 'manual adjustment' that dropped their rankings by 50+ spots:
http://www.nytimes.com/2011/02/13/business/13search.html
Curious what you guys think they did wrong, and whether or not you're aware of their SEO firm, SearchDex. I mean, was it a simple case of low-quality spam links, or was there more to it? Has anyone studied them in Open Site Explorer?
-
Just seeing this post now. Does anyone else find it ironic that the NYT drops a follow link to JCPenney in the article?
-
Today (April 27) I see them down at #51 for "dresses". It will be interesting to see how long Google keeps them in the tank. They made a lot of money during the Christmas season that other rule-abiding retailers would like to have earned.
I think that they should be in the tank at least until the end of the 2011 Christmas season.
If I bought 100,000 links I bet my site would be out of the SERPs.
-
I figured that when this hit the mainstream press, our clients would want to be sure we weren't doing anything that wasn't above board. Interestingly, in many instances, it had the opposite effect: they wanted to know how JC Penney was having so much success...
-
I've read a lot about this around the web, but Thomas below has essentially summed it up. It's good to have these high-profile cases in the SEO world, as it reminds us all why we build links manually and by the book!
-
I guess the NYTimes article gives Google a pretty good reason for the -50 filter:
"Someone paid to have thousands of links placed on hundreds of sites scattered around the Web, all of which lead directly to JCPenney.com."
Seems like they did the majority of their link building over a year ago - http://www.majesticseo.com/reports/compare-domain-backlink-history?d0=JCPenney.com&type=0
And btw, congrats to SEOmoz for getting OSE mentioned in the NYTimes article.
-
Hey Mike: From what I read, it was a simple case of buying links, and when the NYT brought it to Matt & Co.'s attention, they manually penalized the site.
Vanessa Fox had a great write up on it at Search Engine Land.
-
Related Questions
-
Local Map Pack: What's the best way to handle twin cities?
Google is increasingly cracking down on bad local results. However, in many regions of the US there are twin cities, or cities that sit right next to each other, like Minneapolis-Saint Paul or Kansas City. According to Google's guidelines, your business should only be listed in the city in which it is physically located. However, we've noticed that results just outside of the local map pack will still rank, especially for businesses that service the home.
For example, say you have ACME Plumbing in Saint Paul, MN. If you perform a search for "Plumbing Minneapolis" you typically see local Minneapolis plumbers, then the Saint Paul outliers. Usually the outliers are in the next city over or just outside the Google map centroid. Are there any successful strategies to increase rank for these "Saint Paul outliers" so they compete with local Minneapolis results, or will the results always lag behind in favor of perceived accuracy?
We're having to compete against some local competitors that are using some very black hat techniques to rank multiple sites locally (in the map results). They rank multiple sites for the same company, under different company names and UPS store addresses. It's pretty obvious, especially when you see a UPS store in the street view of the address! We're not looking to bend the rules, but rather to compete safely. Can anything be done in this service-based scenario?
White Hat / Black Hat SEO | | AaronHenry0 -
New online store and use black hat to bring lots of sales
I have one online store where all the SEO rules are followed to increase rankings and sales. Buying a new URL and launching a new store (to sell exactly the same products) is fast, easy and cheap. How about using black hat on this new store? I think I have nothing to lose. Is there something I should know before moving ahead? Launching a new store is very cheap, and black hat can be done by one of those overseas companies at low prices. First thing, this new store should not link to my actual store, I guess. Any advice? Thank you, BigBlaze
White Hat / Black Hat SEO | | BigBlaze2050 -
Will implementing 301s on an existing domain massively impact rankings?
Hi guys, I have a new SEO client who only has the non-www domain set up in GWT, and I am wondering if implementing a 301 to www will have a massive negative impact on rankings. I know a percentage of link juice and PageRank will be affected. So my questions are: if I implement the 301, should I brace myself for a fall in rankings? Should I use a 301 to maintain link juice and PageRank? Is it good practice to forward to www? Or could I leave the non-www in place and have the www redirect to it to maintain the data? Dave
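For reference, here's a minimal sketch of a non-www to www 301 redirect in .htaccess, assuming the site runs on Apache with mod_rewrite enabled (example.com is a placeholder, not the client's actual domain):
RewriteEngine On
# Match requests whose Host header is the bare, non-www domain
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
# 301-redirect them to the www hostname, preserving the requested path
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
Either direction works (www to non-www or the reverse); the key is to pick one canonical hostname, 301 the other to it, and set the preferred domain in GWT to match.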
White Hat / Black Hat SEO | | icanseeu0 -
Google 'most successful online businesses'
How come this guy has all but one of the top ten results (UK results; I'm guessing it's the same in the USA?) with thin content on a spammed keyword across multiple subdomains? How can we 'white hat' guys compete if stuff like this is winning?
White Hat / Black Hat SEO | | TheInternetWorks0 -
Is this a white hat SEO tactic?
Hi, I just noticed that this website http://www.knobsandhardware.com hosts pages like http://www.knobsandhardware.com/local/hardware/California-Cabinet-Hardware.html that are filled with permutations of products + cities. These pages rank for those long-tail phrases. Is this considered white hat?
White Hat / Black Hat SEO | | anthematic0 -
How many times should one submit the same article to various websites? 1 time? 10 times? What is okay to do with the most recent Panda update?
In the past it was seemingly okay, for link-building purposes, to post the same article to multiple sites. However, after the most recent Panda update, our thought is that this may no longer be a good practice. So the question is: how many times is it okay to submit an article for link-building purposes? Should you always submit to only one site? Is it okay to do more than once? What is the right way to submit articles for link building in Google's eyes? Thanks
White Hat / Black Hat SEO | | Robertnweil10 -
Pages For Products That Don't Exist Yet?
Hi, I have a client that makes accessories for other companies' popular consumer products. Their own product pages rank for the other company's product name plus my client's product: for a made-up example, "2011 Super Widget" plus "Charger," so "Super Widget 2011 Charger" might be the type of term my client would rank for. Everybody knows the 2012 Super Widget will be out in a few months, and my client's company will then offer the 2012 Super Widget Charger. What do you think of launching pages now for the 2012 Super Widget Charger, even though it doesn't exist yet, in order to give those pages time to rank while the terms are half as competitive? By the time the 2012 is available, these pages would have greater authority, age, and rank, instead of being a little late to the party. The pages would be like "coming soon" pages, but still optimized for the main product search term. About the only negative I see is that they'll have a higher bounce rate and lower time on page, since the 2012 doesn't even exist yet. That seems like less of a negative than the jump start on ranking. What do you think? Thanks!
White Hat / Black Hat SEO | | 945010 -
Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
Hi All, In relation to this thread http://www.seomoz.org/q/what-happend-to-my-ranks-began-dec-22-detailed-info-inside I'm still getting whipped hard by Google; this week, for some reason, all rankings have been gone for the past few days. What I was wondering is this: when Google asks "Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?" I assume my site hits the nail on the head. [removed links at request of author] As you can see, I target LG Optimus 3D Sim Free, LG Optimus 3D Contract and LG Optimus 3D Deals. Based on what Google has said, I now think there needs to be one page that covers it all instead of three. What I'm wondering is the best way to deal with the situation. I think it should be something like this, but please correct me along the way 🙂
1. Pick the strongest page out of the 3
2. Merge the content from the 2 weaker pages into the strongest
3. Update the title/meta info of the strongest page to include the KW variations of all 3, eg. LG Optimus 3D Contract Deals And Sim Free Pricing
4. Then scatter contract, deals and sim free throughout the text naturally
5. Then delete the weaker 2 pages and 301 redirect them to the strongest page
6. Submit URL removal via Webmaster Tools for the 2 weaker pages
What would you do to correct this situation? Am I on the right track?
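As a reference for step 5, here's a minimal sketch of the redirect rules in .htaccess, assuming an Apache server and hypothetical paths for the three handset pages (the real URLs would differ; the sim-free page is assumed to be the strongest):
# 301 the two weaker keyword-variation pages to the strongest page
Redirect 301 /lg-optimus-3d-contract /lg-optimus-3d-sim-free
Redirect 301 /lg-optimus-3d-deals /lg-optimus-3d-sim-free
With the 301s in place, step 6 is arguably optional: Google will drop the redirected URLs from the index on its own as it recrawls them and follows the redirects.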
White Hat / Black Hat SEO | | mwoody0