NYT article on JC Penney's black hat campaign
-
Saw this article on JC Penney receiving a 'manual adjustment' to drop their rankings by 50+ spots:
http://www.nytimes.com/2011/02/13/business/13search.html
Curious what you guys think they did wrong, and whether you're aware of their SEO firm, SearchDex. Was it a simple case of low-quality spam links, or was there more to it? Has anyone studied them in Open Site Explorer?
-
Just seeing this post now. Does anyone find it ironic that the NYT drops a followed link to JCPenney in the article?
-
Today (April 27) I see them down at #51 for "dresses". It will be interesting to see how long Google keeps them in the tank. They made a lot of money during the Christmas season that other rule-abiding retailers would like to have earned.
I think that they should be in the tank at least until the end of the 2011 Christmas season.
If I bought 100,000 links I bet my site would be out of the SERPs.
-
I figured that when this hit the mainstream, our clients would want to be sure we weren't doing anything that wasn't above board. Interestingly, in many instances, it had the opposite result: they wanted to know how JC Penney was having so much success...
-
I've read a lot about this around the web, but Thomas below has essentially summed it up. It's good to have these high-profile cases in the SEO world, as they remind us all why we build links manually and by the book!
-
I guess the NYTimes article gives Google a pretty good reason for the -50 filter:
"Someone paid to have thousands of links placed on hundreds of sites scattered around the Web, all of which lead directly to JCPenney.com."
Seems like they did the majority of their link building over a year ago - http://www.majesticseo.com/reports/compare-domain-backlink-history?d0=JCPenney.com&type=0
And btw, congrats to SEOmoz for getting OSE mentioned in the NYTimes article.
-
Hey Mike: From what I read, it was a simple case of buying links, and when the NYT brought it to Matt & Co.'s attention, they manually delisted them.
Vanessa Fox had a great write up on it at Search Engine Land.
Related Questions
-
Infinite Scrolling on Publisher Sites - is VentureBeat's implementation really SEO-friendly?
I've just begun a new project auditing the site of a news publisher. To increase pageviews, and thus advertising revenue, at some point in the past they implemented something so that as many as 5 different articles load per article page. All articles are loaded at the same time, and from looking at Google's cache and the errors flagged in Search Console, Google treats it as one big mass of content, not separate pages. Another thing to note is that when a user scrolls down, the URL does in fact change when you reach the next article.
My initial thought was to remove this functionality and just load one article per page. However, I happened to notice that VentureBeat.com uses something similar. They use infinite scrolling so that the other articles on the page (in a 'feed' style) only load when a user scrolls to the bottom of the first article. I checked Google's cached versions of the pages, and it seems that Google also only reads the first article, which seems like an ideal solution. This has the added benefit of speeding up the page's loading time too.
My question is: is VentureBeat's implementation actually that SEO-friendly or not? VentureBeat have 'sort of' followed Google's guidelines on implementing infinite scrolling (https://webmasters.googleblog.com/2014/02/infinite-scroll-search-friendly.html) by using prev and next tags for pagination (https://support.google.com/webmasters/answer/1663744?hl=en). However, isn't the point of pagination to list multiple pages in a series (i.e. page 2, page 3, page 4, etc.) rather than just other related articles? Here's an example: http://venturebeat.com/2016/11/11/facebooks-cto-explains-social-networks-10-year-mission-global-connectivity-ai-vr/
Would be interesting to know if someone has dealt with this first-hand or just has an opinion. Thanks in advance! Daniel
White Hat / Black Hat SEO | Daniel_Morgan
-
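For reference, the pagination pattern Google's infinite-scroll guidance describes is a true series of component pages, each declaring its neighbours with rel=prev/next. A minimal sketch of generating those tags (the `?page=` parameter and URLs here are made up purely for illustration):

```python
def pagination_link_tags(base_url, page, total_pages):
    """Build rel=prev/next <link> tags for one page of a paginated
    series. base_url and the ?page= scheme are hypothetical."""
    def page_url(n):
        # Page 1 is the plain base URL; later pages get ?page=N.
        return base_url if n == 1 else f"{base_url}?page={n}"

    tags = []
    if page > 1:
        tags.append(f'<link rel="prev" href="{page_url(page - 1)}">')
    if page < total_pages:
        tags.append(f'<link rel="next" href="{page_url(page + 1)}">')
    return tags

# Page 2 of a 4-page series declares both a prev and a next page.
print(pagination_link_tags("http://example.com/articles/", 2, 4))
```

The contrast with VentureBeat's setup is that prev/next is meant to chain page 2, 3, 4 of one piece of content, not a feed of unrelated follow-on articles, which is exactly the doubt raised in the question.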
Black hat: raising CTR to rank better in Google
We all know that Google uses click-through rate (CTR) as one of its ranking factors. An idea came to mind, and I would like to know if someone has seen or tried it before.
If you search Google for the term "SEO", for example, you will see the moz.com website at rank 3. And if you check the source code, you will see that result 3 links to this URL: https://www.google.com.sa/url?sa=t&rct=j&q=&esrc=s&source=web&cd=3&cad=rja&uact=8&ved=0CDMQFjAC&url=https%3A%2F%2Fmoz.com%2Fbeginners-guide-to-seo&ei=F-pPVaDZBoSp7Abo_IDYAg&usg=AFQjCNEwiTCgNNNWInUJNibqiJCnlqcYtw That URL redirects you to moz.com.
OK, what if we used linkbucks.com or another cheap targeted-traffic network and ran a campaign that sends traffic to that URL? Would that count as traffic from Google, so that it increases the CTR from Google?
White Hat / Black Hat SEO | Mohtaref1
-
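On the mechanics: the destination of that long /url? link is just the percent-encoded `url` query parameter, as a quick decode shows (minimal sketch; the other parameters, like `ei` and `usg`, appear to be tokens tying the click to a real results page, which is one reason replayed hits from a traffic network are unlikely to register as organic clicks):

```python
from urllib.parse import urlparse, parse_qs

def google_redirect_target(redirect_url):
    """Extract the real destination from a Google /url? redirect link.
    The target is carried, percent-encoded, in the 'url' parameter."""
    query = parse_qs(urlparse(redirect_url).query)
    return query["url"][0]  # parse_qs decodes %3A%2F%2F etc. for us

# A shortened version of the redirect URL from the question above.
redirect = ("https://www.google.com/url?sa=t&rct=j&source=web&cd=3"
            "&url=https%3A%2F%2Fmoz.com%2Fbeginners-guide-to-seo")
print(google_redirect_target(redirect))
# -> https://moz.com/beginners-guide-to-seo
```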
Black SEO --> Attack
Hello there, Happy New Year to everyone, and good luck this year. I have a real problem here: I saw in Moz's link history that the "Total Linking Root Domains" has somehow grown from around 30-40 to 240-340 and keeps growing. I guess somebody is playing a nasty joke on me, because I did not buy any links :)) There are even .cn, Brazilian, and .jp links, and my store is from Romania. How can I block these links? I think Google will penalise me instead. What should I do? Thank you so much. With respect,
Andrei
White Hat / Black Hat SEO | Shanaki
-
Does showing the date published for an article in the SERPS help or hurt click-through rate?
White Hat / Black Hat SEO | WebServiceConsulting.com
-
Hackers are selling fake 'Likes' on FB, Instagram
An interesting article on how to get social media buzz: http://www.huffingtonpost.com/2013/08/16/fake-instagram-likes_n_3769247.html
White Hat / Black Hat SEO | ChristopherGlaeser
-
'Stealing' link juice from 404s
As you all know, it's valuable but hard to get competitors to link to your website. I'm wondering if the following could work. Sometimes I spot that a competitor is linking to a certain external page, but made a typo in the URL (e.g. the competitor meant to link to awesomeglobes.com/info-page/ but the link says aewsomeglobes.com/info-page/). Could I then register the typo domain and 301 it to my own domain (i.e. aewsomeglobes.com/info-page/ to mydomain.com/info-page/) and collect the link juice? Does it also work if the link points to the root domain?
White Hat / Black Hat SEO | RBenedict
-
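Mechanically, a path-preserving 301 from the typo domain is simple; here's a sketch using Python's standard library rather than the usual web-server config (domain names taken from the question, purely illustrative). Whether Google actually passes equity through a typo-squatted redirect is the real open question:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

TARGET = "https://mydomain.com"  # where the typo domain should point

def redirect_location(path):
    # Preserve the requested path so the deep link keeps resolving:
    # aewsomeglobes.com/info-page/ -> mydomain.com/info-page/
    return TARGET + path

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(301)  # permanent, the variant engines treat as a move
        self.send_header("Location", redirect_location(self.path))
        self.end_headers()

# Served on the typo domain, e.g.:
# HTTPServer(("", 80), RedirectHandler).serve_forever()
```

In practice this would usually be a one-line rewrite rule on the web server; the sketch just makes the path-preserving behaviour explicit.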
Big loss in Google traffic recently, but can't work out what the problem is
Since about May 17 my site, http://lowcostmarketingstrategies.com, has suffered a big drop in traffic from Google, presumably from the dreaded Penguin update. I am at a loss as to why I have been hit, since I don't engage in any black hat SEO tactics or do any link building. The site is high quality, provides a good experience for the user, and I make sure that all of the content is unique and not published elsewhere. The common checklist of potential Penguin problems (such as keyword stuffing, web spam, and over-optimisation in general) doesn't seem relevant to my site.
I'm wondering if someone could take a quick look at my site to spot any obvious things that need to be removed to get back in Google's good books. I was receiving around 200-250 hits per day, but that has now dropped to 50-100, and I feel that I have been penalised incorrectly. Any input would be fantastic. Thanks 🙂
White Hat / Black Hat SEO | ScottDudley
-
Is auto-linking to the same domain inside your own site white hat?
Hi, I am using a plugin in WordPress that automatically creates links for certain keywords on my site. Suppose my site is example.com and my important keyword is "sample": throughout the content on example.com, wherever the word "sample" appears, it is automatically linked to example.com. I'd like your opinion on this practice: could it carry any kind of penalty from search engines? Thanks.
White Hat / Black Hat SEO | Pooria
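For context, such plugins roughly do something like the following (a simplified sketch, not any particular plugin's code):

```python
import re

def auto_link(html_text, keyword, url):
    """Wrap the first whole-word occurrence of `keyword` in a link,
    a rough sketch of what auto-link plugins do to post content."""
    pattern = re.compile(rf"\b{re.escape(keyword)}\b", re.IGNORECASE)
    # count=1: link only the first occurrence per page, not every one.
    return pattern.sub(lambda m: f'<a href="{url}">{m.group(0)}</a>',
                       html_text, count=1)

print(auto_link("A sample page about sample data.", "sample",
                "http://example.com"))
# -> A <a href="http://example.com">sample</a> page about sample data.
```

The SEO concern is less the mechanism than the result: potentially hundreds of identical exact-match anchors pointing at your own homepage, which is why many plugins cap the behaviour at one link per page, as `count=1` does here.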