How to unrank your content by following expert advice [rant]
-
Hi,
As you can probably see from the title, a massive rant is coming up. I must admit I no longer understand SEO, and I just wanted to see if you have any ideas about what might be wrong.
So, I read this blog post on Moz https://moz.com/blog/influence-googles-ranking-factor - where the chap improves the rankings of content that is already ranking reasonably well.
I've got two bits of news for you. The good news: yes, you can change your articles' rankings in an afternoon.
The bad news: your articles drop out of the Top 100.
I'll give you a bit more detail, hoping you can spot what's wrong.
Disclaimer - I'm not calling BS; I'm sure the blogger is a genuine person and he's probably had success implementing this.
The site is in a narrow but popular ecommerce niche where the Top 20 results are taken by various retailers who have simply copy/pasted product descriptions from the manufacturers' websites.
The link profile strength varies, and I'm not making this up: the Top 20 sites range from DA:4 to DA:56. When I saw this, I said to myself that it should be fairly easy to rank, because surely the backlink ranking factor isn't weighted as heavily in this niche as it is in others. My site is DA:18, which is much better than DA:4. So, even if I make my pages a tiny bit better than the DA:4 site's, I should outrank it, right?
Well, I managed to outrank it with really crap content. I got two high-traffic keywords ranking at #8 or #9 with very little effort. And I wish I'd stayed there, because what followed completely ruined my rankings.
I won't repeat what was written in the blog post. If you're interested, go and read it. I used it as a blueprint and, bingo, Google indeed changed my rankings in just a couple of hours.
Wait - I lost more than 90 positions! I'm now outside the Top 100. Even irrelevant sites in Chinese and Russian are ahead of me. They don't sell the products; they're in different niches altogether, yet they still outrank me.
I now know exactly how Alice in Wonderland felt. I want out, please!
-
Hi there,
I know it's been a while, but were you able to figure it out? What happened after you requested a fetch?
I'd love to do some type of case study on this with you if you still haven't recovered.
-
Why was this marked as "answered"? I don't know what I'm more shocked by - my discovery about Google or the lack of reaction here...
-
Update: I checked the Google cache for all my experiment pages, and it was out of date. So "Fetch as Google" adds the page to a queue, and it takes several days for that queue to be processed. Judging by my recent experience, Google withholds queued pages from the index until it has had a chance to re-crawl them. I don't think I have a problem with that in itself, but...!
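If you want to spot-check cache dates across a batch of pages rather than clicking through them one by one, a rough sketch like this works. Note the assumptions: the webcache URL format and the "as it appeared on" banner text are based on how Google's cache pages looked at the time and can change or be blocked at any point, and the example URLs are placeholders.

```python
# Rough sketch: spot-check Google's cache date for a batch of pages.
# Assumptions: the webcache.googleusercontent.com URL format and the
# "as it appeared on ..." banner text - Google can change or block both.
# The page URLs below are placeholders.
import re
import requests

PAGES = [
    "https://example.com/product-a",
    "https://example.com/product-b",
]

for page in PAGES:
    cache_url = "https://webcache.googleusercontent.com/search?q=cache:" + page
    resp = requests.get(cache_url, headers={"User-Agent": "Mozilla/5.0"}, timeout=10)
    if resp.status_code != 200:
        print(f"{page}: no cached copy returned (HTTP {resp.status_code})")
        continue
    # The cache banner reads like "... as it appeared on 12 Jan 2018 10:23:45 GMT."
    match = re.search(r"as it appeared on ([^.]+)", resp.text)
    print(f"{page}: cached {match.group(1).strip() if match else '(date not found)'}")
```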
All my experiment pages are now back exactly where they were (no, dear Mr Jeff Baker, no improvement whatsoever). However, I still think this is really bad news, and here's why:
- Don't put too much faith in "experts", even when they publish on authoritative sites. That's why it's important to keep a few test sites, so you can test theories before applying them to your client sites.
- Content is not king. You won't be able to increase your rankings purely by providing better content, without improving other factors.
- If you have a stronger link profile, you can afford to serve your visitors crap content
Indirectly, Google is encouraging people to buy links. The next thing I'm going to do after hitting the "Post" button on this page is contact my link broker, whom I stopped using in 2013. Who is the winner in this situation? Nobody, apart from my link broker. And I'll tell you who the main loser is - the visitor, who will be served crap content, because from today I'm going to stop caring about providing valuable content. Thank you very much, Google!
Conveniently, comments close 30 days after Moz guest posts go live, preventing people from calling out BS. Well, I guess it keeps the circle of friends happy, which is the most important thing, right?
-
Thank you, I did check Copyscape and it is not copied.
The original post is here: https://moz.com/blog/influence-googles-ranking-factor - I did check a few pages on the author's own site (Brafton), and his articles are ranking reasonably well. I wouldn't say amazingly - it's hit and miss - but some of his own content ranks pretty well.
So, I can now officially confirm that I'm NOT going mad. I have since done another two experiments and both backfired spectacularly!
Experiment 1 - improve the product description of another product. For the previous experiments I had also embedded useful YouTube videos in the descriptions. To rule out the iframe as the cause of the ranking drop, I didn't include a video this time. I just took the description from 56 words of nonsense to 300+ words of content that answers buyers' questions.
Tank!!! From #9 to #Nowhere
Experiment 2 - a completely different website with a different audience, a different link profile and different buyer intent (a lead gen site). I identified a static page (not a blog post, not a product description) that ranked #17 for a super-popular lead gen keyword. It had very average 500-word content. I improved the structure (H1-H3) and added 400 fresh words based on real-life questions that this business receives from potential customers. Sounds useful, right? Sure it does. Google downgraded the page to #27.
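As an aside, when changing heading structure it's worth snapshotting the outline before and after, so there's no arguing later about what actually changed. A minimal sketch (Python with requests and beautifulsoup4; the URL is a placeholder):

```python
# Minimal sketch: print a page's H1-H6 outline so you can snapshot the
# heading structure before and after an edit. The URL is a placeholder.
import requests
from bs4 import BeautifulSoup

def heading_outline(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup.find_all(["h1", "h2", "h3", "h4", "h5", "h6"]):
        level = int(tag.name[1])  # "h3" -> 3
        print("  " * (level - 1) + f"{tag.name.upper()}: {tag.get_text(strip=True)}")

heading_outline("https://example.com/lead-gen-page")
```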
I am massively worried now. I think I'm giving up SEO, and I'm not joking. If this is how Google rewards valuable content, my only other option is to make a really crap spam site with copied content. But hey, I just don't want to do that.
I will do another experiment: I will revert one of the product descriptions back to the super-crap content it used to have. However, knowing Google, I doubt I'll regain the positions. I will report back.
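To keep that experiment honest, something like the sketch below would log positions daily instead of relying on memory. The get_position() helper is a hypothetical stub you'd wire to whatever rank-tracking API you actually use - scraping Google's results directly is brittle and against their terms of service - and the keywords and URL are placeholders.

```python
# Sketch: append today's positions to a CSV so the revert experiment has a
# proper before/after record. get_position() is a hypothetical stub - wire
# it to your own rank-tracking data source. Keywords and URL are placeholders.
import csv
from datetime import date

KEYWORDS = ["example keyword one", "example keyword two"]
TRACKED_URL = "https://example.com/product-a"

def get_position(keyword, url):
    """Hypothetical helper: return the URL's SERP position for the keyword,
    or None if it is outside the Top 100. Plug in a real data source here."""
    return None  # placeholder

with open("positions.csv", "a", newline="") as f:
    writer = csv.writer(f)
    for kw in KEYWORDS:
        pos = get_position(kw, TRACKED_URL)
        writer.writerow([date.today().isoformat(), kw, pos if pos else ">100"])
```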
In the meantime, if you've had a similar experience, I suggest we join forces and challenge Google's staff on Twitter or other social media.
-
Do you have a link to the post? It would be interesting to have a look at it now, compare it to the Wayback Archive version, look at the differences, etc. You could also run it through Copyscape to see whether, by random chance, you've typed something very similar to something already well known on the web :') Unlikely, but... monkeys in a room with typewriters, and all that. There are any number of variables that could have contributed to this, or it could be a legitimate Google glitch.
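If it helps, that Wayback comparison can be scripted. A rough sketch using Archive.org's public availability API to grab the closest snapshot and diff its visible text against the live page (the page URL and date are placeholders):

```python
# Rough sketch: fetch the closest Wayback Machine snapshot via the public
# archive.org availability API and diff its visible text against the live
# page. Requires requests and beautifulsoup4; PAGE and BEFORE are placeholders.
import difflib
import requests
from bs4 import BeautifulSoup

PAGE = "https://example.com/product-a"
BEFORE = "20180101"  # a date before the content change (YYYYMMDD)

def visible_text(url):
    soup = BeautifulSoup(requests.get(url, timeout=15).text, "html.parser")
    return soup.get_text(separator="\n", strip=True).splitlines()

avail = requests.get(
    "https://archive.org/wayback/available",
    params={"url": PAGE, "timestamp": BEFORE},
    timeout=15,
).json()
closest = avail.get("archived_snapshots", {}).get("closest")
if closest and closest.get("available"):
    diff = difflib.unified_diff(
        visible_text(closest["url"]), visible_text(PAGE),
        fromfile="wayback", tofile="live", lineterm="",
    )
    print("\n".join(diff))
else:
    print("No Wayback snapshot found near that date.")
```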
-
Hi Alex,
I took two product descriptions. Both pages were very similar - just a couple of sentences taken from a manufacturer's brochure.
I went through competing pages with a critical eye, made a list of topics that would match the buyer's intent and crafted original product descriptions that answered a lot of questions.
So I took each one from less than 100 words of nonsense to nearly 500 words of in-depth, human-written content.
I didn't do anything else at the same time, because I was keen to find out how much truth there is in the "give Google amazing content" lie. My thinking was that all the competitors ranked with copied product descriptions, so if I provided original descriptions I'd be rewarded, all other things being equal.
As for reverting, no, I didn't revert. I don't know why - I probably don't see the point, because rankings now seem completely random.
I've had lots of success both before and after the 2012 "scare campaigns"; however, in the last year or two it's been sliding downhill, slowly but steadily, and I have no idea why.
-
Hi,
I find that very interesting. Obviously, the post was a bit of a plug for Marketmuse, but I always felt that the underlying advice was solid.
Are you saying that you simply went from a very focussed, single-topic page to a more in-depth article and found that you lost rank? Was there anything else you did around the same time? I assume you have now reverted your content? (And has that had any effect?)