Major Ranking Drop
-
My site bestdslrreview dot net was ranking number one for "Canon T3i Review." I was amazed, considering my competition. "Canon T3i" was usually at number 3 to 5 for US traffic, "Canon 7D review" was ranking number 2, and "Canon 7D" was on page one.
The Canon 7D keywords dropped about 10 places, but "Canon T3i Review" dropped about 50, and the same for "Canon T3i."
I worry about a number of things, including being a thin affiliate site, but I don't know; my articles are original. I also worry about having too many links with my keywords in the anchor text.
My on-page results from SEOmoz look great to me. This drop was my main reason for going ahead and signing up, hoping I could find out what happened.
The site is only about six months old. I waited probably three months before doing any significant link building.
The good thing is, I was ranking high from Black Thursday until a few days before Christmas. It can't be that someone else just started doing SEO, as I fell far too far. It must be some kind of penalty. Google did this to another of my sites once, though, and after a couple of months it bounced back for most of my keywords. That was a site very much like this one.
Is this just the Google Dance?
Thanks
-
Oh, I laughed about blaming it on the kids, but personally I think it's more likely the dog. Haha.
-
I don't THINK I have a thin content issue. But I'll rewrite those features just to be on the safe side.
I think I built too many links too fast, thousands of them, and then stopped. I won't do that again. I think it's a sudden decrease in link velocity. I've struggled with link velocity a lot, and finally I think I understand why it's important to keep it up.
You touched on it. To me it could look unnatural, but like I said, a small site could be popular and get lots of links. Another SEO expert told me that's true, but if the backlink velocity drops too fast, Google may take that as a signal that the page is no longer popular, on top of the possible unnatural linking problems.
I will continue with my blog network, slower but with better quality than the others I built. However, I will also work on the duplicate content issue. I have lots of content with no links just to help with the thinness, but now I realize that a page could still be thin anyway. Things will be a bit better if I fix that too.
As always, I'm all ears.
-
I don't mean that duplicate content on the same site is OK, only that you won't be penalized for it. But only one version of the duplicate content will rank, meaning you won't get any credit for the others. The search engines will pick one version to rank; using a canonical tag, you can hint to them which version that should be.
The penalty you want to avoid is for thin content, meaning what is left on the page after you take away advertising and duplicate content found elsewhere on the web. If what is left is not substantial and useful, then it is thin content. Duplicate content won't hurt you in itself, and it won't help you, but thin content will hurt you.
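For example, if the same review is reachable at two URLs, the canonical hint is a single link tag in the head of the duplicate page pointing at the preferred version. A minimal sketch (the URLs here are hypothetical, not from your site):

```html
<!-- Placed in the <head> of the duplicate page, e.g. the category copy -->
<!-- Hints to search engines which URL should rank for this content -->
<link rel="canonical" href="http://example.com/canon-t3i-review/" />
```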
-
Yes, I made them noindex, follow. I've now removed the nofollow on categories and tags, but that's the way it was before the drop.
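For reference, the tag and category pages would carry something like this in the head (a standard sketch of the setting, not copied from my site):

```html
<!-- noindex: keep this page out of the index -->
<!-- follow: still crawl its links and let link equity flow through -->
<meta name="robots" content="noindex, follow" />
```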
As for the duplicate content, only the bullet text is duplicated. That's 105 words out of 890, about 12%, and that's too high for me. I'll rewrite that and resolve it just to be on the safe side. It's really two of the bullets that are causing it, but I'll change them all; looks like I got lazy there. I'll check the other articles on the site as well to make sure I didn't do that on those.
The duplicate content issue is as clear as mud. Now, I know Google says they don't penalize a site for duplicate content found on other sites. Of course, that doesn't hold true for autoblogs, and it doesn't seem to be true when one considers backlinks either. I'm uncertain. You seem very knowledgeable, and I'm sure you are, but what are you basing the statement on that onsite duplicate content will not hurt you? I'm pretty sure I've read on Google's own site that they HATE that. The reason is that the same page can show up in the SERPs more than once, and I clearly had that going on before.
Yeah, it was Matt Cutts; I just couldn't remember his last name.
I'm really having trouble with this issue, and based on what I've seen around the web, I'm not alone. I've got one fairly large site with over 1,500 indexed pages. Google Webmaster Tools showed a LOT of pages with duplicate titles, and it took me forever to get those to disappear. Finally they did, after I noindexed categories and tags.
I'm having a lot of trouble with canonical URLs, and I know I'm not alone. I've read much and understood little. I learned the most from an SEOmoz help article.
If I understand it, the intent is to tell the search engines where the main article is. I turned the canonical option back on in SEO Platinum but then turned it off again, and I'll show you why: it gave me two different canonical links in the header, one for my category and one for my article. That seems counterproductive to me and exactly what I don't want to happen.
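In other words, the plugin was emitting something like this (reconstructed from memory, with hypothetical URLs):

```html
<!-- Two conflicting canonical hints in one <head>; search engines may ignore both -->
<link rel="canonical" href="http://example.com/category/reviews/" />
<link rel="canonical" href="http://example.com/canon-t3i-review/" />
<!-- There should be exactly one, pointing at the article itself -->
```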
Thanks for all of your help.
I probably shouldn't have marked your message as a Good Answer, as that marked the thread answered. LOL, oh well, I'll know next time. I don't want to take up all of your time.
Thanks very much for the help.
-
I noticed you made them follow; that is good, because link juice can still flow back out. But the fact that a page is not indexed does not change the flow of link juice to it. If you have two duplicates, then use a canonical tag, or get rid of one of them. Things like tag pages with snippets of other pages are OK; at worst they won't rank, but they will not hurt you. It's duplicate content from other sites that is the problem.
As a rule, blocking from robots.txt is a last resort. I would concentrate on the duplicate content from other pages; you need to add enough original content to make the page useful. As Matt Cutts said: take away the advertisements and the content that is available elsewhere, and what is left, is it useful and original?
I wouldn't bother with the tags in the sitemaps; category pages, yes. Bing, for one, says they only want the main pages in the sitemap, not every page. Having said that, I asked Duane Forrester from Bing, and he said that for a small site, listing all pages is not a problem.
We have all been in panic mode when we lose rankings. I am a computer programmer, and when things are not working I start to believe anything and everything is responsible. I will uninstall things, delete things, blame my kids, assume I have a virus, but when I settle down I usually find it's something simple.
-
As users build links to a website or URL, we see that pattern emerge and can track the time over which links are built. Fast growth can indicate popularity. Normally links build in a similar fashion across all websites, with spikes potentially indicating popularity. When we compare that data against the other signals we track, we get a truer picture of whether that popularity is real or not. Machines can build links quickly, but if we know the source of the links is a common location or service that is “less than organic”, we’ll discount the value of the links.
-
I'm really glad you brought up the noindex issue.
Most of that was probably created after the drop, and I'll tell you why I did it, to hopefully learn why I shouldn't.
I noindex the privacy policy and disclaimer/earnings type pages. I do that because I don't want those pages to dilute my money pages. Is that a mistake?
I also usually set tags and categories to noindex from the beginning. I'm not sure when I did that on this site, but I think it was after the drop. The reason I do it is that they seem to create duplicate listings in the SERPs; or maybe I was just afraid of duplicate content. Is my thinking wrong on that? I see the same pages/articles listed when I do site:domain.com, and sometimes I get two listings for the same page on page one of Google: one for the article and one for the tag or category. I use SEO Platinum to accomplish this. I also removed categories and tags from my sitemap using the XML sitemap generator plugin.
On the scripts, now that is something totally new to me. I'm pretty sure I did that after the drop; when I did site:domain.com I saw a lot of them listed. My robots.txt file looks like this:
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Sitemap: http://bestdslrreview.net/sitemap.xml
This is likely what is blocking the scripts/plugins. So should I remove those two lines? I'm going to go ahead and do that, since I was never sure they should have been there in the first place.
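If I do remove them, the file would be reduced to just this (an empty Disallow line means nothing is blocked; the sitemap reference stays):

```text
User-agent: *
Disallow:
Sitemap: http://bestdslrreview.net/sitemap.xml
```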
Also, I do not include categories and tags in my sitemap; should I start including those?
Thanks very much for the help. Sometimes I overthink when I don't know exactly what the answers are.
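If I did add the categories back in, I assume the generator would just append entries like this to the sitemap (URL hypothetical):

```xml
<!-- One <url> entry per category page in sitemap.xml -->
<url>
  <loc>http://bestdslrreview.net/category/reviews/</loc>
  <changefreq>weekly</changefreq>
</url>
```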
-
Thank you. I think it will come back too, but the too-many-links issue...
I have always wondered about that, and I seem to get different advice.
I forgot his name, the Google guy, Matt ??? (most of you will know). He wrote a post indicating you can't link too many times. That is what most of the people who say you can't build too many links too fast base their advice on. But I've always wondered if it makes sense to have 2,000 links to a site with five articles. Then again, I reason people could love a site with only a few pages; I'm sure that happens naturally at times?
If you're right, then I've made it worse, because I've been building more links but aiming at higher-PR sites when I do it. I've got a lot of links from PR 0 sites, but in Open Site Explorer it appears I have a pretty natural-looking distribution of links across different PR sites, so I'm not at all sure.
I'm all ears if you and others have more to say to help me with this issue.
-
There are a few pages with noindex. They seem to have content, so I can't see why you would noindex them, like this one:
/tag/canon-7d/
You have also blocked a few scripts. I believe this is a mistake, because a search engine knows the scripts are being used but does not know what you are doing with them, which may lead to a lack of trust. There are a few other technical issues, but not ones that would lead to a huge drop in ranking.
You do have a fair bit of duplicate content; the reviews are on other pages. If you wrote the reviews, I would not worry; search engines usually know who is copying and who is the creator.
If you are getting this content from elsewhere, you need to add enough content to each page so that it appears original.
-
So you are saying you have only been building links for 3 months, but looking at OSE, the Canon T3i page has 425 linking domains and the Canon 7D page 751 linking domains? That is way too many for that time frame, IMHO. I would say you have tripped a filter. Wait 60 days and see if it comes back up... it should do.