HELP: What happened to my rankings? No warning from Google, so how do I know if I was penalised?
-
Hi Guys
I have just completed a site re-design; I have 3 top-level domains.
I have no idea what's causing the drop in rankings. I changed the title tags and meta tags to improve them, as the last ones weren't really doing us justice, but I see now it has actually dropped our main keyword. I read somewhere that I should do a **site: search** to check, and I don't see our home page showing. I was ranking for the keyword "online psychics" at #6 for over 4 months, and now it's not showing anywhere in the top 50. I'm also afraid because our other keyword, "online psychic readings", which was ranked #11, seems to have dropped to #44. I have no idea why this would be the case. Our new home page offers a better user experience, and we also added more content (unique content at that; our last design was content-thin), so I have no idea why we have dropped so much in rankings. The site is also quite new, about 6 months old. I have checked WMT and have not received any penalty warnings, unless one is still coming?
Does anyone have any suggestions here?
Cheers
-
WOW thanks again Dirk, your feedback is beyond helpful! Thanks for informing me of these changes and sending me those links!
I will work on getting the above sorted asap with my developer
Cheers for all your help!!!!
Justin
-
Forgot to mention: don't forget to change it on all three sites.
The new site looks really nice compared to the old one - and speed seems to have improved. However still some work to do:
- time to first byte takes ages - this could be related to the server configuration or to a plugin causing the delay.
- ask your programmer to use gzip to compress HTML
- minify your css & js
- optimise your images
- extend your cache lifetimes (the current expiry time is too short)
The detailed result from webpage test is here: http://www.webpagetest.org/result/150503_A2_BZ7/
Also check https://developers.google.com/speed/pagespeed/insights/ - your score is not too bad, but check the improvements that are suggested.
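By the way, you can roughly check the first two items (time to first byte and gzip) yourself. This is a minimal sketch using only Python's standard library; the URL is a placeholder, not your real site, and the TTFB number is approximate since it includes connection setup:

```python
import time
import urllib.request

def check_speed_basics(url):
    """Roughly measure time to first byte and check whether the
    server serves gzip-compressed responses when asked to."""
    req = urllib.request.Request(url, headers={"Accept-Encoding": "gzip"})
    start = time.time()
    with urllib.request.urlopen(req) as resp:
        resp.read(1)  # wait until the first byte of the body arrives
        ttfb = time.time() - start
        # "none" means the server ignored our Accept-Encoding header
        encoding = resp.headers.get("Content-Encoding", "none")
    return ttfb, encoding

# Usage (against your own site; placeholder URL):
# ttfb, encoding = check_speed_basics("http://www.example.com/")
# print(f"TTFB: {ttfb:.2f}s, Content-Encoding: {encoding}")
```

Tools like webpagetest.org give you far more detail, but a script like this is handy for re-checking quickly after each server change.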
Good luck!
Dirk
-
Hi Justin,
Normally everything should return to normal after a few days. You could try to speed up the process a bit by taking your key landing pages from Analytics and fetching them in Webmaster Tools (Fetch as Google) - once they are fetched, submit them to the index (of course, you first have to remove the noindex tag).
It's quite a common mistake - we had a similar case where a test robots.txt was put into production. I was on holiday at the time and only noticed the error when I returned (3 days after go-live). Everything returned to normal within a day.
rgds,
Dirk
-
Oh wow hey Dirk, silly me! Yes thank you for that oh how embarrassing.
I had told my developer to remove this last week - seems he hadn't and I didn't bother to check it! After I remove this should all be back to normal again?
-
Hi Justin
I fear that you have migrated your test settings to production - you have 196 HTML pages on your site, and 115 of them have a `<meta name="robots" content="noindex, follow">` tag in the head section. This is also the case for your homepage and several other pages that seem important for SEO. Removing the tag will certainly help.
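If you want to audit this yourself (and re-check after your developer fixes it), a small script can flag pages that still carry the tag. A minimal sketch using only Python's standard library; the URLs in the commented usage are placeholders for your own key landing pages:

```python
from html.parser import HTMLParser
import urllib.request

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tag on a page."""
    def __init__(self):
        super().__init__()
        self.robots_directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            self.robots_directives.append((attrs.get("content") or "").lower())

def find_noindex(html):
    """Return any robots directives on the page that contain 'noindex'."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return [d for d in parser.robots_directives if "noindex" in d]

# Usage with a hypothetical list of pages to audit:
# for url in ["http://www.example.com/", "http://www.example.com/about/"]:
#     html = urllib.request.urlopen(url).read().decode("utf-8", "replace")
#     if find_noindex(html):
#         print(url, "is blocked by a noindex robots meta tag")
```

Run it over your sitemap URLs on all three domains and you get an instant list of blocked pages instead of waiting for WMT to catch up.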
Dirk