Showing strong for last 2 years for search terms, NOW GONE! What happened?
-
Hi All! I have a 9-1-1: my website www.popscrap.com has been showing strong (top 2) for about 2 years now for many of the search terms we are targeting (scrap software; scrap metal software; recycling software; etc.), and I just noticed today that we are nowhere. What do you suggest for troubleshooting this to find the cause and fix it?
Thanks!
-
Well, I removed the suspect content, and after 2 weeks, nothing. Then I added Google Authorship to each page, and the NEXT DAY the site is back in the top positions for our target terms, and the leads are pouring in. Was it the Google Authorship? It certainly felt like it. But I thought that was not a ranking factor.
Anyway, thanks for all the support! BB
-
On a quick look my gut instinct is that this is ok. However, on a site: search I'm seeing that you have over 19,000 pages indexed in Google. That's a bit of a Panda flag for me as most likely there are not 19,000 unique pages that add value on your site.
-
Thanks for the response, Marie
I asked the question as I was wondering whether I'd need to add "boilerplate" text to each description to fill it out. I'd rather not as a) it's not very scalable and b) I'm not sure it would add value to our users per se, as in the main people want to see pictures. Here's an example of one of the shorter descriptions we run.
-Is the content the same MLS description that is on multiple sites? Of the 4,500 pages, 95-98% are content that's unique to our site (the other ~2-5% are managed by individual realtors who I'm guessing probably copy and paste descriptions from their own sites; we're not in the US, so we aren't part of the MLS network).
-Do users engage with your content? Mos' def.
-
It's hard to say what Google views as thin. Here are some factors I would consider when making that decision:
-Is the content the same MLS description that is on multiple sites? If so, then I'd noindex it.
-Do users engage with your content? Short content can be useful. If Google sees that people are actually engaging with your site then they will have no problem with thin content.
It sounds to me like these pages are probably ok. But I can't say for certain.
-
"Thin content" question:
I run a real estate website and carry about 4,500 property pages (each page consisting of between 5-13 photos and about 50-300 words of a property description). Might the pages of ~50 words run the risk of being deemed "thin content" even though they have photos on them?
I also have around 200-250 article pages that are far more text-heavy.
FWIW, I don't think I've been hit by Panda 4.0. (I've slid from about #8 to #12 over the past 2 weeks, but I suspect that's more to do with sluggish content marketing/link-building.)
-
If unpublishing causes the pages to either be removed from your site or noindexed then yes, that's the same thing.
-
Thank you! But what about unpublishing? Is that the same thing as removing, in the eyes of Google? I want to remove ALL pages under the "Scrap Laws" menu, because I think that is where the issue is. But I don't want to delete them totally and have to recreate them all later. Thanks again!
-
While you can test this over time, it would be difficult because you will never know if you've done enough to satisfy Panda. And really, you don't even know for sure if Panda is the culprit. (I think it is, but no one can say for sure.)
So, let's say you took out some of the low quality content and a month later nothing has changed. That could mean that you didn't take out enough to make the Panda algorithm see your site as high quality. But, it could be that you just need more time. While some sites recover within one Panda refresh (and that usually happens approximately monthly), others seem to need several refreshes.
In regards to unpublishing vs deleting the content, you can either delete the pages or you can use a noindex tag to tell Google not to include the pages in the index. Having low quality pages on your site that are noindexed will not hurt you in the eyes of Panda.
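For reference, a "noindexed" page is just one whose HTML head carries a robots meta tag with a noindex directive. Here's a quick illustrative Python check (the sample HTML string is made up, not from the actual site) you could use to verify that pages you've "unpublished" really are excluded:

```python
# Sketch: detect whether a page's HTML carries a robots noindex
# directive. Uses only the standard library; function names are
# illustrative, not from any particular SEO tool.
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.directives.append(attrs.get("content", "").lower())

def has_noindex(html: str) -> bool:
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in d for d in parser.directives)

page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(has_noindex(page))  # True
```

You'd fetch each unpublished URL and run its HTML through a check like this (or just view source and look for the tag by hand).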
-
Thanks Marie! I'm getting the feeling it's the content. Quick question: Could I just unpublish the content and then test over time, OR do I need to completely delete the questionable content from the site? Does Google see it if it is unpublished and still penalize?
-
There were two major algorithm updates last week - Panda and the Payday loans algorithm. Payday loans affects sites that had done really spammy link building and it is very unlikely that this affected you. But, Panda is certainly possible.
I haven't had a good look at the site, but I see that you have 263 pages indexed in Google. Are all of these pages high quality pages that Google would be proud to show to searchers? If you've got duplication amongst the pages or if you've got "unhelpful" pages that are indexed then you need to remove or noindex them. On a quick look here are some examples of pages that should be removed or noindexed:
http://www.popscrap.com/component/content/category/11-demo-articles
http://www.popscrap.com/component/users/?view=remind
http://www.popscrap.com/24-products/120-scrapshield - It looks like a good amount of the text on this page is on multiple pages of your site.
Of course, there could be other issues. If you've made any changes to the site recently then I'd look at those changes first, but otherwise I'd go on a thorough cleanup so that only the pages that are the best are shown to Google.
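If you want to gauge how much of a page's text is repeated on other pages of the site, a rough way to compare two pages is word-shingle overlap. This is only a sketch with made-up sample strings (not the actual ScrapShield copy), and any flagging threshold you pick would be an assumption:

```python
# Sketch: flag near-duplicate page text using word-shingle
# Jaccard similarity. A score near 1.0 means heavy overlap.
def shingles(text: str, k: int = 3) -> set:
    """Return the set of k-word shingles in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a: str, b: str) -> float:
    """Jaccard similarity of the two pages' shingle sets."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

page_a = "ScrapShield protects your scrap yard from liability and theft"
page_b = "ScrapShield protects your scrap yard from liability and loss"
print(similarity(page_a, page_b))  # 0.75
```

Run every page pair (or each page against a template) through this and the pages whose scores cluster high are the ones worth rewriting or noindexing first.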
-
To help figure out what is causing the 404 errors, do the following in Webmaster Tools:
-Log in to your website's profile, then in the left-hand navigation hit Crawl > Crawl Errors > Not Found. Under Not Found, review the list of URLs for clues (you can also click on an individual link to see where the 404 page was linked from). Depending on how large your site is, if the 747 not-found URLs are a large percentage of your total page count, you could be experiencing a temporary rankings drop that will disappear once you fix your error pages. If you could add a link to a few of the 404 error pages, we could help you figure out what is wrong with your site code or server setup.
-
Just my two cents friend..
4 days back, Google released Panda 4.0. You can check if that caused the drop.
Here is a tool that can help you find if any of the Google penalties are behind the drop:
http://www.barracuda-digital.co.uk/panguin-tool/
Once on the page, click on the 'Log-in to Analytics' button and allow the tool to access your Google Analytics account and check if the recent Panda caused the drop. Hope this helps.
Good luck. By the way, thin content is of no use these days; you should be investing your time in producing quality content.
Best,
Devanur Rafi
-
I looked at some of your content, and some of it seems quite thin, such as the regulations for each state. There are really only a couple of sentences (in the instances that I saw) that deal with the individual state, and then there's a lot of boilerplate content, navigation, and other site elements that are the same from page to page. Just one more thing to think about.
-
It looks like Google penalized you. It happened to one of my websites in January; I was going nuts because I didn't see any message until 2 weeks later in my Google Webmaster Tools. I would recommend waiting a couple of days to see if a message shows up; if not, then check your links. If a couple of the websites you are linking to got penalized, you can get in trouble too.
-
Kevin, any insight into where to start with respect to the 747 missing URLs? What causes that? How to fix? Thanks!!
-
haha! Ok! Thanks Kevin!
-
No, no. My bad. You mentioned above that you've been ranking strong for two years, and then when I peeked at your site I saw the RT template. I wrongly assumed the Joomla template was released at the same time as the Magento template (I actually use the same exact template for Magento at www.88k.com.tw, although heavily modified). I was just thinking that if you had done a site revamp with a new template, that might be a factor in your recent bump off the SERPs. Sorry to worry you about that. But it looks like you found an issue with the 404 errors. Good job.
-
Also, I just noticed this (see image). 747 missing URLs!?
-
What do you mean by "it's not 2 years old"? Is being under 2 years old a factor?
-
Thanks! Yes, it's Google. We actually are ranking better on Bing and Yahoo now!
Looked at Google Webmaster and it shows a steep drop on 5-21. (image attached)
-
A couple of things I'd do right away:
Look in Google Webmaster Tools to see if there are any notices there (I'm going to assume that it's Google where you are no longer ranking).
Look in your analytics to see if there was a particular day that you dropped off. You can then look to see if that coincided with any known algorithm update.
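You can do that cross-check by hand, or with a tiny script against a hand-maintained list of update dates. The dates below are approximate and only cover the two updates mentioned in this thread; verify them against a current changelog before relying on them:

```python
# Sketch: match a traffic-drop date against known algorithm
# update dates. The dates listed are approximations.
from datetime import date

KNOWN_UPDATES = [
    (date(2014, 5, 20), "Panda 4.0"),
    (date(2014, 5, 16), "Payday Loan 2.0"),
]

def nearby_updates(drop: date, window_days: int = 3):
    """Return updates that landed within window_days of the drop."""
    return [name for when, name in KNOWN_UPDATES
            if abs((drop - when).days) <= window_days]

print(nearby_updates(date(2014, 5, 21)))  # ['Panda 4.0']
```

A drop on 5-21 sitting one day after a Panda refresh is exactly the kind of coincidence you're looking for.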
-
My bad. Looks like it is. It was released for Magento only late last year.
-
Always great to help out a fellow Rocketeer! Did you recently update your website? That template is not 2 years old, and this could certainly be a factor.
-
Thanks, Kevin. I haven't made any changes in months, and do not do any crazy linking schemes. Competitors seem to be at the same places on the page. We are the only one hit by this.
-
That's a tough one without more to go on. Google releases updates to its ranking algorithm every so often, and some sites get hit hard. If your content hasn't changed and you haven't engaged in any unusual activity in terms of link building or advertising, then I'd say wait it out. Give it a week or two, which is how long it's taken many other quality sites to bounce back from a Google update. It's unlikely you'll have issues here, but you still might want to check your Webmaster Tools to see if any manual actions have been applied.
This might be a good time to go over your site, again, for the first time;-) See what could be done to answer visitor questions and lead them to the right pages.