Did our shared articles cause a Panda hit to our high-quality site?
-
Hello,
We run a quality site that was hit by Panda.
Our article collection:
http://www.nlpca(dot)com/DCweb/NLP_Articles.html
consists partly of articles written by the site owners and partly of articles that also appear elsewhere on the web.
We have permission to post every article, but I don't know if Google knows that.
Could this be why we were hit by Panda? And if so, what do we do? We dropped way down in the rankings but have worked our way about halfway back up.
Two of our main keywords are:
NLP
NLP Training
Thanks!
-
You have some valid points to consider... things seem to be improving and the articles that you might cut do pull in some traffic.
I can't tell you how to make your decision, but here is how I made mine...
I had hundreds of republished articles but a lot more that I had written myself. Deleting lots of republished articles would cut my traffic and my income; noindexing them would do the same. However, although those losses were serious, they were small compared to the other content on my site. So, knowing that Google does not like duplicate content, I got rid of them. There is still lots of great content on my site, visitors still find plenty to read, and I know which of the cut pieces are worth replacing with customized versions written for my own site.
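For anyone unsure of the mechanics, noindexing a page means adding a robots meta tag to its head; a minimal sketch (the "follow" value lets crawlers still follow the page's links even though the page stays out of the index):

```html
<!-- Keeps this page out of Google's index but lets the crawler follow its links -->
<meta name="robots" content="noindex, follow">
```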
The upside... My site is more compact but still has thousands of content pages, and the content that remains should be a lot stronger. After making the cuts, my rankings, income and traffic increased. Not quite to previous levels, but back to nice numbers.
I have reduced risk and am pleased with that. Everything that I cut was redirected to similar content, and the most valuable of what was cut will be replaced with custom content, with 301 redirects from the old URLs.
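On an Apache server, those 301s can be set up in .htaccess with mod_alias; a minimal sketch, with hypothetical paths standing in for the real URLs:

```apacheconf
# Permanently redirect a removed republished article to the closest related page
Redirect 301 /articles/republished-article.html /articles/similar-topic.html
```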
============================
How likely is it that this list of 60 articles (out of 200 pages) has caused, or will cause, a major problem with past or future Panda updates? 17 of the 60 are written by us, a few were written for us, and several more show up as only ours when you type the title into Google surrounded by quotes.
What in this list is unique? Definitely keep that. Keep what is not struggling in Google. Keep what is essential to your site, but replace it with better content that you create yourself.
Do you see further risk in future panda updates?
Yep... that's why I cut off my foot.
My thoughts are to add rel=author to each of our own articles,
YES... In the past I wanted all of my content to be anonymously written. I have changed my mind on that and used rel=author on the best stuff.
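At the time of this thread, authorship markup meant linking the article to the author's Google+ profile with rel=author, and linking back from the profile's bio page with rel=me; a sketch with a hypothetical profile URL:

```html
<!-- In the article page: attribute the content to the author's Google+ profile -->
<a href="https://plus.google.com/112345678901234567890" rel="author">About the Author</a>

<!-- On the author's bio page: link back to the same profile to confirm identity -->
<a href="https://plus.google.com/112345678901234567890" rel="me">My Google+ profile</a>
```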
no-index the duplicates between our 3 sites (We have 3 sites that share a few articles) and no-index the remaining articles.
heh... Here I would chop off two of those sites and merge everything into one. I would have done that years ago, before Panda was ever heard of.
I think that the drop in traffic will be outweighed by the reduced risk of current or future ranking drops.
I agree.
-
Hi EGOL,
We are getting a lot of traffic from some of these articles, so the site owners are not sure they want to noindex them, just in case that's not what is causing the problem. Our rankings have come up from 40 to 26 on our main term, and similarly for other terms, even though we still have duplicate content. We were originally at 19 before a big drop in November/December.
How likely is it that this list of 60 articles (out of 200 pages) has caused, or will cause, a major problem with past or future Panda updates? 17 of the 60 are written by us, a few were written for us, and several more show up as only ours when you type the title into Google surrounded by quotes.
What would you suggest I let the owners know? Do you see further risk in future Panda updates?
My thoughts are to add rel=author to each of our own articles, noindex the duplicates between our 3 sites (we have 3 sites that share a few articles), and noindex the remaining articles. I think that the drop in traffic will be outweighed by the reduced risk of current or future ranking drops.
However, it's not my decision, your thoughts?
-
I don't know. Everything that I have done is an experiment.
If you are really scared, delete... if you have some tolerance for uncertainty, then play around with noindex or canonical. I deleted from a really important site... and used canonical where the ranking loss was small and the risk was not jugular.
-
Hi EGOL,
When is no-indexing enough and when would you suggest deletion?
-
Can we no-index all the duplicate stuff? Or is some deletion necessary?
On one of my sites I deleted a lot and applied noindex,follow to everything else that was a duplicate. We saw rankings recover in about a month.
On another site I had a lot of .pdf documents that were used to control printing of graphics. We used rel=canonical on them. That works very, very slowly to remove them from the index. We are seeing slow recovery on that site.
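Since a PDF has no head section for a canonical tag, rel=canonical for PDFs has to be sent as an HTTP Link header. On Apache with mod_headers that can be sketched like this, with hypothetical filenames standing in for the real ones:

```apacheconf
# Point the printable PDF at its HTML counterpart via an HTTP header
<Files "graphics-guide.pdf">
  Header set Link "<http://www.example.com/graphics-guide.html>; rel=\"canonical\""
</Files>
```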
If I take the first two sentences of an article and type them into Google, and someone shows up above us, do we need to noindex that article?
If the article belongs to someone else, then I would noindex or delete it. (Just saying what I would do if it was on my site.) If it was my content, I would set up a Google+ profile and use rel=author and rel=me to attribute it to a verified author.
-
Perhaps you could add a link to the original source on some of these where you have permission. This should send a signal to Google that you are showing the article on your site for the convenience of your users, but that it comes from a different source.
-
Can we no-index all the duplicate stuff? Is that enough to save our arse? Or is some deletion necessary?
I assume that if we are not first in Google for the content and title of an article, it is a potential duplicate content problem, correct? For example, if I take the first two sentences of an article and type them into Google, and someone shows up above us, do we need to noindex that article?
Any advice is appreciated. You're one of the best EGOL.
-
We have permission to post every article, but I don't know if Google knows that.
Google probably does not know and certainly does not care. If you have duplicate content on your site, you are a potential target.
What type of link building have you been doing? You might have been hit by the over-optimization penalty.
I was republishing some third-party content on a couple of my sites. I deleted most of it and noindexed the rest. Cut off your foot to save your arse.