Did our shared articles cause a Panda hit to our high-quality site?
-
Hello,
We are a quality site hit by Panda
Our article collection:
http://www.nlpca(dot)com/DCweb/NLP_Articles.html
is partially articles written by the site owners and partially articles that appear elsewhere on the web.
We have permission to post every article, but I don't know if Google knows that.
Could this be why we were hit by Panda? And if so, what do we do? We've dropped way down in rank but have worked our way halfway back up.
Two of our main keywords are:
NLP
NLP Training
Thanks!
-
You have some valid points to consider... things seem to be improving and the articles that you might cut do pull in some traffic.
I can't tell you how to make your decision but here is how I made mine..
I had hundreds of republished articles but a lot more that I had written myself. Deleting lots of republished articles would cut my traffic and my income. Noindexing them would do the same. However, although those were serious losses, they were small in comparison to the rest of the content on my site. So, knowing that Google does not like duplicate content, I got rid of them. There is still lots of great content on my site, visitors still find plenty to read, and I know which of the things I cut are worth replacing with customized versions authored for my own site.
The upside... My site is more compact but still has thousands of content pages, and the content that remains should be a lot stronger. After making the cuts, my rankings, income and traffic increased. Not quite to previous levels, but back to nice numbers.
I have reduced risk and am pleased with that. Everything that I cut was redirected to similar content. The most valuable of what was cut will be replaced with custom content with 301 redirects from the old content.
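The 301 redirects described above can be set up in a few ways; a minimal sketch using Apache's mod_alias in an .htaccess file, with placeholder paths (not the actual URLs from this thread) standing in for a removed republished article and its closest surviving page:

```apache
# Permanently redirect a removed republished article to the
# closest equivalent page that remains on the site.
# Both paths below are placeholders for illustration only.
Redirect 301 /articles/republished-piece.html /articles/original-guide.html
```

A server-side 301 like this passes most of the old page's link equity to the target, which is why it is preferred over simply deleting and serving a 404 when pruning content.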
============================
How likely is it that this list of 60 articles (out of 200 pages) is causing, or will cause, a major problem with past or future Panda updates? 17 of the 60 are by us, a few were written for us, and several more show up as only ours when you type the title into Google surrounded by quotes.
What from this is unique? Definitely keep that. Keep what is not struggling in Google. Keep what is essential to your site, but replace it with better content that you create yourself.
Do you see further risk in future Panda updates?
Yep... that's why I cut off my foot.
My thoughts are to rel=author each of our own articles,
YES... In the past I wanted all of my content to be anonymously written. I have changed my mind on that and used rel=author on the best stuff.
no-index the duplicates between our 3 sites (We have 3 sites that share a few articles) and no-index the remaining articles.
heh... Here I would be chopping off two of those sites and merging them into one. I would have done that years ago, before Panda was ever heard of.
I think that the drop in traffic will be outweighed by the reduced risk of current or future ranking drops.
I agree.
-
Hi EGOL,
We are getting a lot of traffic from some of these articles, so the site owners are not sure they want to no-index them, just in case that's not what caused the problem. Our rankings have come up from 40 to 26 on our main term, and similarly for other terms, even though we still have duplicate content. We were originally at 19 before a big drop in November/December.
How likely is it that this list of 60 articles (out of 200 pages) is causing, or will cause, a major problem with past or future Panda updates? 17 of the 60 are by us, a few were written for us, and several more show up as only ours when you type the title into Google surrounded by quotes.
What would you suggest I let the owners know? Do you see further risk in future Panda updates?
My thoughts are to rel=author each of our own articles, no-index the duplicates between our 3 sites (we have 3 sites that share a few articles), and no-index the remaining articles. I think that the drop in traffic will be outweighed by the reduced risk of current or future ranking drops.
However, it's not my decision, your thoughts?
-
I don't know. Everything that I have done is an experiment.
If you are really scared, delete... if you have some tolerance for uncertainty, then play around with noindex or canonical. I deleted from a really important site... and used canonical where the ranking loss was small and the risk was not jugular.
-
Hi EGOL,
When is no-indexing enough and when would you suggest deletion?
-
Can we no-index all the duplicate stuff? Or is some deletion necessary?
On one of my sites I deleted a lot and applied noindex, follow to everything else that was a duplicate. We saw rankings recover in about a month.
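For reference, the noindex-but-follow treatment described here is a single meta tag in the head of each duplicate page (shown with placeholder markup):

```html
<!-- Tells search engines not to index this page,
     but still to follow (and pass value through) its links. -->
<meta name="robots" content="noindex, follow">
```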
On another site I had a lot of .pdf documents that were used to control printing of graphics. We used rel=canonical on them. That works very, very slowly to remove them from the index. We are seeing slow recovery on that site.
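Since a .pdf can't carry a link tag in its markup, rel=canonical for documents like these is typically sent as an HTTP response header. A sketch in Apache config, assuming mod_headers is enabled and using placeholder filenames:

```apache
# Point the printable PDF at its HTML equivalent via an HTTP header.
# "brochure.pdf" and the target URL are placeholders for illustration.
<Files "brochure.pdf">
  Header add Link '<http://www.example.com/brochure.html>; rel="canonical"'
</Files>
```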
If I take the first two sentences of an article and type them into Google, and someone shows up above us, do we need to no-index that article?
If the article belongs to someone else, then I would noindex or delete it. (Just saying what I would do if it was on my site.) If it was my content, I would set up a Google+ profile and use rel=author and rel=me to attribute the articles to a verified author.
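The authorship markup described here was a two-way link: rel=author from the article to the Google+ profile, and a link from the profile back to the site to complete verification. A minimal sketch with a placeholder profile URL:

```html
<!-- On the article page: link to the author's Google+ profile. -->
<a href="https://plus.google.com/112345678901234567890"
   rel="author">About the Author</a>
<!-- The Google+ profile then links back to the site from its
     "Contributor to" section, completing the verification loop. -->
```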
-
Perhaps you could add a link to the original source on some of these where you have permission. That should send a signal to Google that you are showing the article on your site for the convenience of users, but that it comes from a different source.
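A source-attribution note like that could be as simple as a visible link near the top or bottom of each republished article (the URL here is a placeholder):

```html
<!-- Credit the original publisher of a republished article. -->
<p class="source-credit">
  Republished with permission from
  <a href="http://www.example.com/original-article.html">Example.com</a>.
</p>
```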
-
Can we no-index all the duplicate stuff? Is that enough to save our arse? Or is some deletion necessary?
I assume that if we are not first in Google for the content and title of an article, it's a potential duplicate content problem, correct? For example, if I take the first two sentences of an article and type them into Google, and someone shows up above us, do we need to no-index that article?
Any advice is appreciated. You're one of the best EGOL.
-
We have permission to post every article, but I don't know if Google knows that.
Google probably does not know and certainly does not care. If you have duplicate content on your site you are a potential target.
What type of link-building have you been doing? You might have been hit by the over-optimization penalty.
I was republishing some third-party content on a couple of my sites. I deleted most of it and noindexed the rest. Cut off your foot to save your arse.