Panda 2.5
-
I'm sure we have all read about the latest round of Google's algorithm changes, also known as the "Panda 2.5" update. This latest update seems to have hit some pretty large press release sites, including PR Newswire and Businesswire (both of which have high PageRank and domain authority, making them a great tool for SEOs when it comes to inbound links).
Ultimately this update has directly affected those sites' traffic, keyword rankings, and number of pages indexed in Google. But what will this do to our smaller sites that benefit from these strong links? Will these Panda updates continue to target such content farms and lower their domain authority? Will that carry over and affect the domain authority of our sites?
What are your thoughts? For those of us who use these services, should we re-evaluate our process?
I look forward to a great discussion.
Regards - Kyle
-
Oh, if only I felt like agreeing. Understand, it has been a challenging day. I have one client site surpassed by two sites with a PA and DA of 1: no links, on-page worse than my client's, and, arguably, we have better content.
On top of that, I have another client site surpassed by clowns with Chinese and Australian links to nowhere, etc. The others are all junk: literally 40 to 60 linking root domains, with maybe five even close to unpaid, non-reciprocal, and at least within 90 degrees of the site content (the others are somewhere beyond 180 degrees).
Yes, Google occasionally has a moment and a JC Penney feels the sting. On the whole, I see too many who do not, and the rankings in anything competitive are replete with BS content from BS sites stuffed to the gills with keywords and linked to Bangladeshi laundromats. So, why the rant? I was getting ready to suggest PR Newswire to my clients as a counter-move after seeing so many competing sites simply run the same content as their homepage over and over through our RSS feeds.
Yes, both Ryan and Justin are correct; I just wish Mr. Google would take some uppers and get about the business of cleaning out the junkers.
Don't worry, I is still smilin' cause we are smarter than them!
-
What are your thoughts? For those of us who use these services, should we re-evaluate our process?
Yes, on an ongoing basis.
After every Panda update it is important to quickly assess what changes were made (i.e. who was hit) and how these changes affect our clients.
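A quick way to make that first pass is to compare organic traffic per landing page for a window before and after the update date. Below is a minimal sketch of that comparison, assuming you have exported daily organic landing-page traffic to a CSV; the file name, column names, and update date are placeholder assumptions, not tied to any particular analytics product.

```python
# Rough before/after check of an algorithm update's impact, assuming a CSV
# export of daily organic traffic with columns: date, landing_page, sessions.
# The file name, column names, and update date are placeholders.
import csv
from collections import defaultdict
from datetime import date, timedelta

UPDATE_DATE = date(2011, 9, 28)   # approximate Panda 2.5 rollout; adjust as needed
WINDOW = timedelta(days=14)       # compare two weeks before vs. two weeks after

before = defaultdict(int)
after = defaultdict(int)

with open("organic_landing_pages.csv", newline="") as f:
    for row in csv.DictReader(f):
        day = date.fromisoformat(row["date"])
        page = row["landing_page"]
        if UPDATE_DATE - WINDOW <= day < UPDATE_DATE:
            before[page] += int(row["sessions"])
        elif UPDATE_DATE <= day < UPDATE_DATE + WINDOW:
            after[page] += int(row["sessions"])

# Pages that lost the most organic traffic after the update appear first.
changes = sorted((after[p] - before[p], p) for p in before)
for delta, page in changes[:20]:
    print(f"{delta:+6d}  {page}  ({before[page]} -> {after[page]})")
```

The same idea works with keyword ranking exports: hold the window fixed and look at what moved.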
In short, Google has made it clear what they want to see in terms of links: links that are completely independent of the website they point to. They do not want to see any form of influence over those links.
-
Press releases are content usually under the complete control of the company that issued the release. A link from one is not an unbiased link.
-
Articles published on various content farm sites are not (usually) independent links.
-
Links from various forms of link networks, directories, etc. all fall under this category as well.
While the above links do offer some value, it is greatly diminished compared to the value of an authentic link.
The question is, how do we adjust? My suggestion is to focus more than ever on on-page SEO. Work with clients to ensure their websites are more streamlined, more focused, more usable, liked, trusted, and helpful (and 100 other adjectives) than ever before. There is one word that encompasses everything else: compelling.
In the world of sales there are products and services that need to be sold, and there are products and services that sell themselves. Is the content on your site what you want to write about, or does it cover topics readers want to hear about? Do you have known detractors from site quality (e.g., ads, keyword stuffing, etc.)?
TL;DR: Treat your website like it is the only site you own. Make the site the most helpful and compelling resource on the topic you cover.
Once the above is complete, you only need to let the world know about your site and they will want to link to it. Keep pounding away at on-page until you feel you have attained a level of perfection. Then, ask for and openly accept feedback. Ask other SEOs and, more importantly, ask your visitors, whether via surveys or A/B testing. "Build it and they will come."
-
-
I feel Google no longer wants people to build links in the manner of the great bringing up the meek, but rather wants the meek to distinguish themselves with their content and uniqueness so that people generate natural traffic. I think Panda is more about building an organic operating system than a silicon hive mind that ranks people based on their involvement with already established sites. This is just my opinion, but I feel it is for the better, because it actually makes it easier to optimize a site if one takes the initiative and has the discipline to do research and hop on social and economic trends with great content, which will win the organic users. Then again I could be entirely wrong, but I like pandas and Google Panda ^.^
Related Questions
-
Hi. Has anyone seen a drop in their PA recently? We have over 40 clients, 80% have dropped, and it has been like this for the past 2 months. I wanted to check whether other websites are also experiencing this drop or if it is just us.
I'm specifically asking about drops that happened in the last 2 months, because before that we had a normal trend: some websites would go up in their PA rankings and some would drop. But seeing 80% of our clients drop like this is just weird. I just wanted to see if other market leaders are having the same issue so I can stop attributing it to something X-Files-ish! Our clients are all .EDUs, if anyone was wondering. Also, the maximum drop has been 4 points in a month.
Intermediate & Advanced SEO | AP_Search
-
Huge .htaccess with old 301 redirects. Is it safe to delete all redirects with no traffic in the last 2 months?
We have a huge .htaccess file, over several MB, which seems to be the cause of slow server response times. There are lots of 301 redirects related to a site migration 9 months ago, where all old URLs were redirected to new URLs, plus lots of 301 redirects from URL changes accumulated over the last 15 years. Is it safe to delete all 301 redirects which did not receive any traffic in the last 2 months? Or would you apply other criteria for identifying those 301s that can be safely deleted? Is there a way to get, in Google Analytics or Webmaster Tools, all 301s that received traffic in the last 2 months, or any other easy way to identify them, apart from checking the Apache log files?
Intermediate & Advanced SEO | lcourse
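For the log-file route mentioned in the question above, a rough sketch looks like the following. It assumes you have pulled the redirect source paths out of the .htaccess file into a plain text file and have an Apache combined-format access log covering the last 2 months; both file names are hypothetical, and bot traffic will show up in the counts as well, so treat the result as a starting point rather than a final delete list.

```python
# Sketch: find redirect source paths that received no hits in a recent log file.
# Assumes "redirect_sources.txt" holds one source path per line (pulled from the
# .htaccess Redirect/RewriteRule lines) and "access.log" is a combined-format
# Apache log covering the period you care about (e.g. the last 2 months).
import re

with open("redirect_sources.txt") as f:
    sources = {line.strip() for line in f if line.strip()}

hit = set()
request_re = re.compile(r'"[A-Z]+ (\S+) HTTP/[\d.]+"')

with open("access.log") as log:
    for line in log:
        m = request_re.search(line)
        if m:
            # Ignore query strings so /old-page?utm=x still counts as /old-page.
            path = m.group(1).split("?")[0]
            if path in sources:
                hit.add(path)

unused = sorted(sources - hit)
print(f"{len(unused)} of {len(sources)} redirect sources had no hits:")
for path in unused:
    print(path)
```
-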
2 pages optimised for same keyword... what should I do?
Hi, I have two pages appearing in positions 11 and 12 for the keyword "80 btl mortgage". These are: https://www.commercialtrust.co.uk/btl/landlord-advice/mortgages/btl-mortgage-80-ltv/ https://www.commercialtrust.co.uk/btl/product-types/80-buy-to-let-mortgages/ Both pages are good and provide useful information, and I would not wish to remove either of them. However, I am concerned that the reason neither page is on page 1 is that the keyword targeted on both pages is essentially the same. Should I reoptimise one of them for other variations of the 80 BTL mortgage keyword (e.g. 80% LTV buy-to-let mortgage, 80 buy-to-let mortgage, etc.)? Or is there another solution I haven't yet thought of? I welcome your insights! Thanks! Amelia
Intermediate & Advanced SEO | CommT
-
Panda Updates - robots.txt or noindex?
Hi, I have a site that I believe has been impacted by the recent Panda updates. Assuming that Google has crawled and indexed several thousand pages that are essentially the same, and the site has now passed the threshold to be picked out by the Panda update, what is the best way to proceed? Is it enough to block the pages from being crawled in the future using robots.txt, or would I need to remove the pages from the index using the meta noindex tag? Of course, if I block the URLs with robots.txt then Googlebot won't be able to access the pages in order to see the noindex tag. Does anyone have previous experience of doing something similar? Thanks very much.
Intermediate & Advanced SEO | ianmcintosh
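The robots.txt/noindex interaction described above can be audited in bulk before deciding which way to go. Here is a minimal sketch, assuming a hand-picked list of the duplicate URLs (the example.com URLs are placeholders): it checks whether each URL is blocked for Googlebot by robots.txt and whether the page carries a noindex directive (X-Robots-Tag header or robots meta tag), and flags the combination where a noindex would be invisible to the crawler.

```python
# Sketch: flag URLs that carry a noindex directive but are also blocked by
# robots.txt, so Googlebot can never actually see the noindex.
# The URL list is a placeholder.
import re
import urllib.request
import urllib.robotparser
from urllib.parse import urlparse

urls = [
    "https://www.example.com/duplicate-page-1/",
    "https://www.example.com/duplicate-page-2/",
]

# Crude check for <meta name="robots" content="... noindex ...">.
meta_noindex_re = re.compile(
    r"""<meta[^>]+name=["']robots["'][^>]+content=["'][^"']*noindex""",
    re.IGNORECASE,
)

robots_cache = {}

for url in urls:
    parts = urlparse(url)
    root = f"{parts.scheme}://{parts.netloc}"
    if root not in robots_cache:
        rp = urllib.robotparser.RobotFileParser(root + "/robots.txt")
        rp.read()
        robots_cache[root] = rp
    blocked = not robots_cache[root].can_fetch("Googlebot", url)

    with urllib.request.urlopen(url) as resp:
        header_noindex = "noindex" in (resp.headers.get("X-Robots-Tag") or "")
        body_noindex = bool(meta_noindex_re.search(resp.read().decode("utf-8", "ignore")))

    noindex = header_noindex or body_noindex
    status = "CONFLICT: noindex but blocked" if (noindex and blocked) else "ok"
    print(f"{url}  blocked={blocked}  noindex={noindex}  -> {status}")
```
-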
HTML 5 sites, segmentation and Meta data?
Hello Mozers, I am currently building an HTML5 site and have run into a couple of issues. While implementing segmentation for each of my main menu items, I am able to plug in meta data for only one segment (or the page); I am unable to insert meta data for each of the segments. For example: I have (main menu) Services ----> Submenu (Teaching, Upgrading, Dancing). I can implement meta data for Services but not for Teaching, Upgrading, and Dancing, as they are segments on the same page. What's the best way to get around this?
Intermediate & Advanced SEO | waspmobile
-
SEOmoz is only crawling 2 pages out of my website
I have checked in Google Webmaster Tools and they are crawling around 118 pages of my website, store.itpreneurs.com, but SEOmoz is only crawling 2 pages. Can someone help me? Thanks, Diogo
Intermediate & Advanced SEO | jslusser
-
Need some urgent Panda advice. Open discussion about recovering from the Panda algorithm.
I have a site that has been affected by Panda, and I think I have finally found the problem. When I created this site in 2006, I bought content without checking it. Recently, when I went through the site, I found out that this content had many duplicates around the web. Not 100% exact, but close to it. The first thing I did was ask my best writer to rewrite these topics, as they are a must on my site. She is very experienced and will make the categories and subpages outstanding. The second thing I did was put a NOINDEX, FOLLOW robots meta tag in place for the pages I determined to be bad. They haven't been de-indexed yet. Another thing I recently did was separate other languages and move them to other domains (with 301s redirecting the old locations to the new). This means that the site now has a /en/ directory in the URL which is no longer needed. With this in mind, I was thinking of relocating the NEW content and 301ing the old (to preserve the juice for a while). For example: http://www.mysite.com/en/this-is-a-pandalized-page/ 301 to http://www.mysite.com/this-is-the-rewritten-page/ The benefits of doing this are: fewer directories in the URL, getting rid of pages that are possibly causing trouble, and getting fresh pages added to the site. Now, the advice I am looking for is basically this: do you agree with the above, or not? If you don't, please be so kind as to include a reason with your answer. If you do, and have any additional information, or would like to discuss, please go ahead 🙂 Thanks, Giorgio PS: Is it proven that Panda is now a continuously running update? Or is it still executed periodically?
Intermediate & Advanced SEO | VisualSense
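If the /en/ to root move described above goes ahead, it is worth verifying in bulk that every old URL returns a single 301 to the intended new location, both when the rules are deployed and again after any later .htaccess cleanup. A minimal sketch of that check is below, using the example URL pair from the question; the domain and paths are placeholders.

```python
# Sketch: verify that old /en/ URLs return a 301 pointing at the expected new
# location. The domain and paths are placeholders taken from the example above.
# http.client does not follow redirects, so we see the first response directly.
# (For https URLs, use http.client.HTTPSConnection instead.)
import http.client
from urllib.parse import urlparse

checks = [
    # (old URL, expected new URL)
    ("http://www.mysite.com/en/this-is-a-pandalized-page/",
     "http://www.mysite.com/this-is-the-rewritten-page/"),
]

for old_url, expected in checks:
    parts = urlparse(old_url)
    conn = http.client.HTTPConnection(parts.netloc, timeout=10)
    conn.request("HEAD", parts.path or "/")
    resp = conn.getresponse()
    location = resp.getheader("Location")
    ok = resp.status == 301 and location == expected
    print(f"{'OK ' if ok else 'FAIL'} {old_url} -> {resp.status} {location}")
    conn.close()
```
-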
Examples of sites other than Hubpages that have used subdomains to recover from Panda?
Everyone knows subdomains worked for Hubpages to recover from Panda. Does anyone know of other examples of sites that have recovered from Panda using subdomains?
Intermediate & Advanced SEO | nicole.healthline