Our Panda Content Audit Process
-
We've put together this process over the past year, and it has shown success with sites that appear to have been hit by Panda.
The idea was to create a process that gives our clients an understanding of the problem at hand, along with metrics we can use to explain how recovery is going.
We'd love to hear your opinion, or about any different or similar strategy you use.
-
We updated this today to include a new type of content to look out for...
Broken Content (Extremely Dangerous) – Broken Content refers to pages that are broken in some way. Ask yourself: if a visitor arrived at your page, would they notice something was wrong? This typically includes broken links (internal and external), broken images, embedded content that no longer works, and improper use of HTML or CSS.
We added this after seeing that Google typically filters out or treats "broken content" unfavorably. A rough way to check for it is sketched below.
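For reference, here is a minimal sketch of how a broken-content check might be automated with Python, assuming the requests and beautifulsoup4 packages; the page URL is a placeholder, not one of our client sites.

```python
# Sketch: flag broken links and images on a single page.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

PAGE = "https://example.com/some-page/"  # placeholder URL

def check_broken_content(page_url):
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    # Collect every link target and image source on the page.
    targets = [urljoin(page_url, a["href"]) for a in soup.find_all("a", href=True)]
    targets += [urljoin(page_url, img["src"]) for img in soup.find_all("img", src=True)]

    broken = []
    for url in targets:
        if not url.startswith(("http://", "https://")):
            continue  # skip mailto:, javascript:, fragment-only links, etc.
        try:
            resp = requests.head(url, allow_redirects=True, timeout=10)
            if resp.status_code >= 400:
                broken.append((url, resp.status_code))
        except requests.RequestException as exc:
            broken.append((url, str(exc)))
    return broken

if __name__ == "__main__":
    for url, status in check_broken_content(PAGE):
        print(f"BROKEN: {url} ({status})")
```

A HEAD request keeps the check fast; some servers reject HEAD, so falling back to GET on errors may reduce false positives.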
Let us know what you think!
-
Good idea. I was considering putting this one on YouMoz, but decided it was better to get it up ASAP for our clients' sake.
-
Very good process, and well explained in the blog. You should do a YouMoz post on this, showcasing case studies and the results you're seeing with clients who follow this process.
Related Questions
-
Unsolved Duplicate Content
We have multiple collections being flagged as duplicate content, but I can't find where these duplications are coming from. The duplicate content has no introductory text and no meta description. Please see the examples. This is the correct collection page: https://www.ardmoor.co.uk/collections/deerhunter and this is the incorrect collection page: https://www.ardmoor.co.uk/collections/vendors. How do I stop this incorrect page from showing?
Technical SEO | Caroline_Ardmoor
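A minimal sketch of one way to narrow down where the duplication comes from: fetch both collection URLs and compare their titles, meta descriptions, and canonical tags. It assumes the requests and beautifulsoup4 packages; only the two URLs are taken from the question.

```python
# Sketch: compare two pages that are being flagged as duplicates of each other.
import requests
from bs4 import BeautifulSoup

URLS = [
    "https://www.ardmoor.co.uk/collections/deerhunter",  # the correct collection page
    "https://www.ardmoor.co.uk/collections/vendors",     # the page flagged as a duplicate
]

def summarize(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    desc = soup.find("meta", attrs={"name": "description"})
    canonical = soup.find("link", rel="canonical")
    return {
        "url": url,
        "title": soup.title.get_text(strip=True) if soup.title else "",
        "meta description": desc["content"] if desc and desc.has_attr("content") else "",
        "canonical": canonical["href"] if canonical and canonical.has_attr("href") else "",
    }

for url in URLS:
    for key, value in summarize(url).items():
        print(f"{key:>16}: {value}")
    print()
```

If the vendors page turns out to be a near-copy with no canonical of its own, pointing its canonical at the preferred collection (or noindexing it) is the usual next step.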
Categories VS Tag Duplicate Content
Hello Moz community, I have a question about categories and tags. Our customer www.elshow.pe just had a redesign of its website. We kept the same categories as before; the only change was that two sub-categories were added (these sub-categories were popular tags before). So now I have two URLs covering the same content.
The first is the URL of the sub-category: www.elshow.pe/realitys/combate/
The second is the URL generated by the tag "combate": www.elshow.pe/noticias/combate/
I have the same situation with the second sub-category, "Esto es guerra": www.elshow.pe/realitys/esto-es-guerra/ and www.elshow.pe/noticias/esto-es-guerra/
The problem: when I search the keyword "combate" in my country (Perú), the URL that ranks is the tag URL, on page 1. But when I search for "esto es guerra", the URL that ranks is the sub-category, on page 2. I also checked both links in OSE, and the sub-categories perform better than the tags. So what do you guys recommend here? A 301 redirect? Canonicals? Any comment is welcome. Thanks a lot for your time. Italo
Technical SEO | neoconsulting
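A minimal sketch, assuming the canonical route the poster mentions: check whether each tag URL already declares the matching sub-category URL as its canonical. The URL pairs come from the question (with an assumed https scheme); the packages are requests and beautifulsoup4.

```python
# Sketch: verify that each tag URL canonicalizes to its preferred sub-category URL.
import requests
from bs4 import BeautifulSoup

PAIRS = {
    # tag URL -> preferred sub-category URL
    "https://www.elshow.pe/noticias/combate/": "https://www.elshow.pe/realitys/combate/",
    "https://www.elshow.pe/noticias/esto-es-guerra/": "https://www.elshow.pe/realitys/esto-es-guerra/",
}

def canonical_of(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    link = soup.find("link", rel="canonical")
    return link["href"] if link and link.has_attr("href") else None

for tag_url, preferred in PAIRS.items():
    current = canonical_of(tag_url)
    verdict = "OK" if current == preferred else f"consider canonical -> {preferred}"
    print(f"{tag_url}\n  current canonical: {current}\n  {verdict}\n")
```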
How can something be duplicate content of itself?
Just got the new crawl report, and I have a recurring issue that comes back around every month or so: a bunch of pages are reported as duplicate content of themselves. Literally the same URL: http://awesomewidgetworld.com/promotions.shtml is reporting that http://awesomewidgetworld.com/promotions.shtml is both a duplicate title and duplicate content. Well, I would hope so! It's the same URL! Is this a crawl error? Is it a site error? Has anyone seen this before? Do I need to give more information? P.S. awesomewidgetworld is not the actual site name.
Technical SEO | BetAmerica
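One common explanation is that the crawler reached the same page through two URLs that are displayed identically in the report, for example with and without a session parameter, or via a www/non-www variant. A minimal sketch of that check, with an illustrative URL list rather than real crawl data:

```python
# Sketch: find crawled URLs that collapse to the same page once normalized,
# which can make a report look like a URL is a duplicate of itself.
from urllib.parse import urlsplit, urlunsplit

crawled = [  # illustrative crawl export, not real data
    "http://awesomewidgetworld.com/promotions.shtml",
    "http://awesomewidgetworld.com/promotions.shtml?sessionid=123",
    "http://www.awesomewidgetworld.com/Promotions.shtml",
]

def normalize(url):
    parts = urlsplit(url)
    host = parts.netloc.lower().removeprefix("www.")
    # Drop query string and fragment, lowercase host and path.
    return urlunsplit((parts.scheme, host, parts.path.lower(), "", ""))

groups = {}
for url in crawled:
    groups.setdefault(normalize(url), []).append(url)

for page, variants in groups.items():
    if len(variants) > 1:
        print(f"{page} was reached via {len(variants)} URL variants:")
        for v in variants:
            print(f"  {v}")
```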
Google inconsistent in display of meta content vs page content?
Our e-commerce site includes more than 250 brand pages: a large image, some fluffy text, maybe a video, links to categories for that brand, etc. In many cases, Google publishes our page title and description in its search results. However, in some cases Google instead publishes our H1 and the aforementioned fluffy page content. We want our page content to read well, be descriptive of the brand, and be appropriate for the audience. We want our meta titles and descriptions to be brief and likely to attract CTR from qualified shoppers. I'm finding this difficult to manage when Google pulls from two different areas inconsistently. So my question... is there a way to ensure Google only uses our title/description for our listings?
Technical SEO | websurfer
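Google does not guarantee it will use the supplied title and description, but missing, very short, or over-long descriptions tend to be rewritten more often, so auditing them is a reasonable first step. A minimal sketch, with placeholder URLs and assumed length thresholds:

```python
# Sketch: audit meta descriptions on brand pages and flag likely rewrite candidates.
import requests
from bs4 import BeautifulSoup

BRAND_PAGES = [
    "https://example.com/brands/acme/",    # placeholder URLs
    "https://example.com/brands/globex/",
]

def audit(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    tag = soup.find("meta", attrs={"name": "description"})
    desc = tag["content"].strip() if tag and tag.has_attr("content") else ""
    issues = []
    if not desc:
        issues.append("missing description")
    elif len(desc) < 70:
        issues.append(f"very short ({len(desc)} chars)")
    elif len(desc) > 160:
        issues.append(f"likely truncated ({len(desc)} chars)")
    return issues

for url in BRAND_PAGES:
    print(url, "->", audit(url) or "looks reasonable")
```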
Duplicate content, how to solve?
I have about 400 duplicate content errors on my SEOmoz dashboard, but I have no idea how to solve them. There are two main scenarios of duplication on my site.
Scenario 1: http://www.theprinterdepo.com/catalogsearch/advanced/result/?name=64MB+SDRAM+DIMM+MEMORY+MODULE&sku=&price%5Bfrom%5D=&price%5Bto%5D=&category=
Three products with the same title but different product models, and as you can see, they have the same price as well. Some printers use a different memory module, so I can't just delete two of the products.
Scenario 2 (toners):
http://www.theprinterdepo.com/brother-high-capacity-black-toner-cartridge-compatible-73
http://www.theprinterdepo.com/brother-high-capacity-black-toner-cartridge-compatible-75
In this scenario, the products have different titles but the same price. Again, the two products are different. Thank you
Technical SEO | levalencia1
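A minimal sketch of one way to surface the scenario-1 clusters: group product URLs by their page title and list every title shared by more than one page. The two toner URLs come from the question; the rest of the list would come from a crawl or sitemap.

```python
# Sketch: group product pages by <title> to find duplicate-title clusters.
import requests
from bs4 import BeautifulSoup
from collections import defaultdict

PRODUCT_URLS = [
    "http://www.theprinterdepo.com/brother-high-capacity-black-toner-cartridge-compatible-73",
    "http://www.theprinterdepo.com/brother-high-capacity-black-toner-cartridge-compatible-75",
    # ...the rest of the product URLs from a crawl or sitemap
]

by_title = defaultdict(list)
for url in PRODUCT_URLS:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else "(no title)"
    by_title[title].append(url)

for title, urls in by_title.items():
    if len(urls) > 1:
        print(f"{len(urls)} pages share the title {title!r}:")
        for u in urls:
            print(f"  {u}")
```

For the parameterised advanced-search URLs in scenario 1, keeping them out of the index (rather than deleting products) is typically the goal, since the products themselves are legitimately different.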
Familiar with the malware reinclusion process?
One of our sites was haXX0red, and at the moment I'm thinking it was a non-updated paid-for WP plugin using the old version of TimThumb. While not important to my question, the hack included .htaccess files in all the /uploads/ folders to redirect to a site (tonycar dot com), which I assume installed some sort of malware or spyware. I changed all FTP and admin logins, updated the TimThumb files, and deleted all the .htaccess files; for added measure I've currently made the upload folders read-only. I've requested a review through Webmaster Tools, and the image that WMT claimed was an issue is no longer flagged. That is to say, if I clicked on the malware warning in WMT, it told me imagex.jpg was a problem, and now it doesn't tell me anything is an issue, though the malware warning still persists. As I no longer have any indication as to what (if anything) is wrong, I tried going through some contacts at AdWords, to no avail, though they have said there's a note saying there's no malware currently on the site (I'm hoping that's by them and not just my reinclusion request). Assuming the almighty G is now satisfied there's no malware on the site (or being processed by the site), does anyone have any idea how to get rid of the warning? Alternatively, if the warning is accurate, how can I find out what's being affected?
Technical SEO | StalkerB
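A minimal sketch of a follow-up check after that kind of cleanup: walk the uploads directory and flag any .htaccess or PHP files, since a media folder normally shouldn't contain either. The path is a placeholder.

```python
# Sketch: scan a WordPress uploads directory for files that shouldn't be there.
import os

UPLOADS_DIR = "/var/www/site/wp-content/uploads"   # placeholder path
SUSPICIOUS_NAMES = {".htaccess"}
SUSPICIOUS_EXTENSIONS = {".php", ".php5", ".phtml"}

for root, _dirs, files in os.walk(UPLOADS_DIR):
    for name in files:
        ext = os.path.splitext(name)[1].lower()
        if name in SUSPICIOUS_NAMES or ext in SUSPICIOUS_EXTENSIONS:
            path = os.path.join(root, name)
            print(f"suspicious: {path} ({os.path.getsize(path)} bytes)")
```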
Content Delivery Network
Anyone have a good reference for implementing a content delivery network? Any SEO pitfalls with using a CDN (brief research seems to indicate no problems)? I seem to recall that SEOmoz was using Amazon Web Services (AWS) for CDN. Is that still the case? All CDN & AWS experiences, advice, references welcomed!
Technical SEO | Gyi
Complex duplicate content question
We run a network of three local websites covering three places in close proximity. Each site has a lot of unique content (mainly news), but there is a business directory that is shared across all three sites. My plan is that the search engines only index the businesses in the directory that are actually located in the place each site is focused on, i.e. listing pages for businesses in Alderley Edge are only indexed on alderleyedge.com and businesses in Prestbury only get indexed on prestbury.com, but all businesses have a listing page on each site. What would be the most effective way to do this? I have been using rel canonical, but Google does not always seem to honour this. Would using meta noindex tags where appropriate be the way to go, or would changing the URL structure to include the place name and using robots.txt be a better option? As an aside, my current URL structure is along the lines of: http://dev.alderleyedge.com/directory/listing/138/the-grill-on-the-edge Would changing this have any SEO benefit? Thanks Martin
Technical SEO | mreeves
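A minimal sketch of the meta noindex route, assuming the shared directory data includes each business's town: render the listing page as indexable only on the site whose town matches. Only "The Grill on the Edge" and the two town names come from the question; the records and field names are illustrative.

```python
# Sketch: emit a robots meta value per listing so a business is only
# indexable on the site that covers its own town.
SITE_TOWN = "Alderley Edge"   # the town this particular site covers

listings = [  # illustrative records from the shared directory database
    {"name": "The Grill on the Edge", "town": "Alderley Edge"},
    {"name": "Example Prestbury Cafe", "town": "Prestbury"},
]

def robots_meta(listing, site_town=SITE_TOWN):
    """Robots meta value to render on this site's listing page."""
    if listing["town"].lower() == site_town.lower():
        return "index,follow"
    return "noindex,follow"   # keep the page for visitors, keep it out of the index

for biz in listings:
    print(f'{biz["name"]:<24} <meta name="robots" content="{robots_meta(biz)}">')
```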