Posts made by john4math
-
RE: Ripoffreport.com De-Indexed?
Breaking news, I guess Ripoffreport removed themselves? http://searchengineland.com/ripoff-report-not-banned-but-removes-itself-from-googles-index-89000
-
RE: Ripoffreport.com De-Indexed?
Looks to be the case. Aaron Wall just posted about it: http://www.seobook.com/google-rips-rip-report-search-results
-
RE: Why is noindex more effective than robots.txt?
The disallow in robots.txt will prevent bots from crawling that page, but it won't prevent the page from appearing on SERPs. If a page with a lot of links to it is disallowed in robots.txt, it may still appear on SERPs. I've seen this on a few of my own pages, and Google picks a weird title for the page since it can't read the actual one.
If you put the meta noindex tag on a page, Google will actively remove that page from their search results when they re-index it. Note that the bots have to be able to crawl the page to see the tag, so don't also disallow it in robots.txt.
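For reference, here's roughly what the two look like (the path is just a placeholder). The robots.txt disallow, which blocks crawling but not indexing:

    User-agent: *
    Disallow: /private-page.html

And the meta tag, which goes in the <head> of the page you want removed:

    <meta name="robots" content="noindex">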
Here's one Webmaster Central thread I found about it.
-
RE: Getting tons of duplicate content and title errors on my asp.net shopping cart, is there way to resolve this?
Sounds like you need canonical tags on these pages, if you need those URLs to have those parameters on them (so you can't redirect them). There was just another question in the forum about putting canonical tags on aspx pages here.
If you have no idea what a canonical tag is, read these first:
-
RE: How to Add canonical tags on .ASPX pages?
The extensions of the pages won't matter, provided you're able to actually put the canonical tag itself within the <head> of the page. If you put it in the <body>, it'll be ignored.
You only need to put the canonical tag on pages that are duplicates of other pages, and you'll need to be able to specify the correct href for the tag on each page: the full URL of the page it's a duplicate of. If your setup won't let you restrict the tag to just the duplicate pages, you're still OK, as a page can rel=canonical to itself (according to Matt Cutts here). So if all the duplicate URLs and the original URL all rel=canonical to the original page, it should work. If you don't even have that level of control, you might not be able to use the canonical tag. I hope that's what you mean by "Master Page"... if you can have each Master Page rel=canonical to the right URL, it sounds like it could solve this for you.
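For illustration (the URL is made up), every duplicate, and optionally the original itself, would carry a tag like this in its <head>, pointing at the original:

    <link rel="canonical" href="http://www.example.com/product.aspx" />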
FYI, if you can 301 redirect these duplicate pages to the original page, that's the preferred method of resolving duplicate content issues.
-
RE: Mobile vs Website Duplicate Data / Meta
Yeah... if you want the mobile version of your pages to appear in the mobile SERPs, I think you have to live with these errors. If you'd prefer your regular page to appear in the mobile SERPs, and you'll redirect the user when they get there, then you could rel=canonical your mobile site pages to their corresponding www pages, which should take care of these errors.
-
RE: Should our social network put all of our member profiles in the site map?
I would definitely do this. Since you have so many profiles, you'll want one main sitemap index linking to smaller sitemaps with the profile URLs. One of the sites I work on has several million user-created activities that we put into sitemaps. We regenerate our sitemap files frequently and add the newly created activities. We include a last modified date for each activity as well, so the search bot will know if anything has changed since the last time it indexed it.
I would create a sitemap system where all of your profiles can be found; by including the last modified date, you leave it up to the searchbot to decide whether a profile has been updated and needs to be re-indexed (see the sketch below). There are a couple of other properties you could use, listed on http://www.sitemaps.org/protocol.php.
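As a rough sketch of the format (URL and date made up), each entry in the child sitemaps would look something like:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/profile/12345</loc>
        <lastmod>2011-07-20</lastmod>
      </url>
    </urlset>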
-
RE: Mobile vs Website Duplicate Data / Meta
Matt Cutts talks about it here. And here's some info from the Google Webmaster Blog. What he's recommending is to serve your mobile pages to Googlebot Mobile, and your regular pages to Googlebot.
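One common way to implement that, assuming you're on Apache and have a separate mobile subdomain (m.example.com is a placeholder), is a user-agent-based rule; this is just a sketch:

    # Send Googlebot-Mobile and common mobile browsers to the mobile version
    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} (Googlebot-Mobile|iPhone|Android) [NC]
    RewriteRule ^(.*)$ http://m.example.com/$1 [R=302,L]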
And here's a relevant Q&A question with some good answers.
I wouldn't change the title to "Mobile SiteName" unless when your site comes up in Google's Mobile Search, you want the page title to actually read "Mobile SiteName".
-
RE: Should "View All Products" be the canonical page?
I wouldn't show all products as a default page for users, as that doesn't sound like a good user experience.
Here are a few other Q&A entries from people with similar questions. They endorse some different solutions to the problem:
-
RE: Should "View All Products" be the canonical page?
It sounds like the changes are in the URL parameters and their values. Now in Google Webmaster Tools, if you go to the Site configuration > URL parameters page, you can tell Google how different parameters affect the page.
-
RE: What are the Best Practices for moving a blog from subdomain to domain/subcategory?
If you're going to do it in phases, I'd do it as follows:
1. Set up domain.com/blog, and make sure it's serving pages correctly. So now, both blog.domain.com and domain.com/blog would be serving duplicate content for the time being.
2. Set up all pages in blog.domain.com to 301 redirect to their counterparts under domain.com/blog (see the sketch below).
3. Update all links in my site pointing to blog.domain.com pages to point to their counterparts under domain.com/blog.
As long as there isn't a lot of lag between steps 1 & 2 I wouldn't worry too much about duplicate content, as there won't be links pointing to your blog under domain.com/blog, so Google likely won't even find it.
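For step 2, here's a minimal sketch of the redirect, assuming the blog runs on Apache (domain.com is a placeholder) and the rule lives in blog.domain.com's .htaccess:

    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^blog\.domain\.com$ [NC]
    # 301 every blog URL to its counterpart under domain.com/blog
    RewriteRule ^(.*)$ http://domain.com/blog/$1 [R=301,L]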
-
RE: Does a mobile site count as duplicate content?
Matt Cutts talks about it here. And here's some info from the Google Webmaster Blog. What he's recommending is to serve your mobile pages to Googlebot Mobile, and your regular pages to Googlebot.
And here's a relevant Q&A question with some good answers.
-
RE: Does This Really Annoy You
I'd be more concerned with writing quality content and getting a following as opposed to trying to post your articles on article websites with links back to your site. Get a good following on Twitter, Facebook, and an RSS feed, and promote your content that way.
Writing 8 articles is a lot in one week. You would be better off focusing on quality vs. quantity. Google is looking for amazing content to promote at the top of its SERPs. Rand said it best in this Whiteboard Friday:
Will: Good content is . . .
Rand: Mediocre at this point in terms of value.
In the post-Panda world we're living in, you really have to provide exceptional content that people will find value in, enjoy, and link to. Ask yourself, would you expect to see this article in print? (also from that Whiteboard Friday)
-
RE: Redirecting duplicate .asp pages??
ASP is dynamic, while HTML is just markup for content; an ASP page builds HTML and streams it to the browser on output. You can do much, much more with ASP than with flat HTML.
-
RE: Adding no follow links on my site
My understanding of nofollow is that you really shouldn't be using it for links to other pages on your site. There's no benefit to it. You used to be able to sculpt the flow of pagerank, but they changed how it worked a while ago so it's no longer beneficial to do this to your own links. For example, suppose page A has 6 units of pagerank to pass, and links out to the following pages:
- page B
- page C
- page D
Normally, each of the pages would get 2 units of pagerank. Suppose you nofollow the link to page D. What will happen is that page B and page C will continue to get 2 units of pagerank, and page D will get none. It didn't increase the pagerank passed to pages B and C.
So, nofollowing a link will prevent pagerank from getting passed to that page, but won't increase the pagerank passed to the other linked pages from that page, so I wouldn't do it.
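For reference, a nofollowed link just carries the rel attribute (URL made up):

    <a href="http://www.example.com/page-d" rel="nofollow">Page D</a>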
-
RE: Redirecting duplicate .asp pages??
Yeah, if you're serving duplicate content under different URLs like this, it's best if you 301 redirect them all to the same URL. So for your home page, you should redirect www.<my domain>.co.uk/index.asp to www.<my domain>.co.uk, for the same reason you want to redirect www.<my domain>.co.uk/index.html to www.<my domain>.co.uk. You should pick one URL for your contact us page as well, and redirect the duplicate versions to that. Roger may not have found that other version of your contact us page, but if it's duplicate content, go ahead and redirect the duplicate URLs to the URL you prefer for that page.
The reason you do this is to keep all the link juice you're getting for those pages consolidated in one place, so rather than having two pages with some link juice competing on the search result pages, you can have one page with all the link juice which will rank better.
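Since these are .asp pages you're probably on IIS, where you'd set this up with a URL Rewrite rule, but the idea is the same as in this Apache .htaccess sketch (placeholder domain):

    RewriteEngine On
    # Only match actual client requests for /index.asp or /index.html,
    # so the internal DirectoryIndex rewrite doesn't cause a loop
    RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\s/index\.(asp|html) [NC]
    RewriteRule ^index\.(asp|html)$ http://www.example.co.uk/ [R=301,L]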
-
RE: Branching out from .com... good idea?
Oh wow, I had totally misinterpreted that article. Thanks for clarifying!
-
RE: Branching out from .com... good idea?
Of course you can. Any domain suffix can rank (unless excluded from Google altogether). Overall, .com's tend to rank the best. Take a look at http://www.seomoz.org/article/search-ranking-factors#metrics.
If you can get an exact keyword match for a suffix like .ly, it might be worthwhile over getting a .com domain that doesn't match as well.
-
RE: Google Adwords CPC Tips and Bid Managment
I agree 100% with EGOL to look at your quality scores. If your scores are low for particular keywords, but the keywords look like they should be well-targeted, you might want to change your match type for those keywords. Google has several: broad, modified broad, phrase, and exact. If a broad match keyword isn't doing that well, you might want to change it to phrase or exact; the broad matching might be matching queries that aren't a good fit. Here's what Google says about their match types.
There are a lot of features and targeting options in AdWords that you might not be taking advantage of. Do you have sitelinks for your search campaigns? Does it make sense to show ads on Google Image search? If appropriate for your site, have you tried a remarketing campaign? Have you tried YouTube advertising? Do you run display ads with banner/rich media ads as well as text ads? Have you tried targeting the display network with topics and audiences? Are there sites and pages you always want to advertise on in the display network? Most people start with keyword-driven campaigns for search and display, but there's a lot more to it. Some of our best converting campaigns aren't keyword driven.
In my case, Enhanced CPC never really did any better than regular CPC, and conversion optimizer has consistently shown an improvement over regular CPC bidding. I've heard other people say that conversion optimizer didn't do much for them. The only way to know is to try it out and see how it goes and compare it to the history when you were CPC bidding.
-
RE: Hyperlinks under description in organic listings ...
Even better, here is a Q&A post about it: http://www.seomoz.org/q/has-anyone-found-a-way-to-get-site-links-in-the-serps
-
RE: Hyperlinks under description in organic listings ...
There's no way to force their hand, but you can do some things to make it more likely. Here is Google's help page about sitelinks. And here are some articles I came across about things you can do to help your site get sitelinks:
- http://www.seopedia.org/internet-marketing-and-seo/google-sitelinks-the-ultimate-faq/
- http://www.techwyse.com/blog/search-engine-optimization/get-google-sitelinks-working-for-you/
- http://www.hochmanconsultants.com/articles/sitelinks.shtml
- http://www.webdesign-bureau-of-mauritius.com/optimising-site-for-google-sitelinks/
-
RE: Hyperlinks under description in organic listings ...
These are sitelinks. Google finds pages on your site it thinks are appropriate for them, and then adds them. You cannot add them yourself. Once they're added, you can block them from within Google Webmaster Tools (On the Site configuration > Sitelinks page), but only Google can add them.
If you run an AdWords campaign, you can configure them yourself for when your ad appears in the top spots.
-
RE: Please take a look at the SEO of my site
Do you have a blog, a Facebook page, or a twitter account? I didn't see links to any of these things. If you're continuing to write articles, that would be a good way to get your content out and build a following. That's a good way to show that you're a credible source for NLP coaching.
Some of the content doesn't read quite naturally. For example, if you were speaking, would you ever say Boise and Idaho as much as in the bottom paragraph on the home page? It reads like you're stuffing keywords for Boise, Idaho. Some of the paragraphs in italics at the bottom of your pages read the same way for other keywords. These look like they were written solely for SEO. The keywords you've placed at the top of your home page and some other pages don't bother me that much, but they also look like keyword stuffing.
I don't like the title text on your tabs, as when I roll over a tab and there is sub-navigation, the title text appears and covers the first item in the sub-navigation so I can't read it. For example, roll over "Free 30 Minute Sample Session".
Put alt text on your header image, and any other images on your site. Alt text should describe the image. If you want to display text when a user rolls over the image, you can place title text on the image as well.
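Something like this; the text here is just an example, so describe what's actually in the image:

    <img src="header.jpg" alt="Bob Weikel NLP Coaching" title="Bob Weikel NLP Coaching" />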
Strive to keep the site up to date. Your training sessions page says "$79 if You Register by May 25".
Remove links to pages that don't exist. For example, if you click "Calendar" in the navigation, and then click the "The Wealthy Mind Information" link, you get to a 404 page.
If you have more testimonials, I always like seeing a page with lots of testimonials so I can see what people are saying. You have a couple of good ones on your home page.
Your jquery JS files aren't being found on any pages that have a subdirectory. On the home page, they're found properly at http://bobweikel.com/wp-content/themes/StudioBlue/jquery.min.js and http://bobweikel.com/wp-content/themes/StudioBlue/jquery.cross-slide.js. On your Free 30 Minute Sample Session page, it's trying to find them at http://bobweikel.com/free-sample-session-2/wp-content/themes/StudioBlue/jquery.min.js and http://bobweikel.com/free-sample-session-2/wp-content/themes/StudioBlue/jquery.cross-slide.js. It's trying to reference them in the "free-sample-session-2/" directory. You can change the references to these files from relative to absolute to fix this. For example, instead of pointing to "/wp-content/themes/StudioBlue/jquery.min.js", point to "http://bobweikel.com/wp-content/themes/StudioBlue/jquery.min.js"
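In other words, change the script references from something like the first form here to the second (the exact relative path in your theme is an assumption on my part):

    <!-- relative: breaks on pages served from a subdirectory -->
    <script src="wp-content/themes/StudioBlue/jquery.min.js"></script>
    <!-- absolute: works from any page -->
    <script src="http://bobweikel.com/wp-content/themes/StudioBlue/jquery.min.js"></script>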
Do you get a lot of newsletter signups? You might consider having a page for newsletter signups, and using that area on the side of all your pages for a box to contact you, or a box to sign up for a free 30 minute sample lesson. This might be a good A/B test to run to see which has the most traction for your site. Or you might have room for two boxes on the right there...
The best thing I did when starting out was setting up Google Webmaster Tools and Bing Webmaster Tools and poking around in those. Also, the page inspector in Chrome or the Firebug add-on in Firefox can tell you a lot about what is happening on a webpage. That's how I found the jQuery errors.
-
RE: SEO Developers
It sounds like they want you to disallow those components in your robots.txt file to keep them from getting indexed by search engines. Here's what the Google Webmaster Help says about robots.txt. If the ads are in an iframe, you can disallow the page the iframe points to. If it's a Flash file, for example, and the link is in the Flash, you can block robots from indexing any of these ads by putting all of them in their own directory and disallowing that. For ads that will get indexed (if they're in the HTML), putting a rel="nofollow" on the links is, I think, considered enough by the search engines.
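For example, if all the ad files lived in a hypothetical /ads/ directory, the robots.txt entry would be:

    User-agent: *
    Disallow: /ads/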
For page speed, there are a few free tools to help. In Chrome, you can install Page Speed; you can install the add-on in Firefox as well if you've installed Firebug first. Once it's installed, you can have it test any page on your site, and it'll give you a list of things you can do to improve performance. Another similar Firefox add-on I haven't had much experience with is YSlow.
-
RE: SEO Developers
Sorry, but what is DART code? Looked around a bit but couldn't find any info about it.
Depending on what you want IT and marketing to do... mostly I monitor the tools, and tell marketing and IT what needs to be done. I don't think IT would need to be in there, you should probably be able to tell them what changes need to be made without them digging through data and reports. Marketing could use tools to find good keywords to target, and especially if they do link building to find opportunities for that.
-
RE: Building on specific keywords
I would think it would be more worthwhile to write these articles, and then post them on your site, rather than running around submitting them to different sites. If you write interesting and unique content, people will start to follow you and come to you to see what you have to say. Start getting links and tweets about your articles, and voila, your site's rankings will increase.
Doing things like submitting articles at a bunch of different sites is becoming more and more transparent to Google. They're onto this kind of thing, especially if you're submitting the same articles to different sites. This was somewhat mentioned in the SEOMoz post yesterday (see here). If I'm reading this correctly, you're going to be building a large truncated pagerank, which is a spam signal to the search engines...
I'd focus on writing interesting, unique articles, and work on linkbuilding to them at your site, as opposed to posting them elsewhere.
-
RE: SEO Developers
Good developers should do a lot of these things by default, like optimize page load time, use sprites, avoid duplicate content, load page content prior to ads, etc. A good SEO should be aware of all of these things, and when things need to change, should be able to communicate those changes to a developer. Identifying these issues is more on the SEOs themselves, not the developers. In my experience, most tasks are front-end tasks, and a few are back-end, so depending on what your developer does, they should be able to handle the tasks within their niche if you point them out.
I don't think they need to put "SEO" in their skill-set.
-
RE: What should I do about links coming in that are from link farm type sites?
Google won't penalize you for their links to your site. You have no control over who links to you, and there's little you could do to get them removed. I see new links for my sites come up all the time, and some are from these spammy sites. I've never seen a penalty assessed.
If they did penalize sites for this, people would set up even more link farm sites just to link to their competitors.
-
RE: On my site, www.myagingfolks.com, only a small number of my pages appear to be indexed by google or yahoo. Is that due to not having an XML sitemap, keywords, or some other problem?
When I do a site: search in Google, it's showing me 800 pages!
Having a sitemap will help search engines index your pages better.
-
RE: Is it important to have exact keyword in your URL
In the SEOMoz 2011 Search Engine Ranking Factors report, long domain names show a -0.07 correlation with rankings. Having exact match domain names has a 0.22 positive correlation, so there is a lot to be said for that.
18 letters is pretty long. That's a lot of typing for someone to come to your site. Does a subset of this phrase make sense, like 2 of the words? That would be easier to type, and you'd still get a benefit from matching some of the search terms in the phrase. You'll also match less-specific phrases people will search for, so you'll do better in more searches.
-
RE: .us domains vs .com - What does Google Think?
In the SEOMoz 2011 rankings report, it shows that exact match .com domains are 0.22 correlated with higher rankings, whereas exact match domain names with other suffixes, like .us, are only 0.17 correlated with higher rankings. So it appears that .com is still the boss.
-
RE: PPC + SEO - Both well ranked, which penalty if there's any.
Doing PPC won't affect your SEO. You might see some drop in some organic traffic, since people will be clicking your ads instead of your organic search results.
In one Google thread I found, the Google employee summed it up quite well:
Being an AdWords advertiser will not cause one's site to be listed or unlisted in the unpaid search results on the left hand side of the page. There is no relationship between one and the other at all. Put another way, being an AdWords advertiser will neither help nor hurt one's chances of being listed in the unpaid results, nor will it impact one's position in any way if they already appear in the unpaid search results.
-
RE: Rel Canonical issues for two urls sharing same IP address
This doesn't sound ideal. So only the home page is under URL B, and the rest of the pages are hosted under URL A? That would seem really strange to me as a user if I entered through URL B, went to another page, and the domain changed completely.
Ideally, you should 301 redirect everything under URL A to URL B rather than using rel=canonical for them. There's no reason to host two identical sites like this. It's fine for multiple sites to be hosted under the same IP. Here's a really good SEOMoz blog post about using 301 redirects vs. canonical tags.
-
RE: HTTP 404 for 404-page?
If it returns a 200 OK code, it can be indexed as a page and may appear in search results, which isn't great since you don't really want to be bringing people in to your 404 page. According to Wikipedia, this is a "Soft 404". Since there is a section under Crawl Errors in Google Webmaster Tools for Soft 404's, I take it that it's better to return a 404 code.
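On Apache, for instance, a custom error page wired up like this returns the proper 404 status automatically (assuming a /404.html exists; if your error page is generated by a script, make sure the script sends a 404 header itself rather than a 200):

    ErrorDocument 404 /404.html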
-
RE: Organic bounce rate after site re-launch
Some of the improvements are probably for non-paid keywords that aren't entirely relevant to your site. For example, if you had a construction website, and were targeting "house addition", and started doing well for addition queries, you might start appearing in addition searches where the user is looking to practice math. Construction isn't relevant to them, and they'll almost always bounce.
You should be able to look this up in whatever you're using for analytics. Look at the queries with the highest bounce rates, and make sure they're relevant. If you notice a cluster of keywords that are relevant to the site but have high bounce rates, and it's worth your client's time, you might consider having them create some content for those queries.
-
RE: How to get Facebook likes?
I've seen promotions where if you like a company, you can get a discount on your next purchase. Or let people know you'll be running promotions through your page, so they have an incentive to like you.
You can also run Facebook ads to get people to like you. They offer a lot of demographic and interest targeting, so you could target the right audience. Once you get a few likes, you can target these ads to people who have friends who like you, which makes it more likely for that person to add you, since they can see their friend already likes you.
-
RE: Does 301 redirecting a site multiple times keep the value of the original site?
Yeah, it's a bad idea to chain multiple redirects. From the Whiteboard Friday a little over a year ago where Rand talked with Matt Cutts:
Is It a Bad Idea to Chain Redirects (e.g. 301-->301-->301)?
"It is, yeah."
Matt was very clear that Google can and usually will deal with one or two redirects in a series, but three is pushing it and anything beyond that probably won't be followed. He also reiterated that 302s should only be used for temporary redirects...but you already knew that, right?
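So when a site has been moved more than once, update the old rules to point straight at the final URL instead of stacking hops. A sketch with made-up paths, assuming Apache:

    # Chained: /old.html -> /newer.html -> /newest.html (two hops)
    # Better: point both old URLs straight at the final one
    Redirect 301 /old.html http://www.example.com/newest.html
    Redirect 301 /newer.html http://www.example.com/newest.html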
Also, here's a similar Q&A question with some good answers.
-
RE: Should I use a coupon code in my ad copy?
Looks like you asked this question twice... some answers are over at http://www.seomoz.org/q/should-i-put-a-promo-code-directly-into-adwords-copy.
-
RE: Questions regarding Google's "improved url handling parameters"
- It seems like canonicalizing the URLs is still a good idea. For one, there are other search engines that respect the tag, so you want them to do the right things. However, it does sound like, for Google, this could replace the need for canonical tags for duplicates caused by URL parameters. If you have duplicate content that's not caused by URL parameters, you'd still need the canonical tag. Just to be safe, I'm still doing both.
- When I logged into my Webmaster Tools, all of the parameters Google had found were already there. I went through, edited them, and added the new information.
-
RE: Should I put a promo code directly into Adwords copy?
I don't think it's necessarily bad. I'm just not sure it's as compelling as stating the savings and an expiration date in your ad. You could try running an ad with the code, and a similar one without it, and see which does better.
If the user doesn't write down the code and later forgets it, they might not be able to find it again! If you do use a promo code in your ad copy, make sure it's easy to find on the landing page, or automatically applied to the customer's cart when they come from your ad.
One article I found reinforcing this: http://certifiedknowledge.org/blog/be-careful-of-using-discount-codes-in-your-ad-copy/
-
RE: Parameter handling (where to find all parameters to handle)?
Yeah, Google generally will come across just about all of the parameters while indexing your site, but you can add parameters as well. When you log into Google Webmaster Tools, you should see a list of parameters when you go to the Site configuration > URL parameters page. They've added more options now that you can change for each parameter, beyond whether or not Google should ignore it. If you click Edit for a parameter, you can now set:
- Does this parameter change page content seen by the user? (Yes, No)
- How does this parameter affect page content? (Sorts, Narrows, Specifies, Translates, Paginates, Other)
- Which URLs with this parameter should Googlebot crawl? (Let Googlebot decide, Every URL, Only URLs with value ___, No URLs)
It will also show you sample URLs with the parameter to make it easier to figure out when these parameters appear, which is very useful, as sometimes you don't know which pages have which parameters.
Google's help file for this can be found here.
-
RE: Strange Ranking On 1st Page Of Google For Competitive Keyword
Yeah, it's your personalized results. If you go to http://www.google.com/search?q=mixtapes&pws=0 (the &pws=0 removes personalization), you won't find it on the first page, or in the first several pages of results.
-
What to tweet and blog about?
A little background:
The site I'm working on, www.ixl.com, is a math practice site for pre-K through 8th grade. Our content consists of math skills where we randomly generate math problems, grouped by grades and topics. We have around 300 skills per grade.
The question:
We don't currently have a twitter feed, Facebook page, or a blog. I'm working on setting these up, or at least the twitter feed and the Facebook page first. I'm wondering: what kinds of things should we be tweeting about once the feed is set up? Things like online education articles? Articles about teaching math? Do you guys think that would be compelling enough to get followers and make this productive for us?
Another question, how important do you think a blog is, compared to a twitter feed and Facebook page? I was thinking it would be fun to set up a blog, and post math questions from the site 2x or 3x a week, as well as other content that could overlap with the twitter feed.
-
RE: Hosting in the US for an Australian website
You should be able to get around this with geotargeting. You can set indicators for Google and other search engines that your site is for that country. I don't think where you host the site is as important as having good machines and indicating to the search engines what country and language your site is for.
Geotargeting:
-
RE: Webmaster Tools 404 Errors Pages Never Created
Most of the time in GWT you can see the page where the referring link is coming from. The crawl errors in GWT are links to your site, so if people use an invalid URL to link to your site, it'll show up as a 404 error in GWT.
In my experience, when this happens en masse, some spammy site goes a little nuts and tries to link to a bunch of pages, but goofs something up that makes the URLs 404, like putting an extra space (which renders as a %20) at the end of the URL in the href.
If it's a common mistake, you might consider putting in a rewrite rule to 301 redirect those 404 URLs to valid pages on your site (see the sketch below). You could also contact the webmaster of that site to fix the invalid URL or URLs (especially if it would be a valuable link). Lastly, you can do nothing and ignore the errors.
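As a sketch of that rewrite rule, assuming Apache and that the goof really is a trailing encoded space (Apache decodes %20 to a space before the pattern is matched):

    RewriteEngine On
    # Strip trailing spaces and 301 to the clean URL
    RewriteRule ^(.+?)\s+$ /$1 [R=301,L]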
-
RE: Any body hear anything about this new feature in adwords?
I think you have to participate in their new Google Offers program (their version of Groupon), as that "View offer" button goes to a Google Offers page. Read more about it here:
-
RE: Domains for regional websites
If you're having issues with Bing as well, the directions for geotargeting are here: http://www.bing.com/community/site_blogs/b/webmaster/archive/2011/03/01/how-to-tell-bing-your-website-s-country-and-language.aspx
-
RE: Submitting multiple sitemaps
I'm not sure about your HTML sitemap; I don't think HTML sitemaps are a supported format for you to submit to Google (I don't see them on sitemaps.org). You just need Google to crawl this page, and all the pages it links to? There is a plain text format (see here) that is allowed for sitemaps. You could probably change your HTML sitemap pretty easily to that format.
I'm pretty sure you're allowed to submit multiple sitemaps, but I can't find anything concrete saying you can or can't. The Google Webmaster Tools UI seems to support it, so my guess is that it would be fine. Try it and see if it works? You could also create a sitemap index file that references both these sitemaps.
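A sitemap index is just a small XML file pointing at the others (the filenames here are made up):

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap>
        <loc>http://www.example.com/sitemap-profiles.xml</loc>
      </sitemap>
      <sitemap>
        <loc>http://www.example.com/sitemap-pages.txt</loc>
      </sitemap>
    </sitemapindex>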
You can read more about sitemaps on sitemaps.org. According to the Google help doc here, they adhere to these standards.
-
RE: Too many pages indexed in SEOMoz
Have you looked at the errors in the campaign? I would suspect you have some duplicate content issues. For example, does your client serve the same page for the following instead of redirecting to the same URL?
- example.com
- example.com/
- example.com/index.php (or .html, or just /index)
- www.example.com
- www.example.com/
- www.example.com/index.php (or .html, or just /index)
There are many more variations it could be finding. Drill into this campaign and click the Crawl Diagnostics subtab, then find the Duplicate Page Content error and drill into that. It should give you an idea of what's going on.