How do I identify what is causing my Duplicate Page Content problem?
-
Hello,
I'm trying to put my finger on exactly what is causing my duplicate page content problem... For example, SEOmoz is picking up these four pages as having the same content:
http://www.penncare.net/ambulancedivision/braunambulances/express.aspx
http://www.penncare.net/ambulancedivision/recentdeliveries/millcreekparamedicservice.aspx
http://www.penncare.net/ambulancedivision/recentdeliveries/monongaliaems.aspx
http://www.penncare.net/softwaredivision/emschartssoftware/emschartsvideos.aspx
As you can tell, they really aren't serving the same content in the body of the page. Anybody have an idea what might be causing these pages to show up as Duplicate Page Content? At first I thought it was the photo gallery module that might be causing it, but that only exists on two of the pages...
Thanks in advance!
-
Ah right - OK then.
With regard to the data coming back from SEOmoz's crawler, I'd be tempted to ask them what it's seeing. I should really have a look at this myself, because I haven't yet.
-
I'm currently getting that information from Moz's own web crawler, which tells me which pages have Duplicate Page Content and the other URLs that the duplicate content exists on.
With regard to the 301s: I have rewrite rules set up to 1) lowercase all requests, 2) trim off home.aspx, and 3) prepend www. to the request, etc. When processed, these should function as a single redirect/rewrite.
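One way to verify that they really do resolve in a single hop (rather than chaining 301 -> 301 -> 200) is to request a few un-normalized variants and inspect the redirect history. A minimal sketch using Python's `requests` library; the variant URLs are just illustrative guesses at what the rules should normalize:

```python
import requests

# Hypothetical un-normalized variants the rewrite rules should collapse:
# mixed case, missing www., and a trailing home.aspx.
test_urls = [
    "http://penncare.net/AmbulanceDivision/BraunAmbulances/Express.aspx",
    "http://www.penncare.net/home.aspx",
]

for url in test_urls:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    # resp.history holds one Response object per redirect hop taken.
    print(f"{url} -> {len(resp.history)} redirect(s)")
    for hop in resp.history:
        print(f"    {hop.status_code} {hop.url}")
    print(f"    final: {resp.status_code} {resp.url}")
```

If any variant shows more than one hop in `history`, the rules are chaining rather than combining, which is worth fixing since every extra 301 costs a little crawl efficiency.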
-
Before looking at the duplicate content (what did you use to find that there is duplicate content?)... a quick question: you have a lot of 301s. Just want to check, are these a single redirect each, or a redirect of a redirect, etc.?
-
I would add some content to these pages to help differentiate them. None of them are text-heavy, so it may be hard for spiders to see a difference. Add a summary, maybe a text transcript of what's in the videos, etc.
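To see the pages the way a crawler does, you can strip each one down to its visible text and compare pairwise. A rough sketch, assuming `requests` and `beautifulsoup4` are installed; the point is the ratio, not the exact numbers:

```python
import difflib
import itertools

import requests
from bs4 import BeautifulSoup

urls = [
    "http://www.penncare.net/ambulancedivision/braunambulances/express.aspx",
    "http://www.penncare.net/ambulancedivision/recentdeliveries/millcreekparamedicservice.aspx",
    "http://www.penncare.net/ambulancedivision/recentdeliveries/monongaliaems.aspx",
    "http://www.penncare.net/softwaredivision/emschartssoftware/emschartsvideos.aspx",
]

def visible_text(url):
    """Fetch a page and return its whitespace-normalized visible text."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    for tag in soup(["script", "style", "noscript"]):
        tag.decompose()  # remove markup a crawler wouldn't treat as content
    return " ".join(soup.get_text(separator=" ").split())

texts = {url: visible_text(url) for url in urls}

for a, b in itertools.combinations(urls, 2):
    ratio = difflib.SequenceMatcher(None, texts[a], texts[b]).ratio()
    print(f"{ratio:.0%}  {a.rsplit('/', 1)[-1]}  vs  {b.rsplit('/', 1)[-1]}")
```

When the shared template (navigation, header, footer) dwarfs the unique body copy, even pages that look different to a person can score very high here, which is typically what trips a duplicate-content check on thin pages.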
-
Thanks for your reply... I guess more specifically I was wondering what it is about these particular page elements that makes search engines incapable of distinguishing the pages from one another.
-
When multiple URLs serve substantially the same content, search engines run into three problems:
- Search engines don't know which version(s) to include in or exclude from their indices
- Search engines don't know whether to direct the link metrics (trust, authority, anchor text, link juice, etc.) to one page, or keep them separated across multiple versions
- Search engines don't know which version(s) to rank for query results
When duplicate content is present, site owners suffer rankings and traffic losses, and search engines provide less relevant results.
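Where the duplication is real and can't be removed, the usual remedy is a rel=canonical tag pointing the duplicates at one preferred URL. As a quick illustrative check (not something from this thread), this sketch reports which canonical, if any, each page currently declares:

```python
import requests
from bs4 import BeautifulSoup

urls = [
    "http://www.penncare.net/ambulancedivision/braunambulances/express.aspx",
    "http://www.penncare.net/softwaredivision/emschartssoftware/emschartsvideos.aspx",
]

for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    # <link rel="canonical" href="..."> in the <head>, if the page sets one.
    link = soup.find("link", rel="canonical")
    print(url, "->", link.get("href") if link else "no canonical tag")
```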
Hope this helps!
Resources: http://www.seomoz.org/learn-seo/duplicate-content