How do I identify what is causing my Duplicate Page Content problem?
-
Hello,
I'm trying to put my finger on what exactly is causing my duplicate page content problem. For example, SEOmoz is picking up these four pages as having the same content:
http://www.penncare.net/ambulancedivision/braunambulances/express.aspx
http://www.penncare.net/ambulancedivision/recentdeliveries/millcreekparamedicservice.aspx
http://www.penncare.net/ambulancedivision/recentdeliveries/monongaliaems.aspx
http://www.penncare.net/softwaredivision/emschartssoftware/emschartsvideos.aspx
As you can tell, they really aren't serving the same content in the body of the page. Does anybody have an idea what might be causing these pages to show up as Duplicate Page Content? At first I thought the photo gallery module might be causing it, but that only exists on two of the pages...
Thanks in advance!
-
Ah right - OK then.
As for the data coming back from SEOmoz's crawler, I'd be tempted to ask them directly what it is seeing. I should really have a look at this myself, because I haven't yet.
-
I'm currently getting that information from Moz's own web crawler, which tells me which pages have Duplicate Page Content and the other URLs that duplicate content exists on.
With regard to the 301's - I have rewrite rules set up to 1) lowercase all requests, 2) trim off home.aspx, and 3) prepend www. to the request, etc. When processed, these should function as a single redirect / rewrite.
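Roughly, the rules look like this in web.config (a simplified sketch rather than our exact config - we're on IIS URL Rewrite since the site runs .aspx, and the rule names and patterns here are illustrative):

<rewrite>
  <rules>
    <!-- Trim home.aspx and go straight to the lowercase www form in one hop. -->
    <rule name="TrimHomeAspx" stopProcessing="true">
      <match url="^(.*/)?home\.aspx$" />
      <action type="Redirect" url="http://www.penncare.net/{ToLower:{R:1}}" redirectType="Permanent" />
    </rule>
    <!-- Any non-www host: redirect to the lowercase www URL in one hop. -->
    <rule name="CanonicalHost" stopProcessing="true">
      <match url="(.*)" />
      <conditions>
        <add input="{HTTP_HOST}" pattern="^www\.penncare\.net$" negate="true" />
      </conditions>
      <action type="Redirect" url="http://www.penncare.net/{ToLower:{R:1}}" redirectType="Permanent" />
    </rule>
    <!-- Anything left containing uppercase characters: redirect to its lowercase form. -->
    <rule name="LowercaseUrl" stopProcessing="true">
      <match url="[A-Z]" ignoreCase="false" />
      <action type="Redirect" url="{ToLower:{URL}}" redirectType="Permanent" />
    </rule>
  </rules>
</rewrite>

Since each rule points straight at the final lowercase www form (with home.aspx already trimmed), any given request should only ever produce one 301.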
-
Before looking at the duplicate content (what did you use to find that there is duplicate content?)... a quick question - you have a lot of 301's. Just want to check: are these a single redirect, or a redirect of a redirect, etc.?
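A quick way to check is to follow one of the URLs with curl and count the hops (just a sketch - any mixed-case or non-www variant of one of your pages works as the test URL):

curl -sIL http://penncare.net/AmbulanceDivision/Home.aspx | grep -iE "^(HTTP|Location)"

One 301 followed by a 200 is fine; two or more 301s before the final 200 means the redirects are chained.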
-
I would add some content to these pages to help differentiate them. None of them are text-heavy, so it may be hard for spiders to see a difference. Add a summary, maybe a text transcription of what is in the videos, etc.
-
Thanks for your reply... I guess, more specifically, I was wondering what it is about these particular page elements that makes search engines unable to tell the pages apart.
-
Duplicate content presents three main problems for search engines:
- Search engines don't know which version(s) to include in or exclude from their indices
- Search engines don't know whether to direct the link metrics (trust, authority, anchor text, link juice, etc.) to one page, or keep them separated between multiple versions
- Search engines don't know which version(s) to rank for query results
When duplicate content is present, site owners can suffer rankings and traffic losses, and search engines return less relevant results. In your case, since none of the four pages carries much unique body text, the shared template, navigation, and footer likely make up most of the HTML the crawler compares - which is why they register as duplicates even though the visible content differs.
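Where pages genuinely are duplicates - the same page reachable at www and non-www addresses, say - the usual fix is a rel=canonical tag in the <head> of each variant pointing at the preferred version (a sketch, with one of your URLs used purely for illustration):

<link rel="canonical" href="http://www.penncare.net/ambulancedivision/braunambulances/express.aspx" />

For pages like yours that only look alike because they're thin, though, adding unique body content (as suggested in another answer) is the better fix.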
Hope this helps!
Resources: http://www.seomoz.org/learn-seo/duplicate-content