PDF on financial site that duplicates ~50% of site content
-
I have a financial advisor client who has a downloadable PDF on his site that contains about 9 pages of good info. The problem is that much of the content can also be found on individual pages of his site.
Is it best to noindex/follow the PDF? It would be great to let the few pages of original content be crawlable, but I'm concerned about the duplicate content aspect.
Thanks --
-
This is what we have done with PDFs: assign rel="canonical" in .htaccess.
We did this with a few hundred files and it took google a LONG time to find and credit them.
-
You could set the header to noindex rather than rel=canonical
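For what it's worth, a minimal sketch of that noindex header, assuming Apache with mod_headers enabled (the file pattern is illustrative; adjust it if only some PDFs apply):

```apache
# Sketch: send a noindex directive for every PDF on the site.
# "follow" still lets link equity flow from links inside the document.
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex, follow"
</FilesMatch>
```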
-
Personally, I think it would be better not to index it, but if you do keep it indexed, pointing the canonical at the site root seems like a good option.
-
Thanks. Anybody want to weigh in on where to rel=canonical to? Home page?
-
If you are using Apache, you can put it in your .htaccess in this form:
<FilesMatch "my-file.pdf">
Header set Link '<http://misite/my-file.html>; rel="canonical"'
</FilesMatch>
-
I think the right way here is to put the rel=canonical in the PDF's HTTP header: http://googlewebmastercentral.blogspot.com/2011/06/supporting-relcanonical-http-headers.html
-
I thought the idea was to put rel=canonical on the duplicated page, to signal "hey, this page may look like duplicate content, but please refer to this canonical URL"?
Looks like there is a PDF option for rel=canonical; I guess the question is what page on the site to make canonical:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=139394
Indicate the canonical version of a URL by responding with the Link rel="canonical" HTTP header. Adding rel="canonical" to the head section of a page is useful for HTML content, but it can't be used for PDFs and other file types indexed by Google Web Search. In these cases you can indicate a canonical URL by responding with the Link rel="canonical" HTTP header, like this (note that to use this option, you'll need to be able to configure your server):
Link: <http://www.example.com/downloads/white-paper.pdf>; rel="canonical"
-
Hi Keith,
I'm sorry, I should have clarified. The rel=canonical tags would be on your Web pages, not the PDF (they are irrelevant in a PDF document). Then Google will attribute your Web page as the original source of the content and will understand that the PDF just contains bits of content from those pages. In this instance I would include a rel=canonical tag on every page of your site, just to cover your bases. Hope that helps!
Dana
-
Not sure which page I would mark as being canonical, since the pdf contains content from several different pages on the site. I don't think it's possible to assign different rel=canonical tags to separate portions of a pdf, is it?
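From the docs quoted above, the Link canonical is indeed a per-file HTTP header, so each PDF can declare only one canonical target. A hedged sketch, assuming Apache with mod_headers; the filenames and target pages below are made-up examples:

```apache
# Sketch: one canonical per file, pointing each PDF at the single
# page on the site that best matches its content.
<Files "retirement-guide.pdf">
  Header set Link '<http://www.example.com/retirement-planning.html>; rel="canonical"'
</Files>
<Files "college-savings.pdf">
  Header set Link '<http://www.example.com/college-savings.html>; rel="canonical"'
</Files>
```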
-
As long as you have rel=canonical tags properly in place, you don't need to worry about the PDF causing duplicate content problems. That way, any original content should be picked up and any duplicate can be attributed to your existing Web pages. Hope that's helpful!
Dana
Related Questions
-
Breaking up a site into multiple sites
Hi, I am working on a plan to divide a mid-DA website into multiple sites, so the current site's content will be divided up among these new sites. We can't share anything going forward because each site will be independent. The current homepage will change to just link out to the new sites and have minimal content. I am thinking the websites will take a hit in rankings, but I don't know how much or how long the drop will last. I know if you redirect an entire domain to a new domain the impact is negligible, but in this case I'm only redirecting parts of a site to a new domain. Say we rank #1 for "blue widget" on the current site. That page is going to be redirected to a new site and new domain. How much of a drop can we expect? How hard will it be to rank for other new keywords, say "purple widget", that we don't rank for now? How much link juice can I expect to pass from the current website to the new websites? Thank you in advance.
Intermediate & Advanced SEO | timdavis
-
Site revamp for neglected site - modifying site structure, URLs and content - is there an optimal approach?
A site I'm involved with, www.organicguide.com, was at one stage (long ago) performing reasonably well in the search engines. It was ranking highly for several keywords. The site has been neglected for some considerable period of time. A new group of people are interested in revamping the site, updating content, removing some of the existing content, and generally refreshing the site entirely. In order to go forward with the site, significant changes need to be made. This will likely involve moving the entire site across to wordpress. The directory software (edirectory.com) currently being used has not been designed with SEO in mind and as a result numerous similar pages of directory listings (all with similar titles and descriptions) are in google's results, albeit with very weak PA. After reading many of the articles/blog posts here I realize that a significant revamp and some serious SEO work is needed. So, I've joined this community to learn from those more experienced. Apart from doing 301 redirects for pages that we need to retain, is there any optimal way of removing/repairing the current URL structure as the site gets updated? Also, is it better to make changes all at once or is an iterative approach preferred? Many thanks in advance for any responses/advice offered. Cheers MacRobbo
Intermediate & Advanced SEO | macrobbo
-
Two sites, two domains, two brands, 98% same content
There are two affiliated brick & mortar retail stores moving into e-commerce. For non-marketing reasons separate e-commerce websites are desired. The two brands are based in separate (nearby) cities in the same Canadian province. Although the store name and branding will be different, the content on the site will either be near duplicates or exact duplicates. The more I look into this on Google and SEOmoz QA, the more I am concerned about the SEO implications of this. SEOmoz QA: Multiple cities/regions websites - duplicate content? "So, yes, because you are offering the same services at second location, you are thinking correctly about the need to rewrite all content so it's not a duplicate of site #1." Duplicate content - Webmaster Tools Help "However, in some cases, content is deliberately duplicated across domains in an attempt to manipulate search engine rankings or win more traffic… In the rare cases in which Google perceives that duplicate content may be shown with intent to manipulate our rankings and deceive our users, we’ll also make appropriate adjustments in the indexing and ranking of the sites involved. As a result, the ranking of the site may suffer, or the site might be removed entirely from the Google index, in which case it will no longer appear in search results. ... Duplicate content on a site is not grounds for action on that site unless it appears that the intent of the duplicate content is to be deceptive and manipulate search engine results." Unfortunately, I would say there's very little chance that rewritten content will happen in the foreseeable future. With that said, I'd be greatly appreciative of the concerns and remedies that the SEOmoz community has to offer (even if they're for future use). Thanks in advance.
Intermediate & Advanced SEO | GOODSIR
-
Http and https duplicate content?
Hello, This is a quick one or two. 🙂 If a page is accessible on both http and https, does that count as duplicate content? What about external links pointing to the http or https version of a page on my website? Regards, Cornel
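One common remedy, not stated in this thread, is to pick one scheme and 301-redirect the other to it so only one version gets indexed; a minimal sketch assuming Apache with mod_rewrite:

```apache
# Sketch: 301 every http request to its https equivalent.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```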
Intermediate & Advanced SEO | Cornel_Ilea
-
Surprising decrease in visits on a good content site
Dear Sirs, contributors and aspirants of SEOmoz: I have a site called General History (http://general-history.com/) that was created in 2010, and has a current PR of 3, a DA of 23 and a home page authority of 32. It also has 1,690 links, and note that we have not invested in link building; all the links were built manually via post inserting or virally via social shares. The thing is that in only 5 months it went from receiving 14,000 visits per month to only 1,500, a drop of roughly 90%. I must admit that I earn my living offering SEO to companies, but this is one of my own sites, a site on which my 73-year-old father likes to write about general history. I really think, given that he used to be a journalist, that the content not only isn't spam but is high-quality content. As I had Analytics, I started searching for the cause. The first question was... 1.- From what source did I lose the most visitors? Organic, paid or social. The answer is organic, by far. Once I discovered it was an organic loss, I tried to find what content used to draw the most visitors. I found 3 posts that brought 80% of the total traffic. How did people find the content? Well, some of them found the site on the first page of Google when searching for "Holocaust facts and figures", for example, but Analytics says that most people came from image search in Google Images. General History disappeared from the SERPs, but progressively, not from one day to the next. So then I thought, it can't be a penalty. I contacted Google and sent them a reconsideration request. 5 days later they answered saying that general-history.com is not a spammy site and thus has not been penalized.
Here is Google's answer (translated from Spanish): "Dear webmaster or owner of the site http://general-history.com/: We have received a request from a site owner asking us to recheck whether http://general-history.com/ complies with Google's webmaster guidelines. We have reviewed your site and have not detected any manual actions by the webspam team that could harm its ranking in Google. You do not need to file a reconsideration request, since any ranking issues that occur do not stem from manual actions taken by the webspam team. There are other issues related to your site that may be harming its ranking. Google's computers determine the order of search results through a series of formulas called algorithms. Hundreds of changes are made to the search algorithms every year, and more than 200 different signals are used to rank pages. As the algorithms and the web (including your site) change, ranking fluctuations can occur as Google updates to offer users the most relevant results. If you have noticed a ranking change and believe it is not simply due to an algorithm change, we recommend investigating other possible causes, such as a major change to the site's content, the content management system, or the server architecture. For example, a site may not rank well in search results if the server stops serving pages to Googlebot or if the URLs of a large portion of the site's pages are changed. This article includes a list of other possible reasons why your site may not rank well in search results.
If you still cannot resolve the issue, visit the webmaster help forum for assistance. Sincerely, The Google Search Quality Team"
They mention interesting things: other problems might have caused my ranking decrease, such as a site content change, the content management system, the server architecture, or a change of URLs. After receiving this, I thought I should get into the WordPress admin panel and search for bugs (HTML, CSS or PHP errors), and I found that somebody had hijacked my site, entering the WordPress panel and adding spam code to one of my landing pages. That page does not exist anymore; I erased it completely. The spam code was a long run of repeated anchor links of the form "General History | General-History" (repeated many times over). I thought that would be the problem! But it was NOT, because Google did not penalize me, as you can see in the letter they sent me. I erased the entire page on which the spam appeared, updated my sitemap, re-checked my robots.txt, searched my folders via FTP and much more... Conclusion? I have no idea why General-History has lost roughly 90% of its traffic in 5 months.
Intermediate & Advanced SEO | Tintanus
My site fell more than 50 positions in two days, help me!
My site appeared in the top 10 via this link (http://www.vipgoldrj.com/paginas/ensaios.html) but not via this one (http://www.vipgoldrj.com). It was doing well for 2 months, and then it suddenly disappeared. I wanted to know if it had been penalized, and Google told me it was not. What should I do? Sorry for my English; I am Brazilian and I'm using Google Translate. Warning from SEOmoz staff: this is an escort site with full frontal nudity and is not safe for most workplaces.
Intermediate & Advanced SEO | WebMaster021
-
Syndicating duplicate content descriptions - Can these be canonicalised?
Hi there, I have a site that contains descriptions of accommodation, and we also syndicate this content to our partner sites. They then use it to fill their descriptions of the same accommodation locations. I have looked at Copyscape and Google, and this does appear as duplicate content across these partnered sites. I understand that certain kinds of content will not trigger Google's duplication issue, such as locations, addresses, opening times and those kinds of things, but would actual descriptions of a location, around 250 words long, be seen and penalised as duplicate content? Also, is there a possible way to canonicalise this content so that Google can see it relates back to our original site? The only other way I can think of getting round a duplicate content issue like this is asking the external sites to use tags like blockquote and cite around the content.
Intermediate & Advanced SEO | MalcolmGibb
-
Mobile Site - Same Content, Same subdomain, Different URL - Duplicate Content?
I'm trying to determine the best way to handle my mobile commerce site. I have a desktop version and a mobile version using a 3rd-party product called CS-Cart. Let's say I have a product page. The URLs are...
mobile: store.domain.com/index.php?dispatch=categories.catalog#products.view&product_id=857
desktop: store.domain.com/two-toned-tee.html
I've been trying to get information regarding how to handle mobile sites with different URLs in regard to duplicate content. However, most of the results assume that the different URL means m.domain.com rather than the same subdomain with a different address. I am leaning towards using a canonical URL, if possible, on the mobile store pages. I see quite a few suggesting not to do this, but again, I believe it's because they assume we are just talking about m.domain.com vs www.domain.com. Any additional thoughts on this would be great!
Intermediate & Advanced SEO | grayloon