Do outbound links to manufacturer specs and PDFs help or hurt SEO?
-
I am creating an e-commerce site.
All the products have product certification documents/images, PDF instruction docs, manufacturer specs, etc.
Should I host all this content myself or simply link to the original documents and content? Which is best for SEO?
Thank you,
-
Thanks for the response.
Sorry for the dumb questions -- if I have a bunch of PDFs, why is this useful?
Does this allow Google to crawl them so I get the SEO juice?
-
Can Google crawl attachments for keywords (PDFs, photos, etc.)?
Lots of manufacturers only host links to PDFs.
If possible, I want to copy and paste that content onto my site so it gets crawled and I get the SEO juice. Will this work?
-
The line is in the ratio of your duplicate content to unique content. I don't know of any hard-and-fast ratio rule, but it is primarily the unique content that will be most helpful as you build your site's authority against your keyword competitors.
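To make that ratio concrete: one common pattern is to host and link the manufacturer's PDF but wrap it in your own original copy, so the unique text on the page outweighs anything duplicated. A minimal sketch, where the product name, wording, and file path are all hypothetical:

```html
<!-- Hypothetical product page: the summary paragraph is original copy;
     only the linked spec sheet is the manufacturer's document. -->
<section>
  <h2>Acme 500W Inverter</h2>
  <p>Our own summary of who this inverter suits, how it compares to the
     300W model, and what buyers should check before ordering.</p>
  <p>Full manufacturer specifications:
    <a href="/docs/acme-500w-specs.pdf">Acme 500W spec sheet (PDF)</a>
  </p>
</section>
```

Hosting the PDF on your own domain (as above) lets Google crawl and index it under your site; linking to the manufacturer's copy instead hands that crawl to their domain.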
-
Assuming this is OK with the suppliers, are there any tactics for taking that content and getting it to help my SEO?
E.g. copy the PDF's text, put it in a PDF with my own branding and re-upload it, or just take that offline content and paste it into my page?
Where is the line between duplicate and unique content?
Thank you!
-
Assuming it's useful and relevant, Google will like that and so will your users.
-
You could also host PDFs as embeds. This snippet I found the other day seemed pretty useful:
<object data="/pdf/sample-3pp.pdf#page=2" type="application/pdf" width="100%" height="100%">
  This browser does not support inline PDFs. Please
  <a href="/pdf/sample-3pp.pdf">download the PDF</a> to view it.
</object>
-
I agree with all of Chris's suggestions.
In some situations, you will see that the manufacturer's information for a few popular products is reused heavily across the web. In those situations it may be especially helpful to create your own content.
You may have a situation where your product knowledge, customer knowledge, content-creation ability, ambition, and willingness to invest in your website greatly exceed those of the manufacturer and all other competitors. In these situations, you have the opportunity to create the best site on the web for a consumer topic or product line.
The investment above is most valuable when you have a product with broad appeal whose use requires special knowledge or skill that you have the ability to present on a website. It is also important to consider the expected consumer lifetime of the product line (not the individual item), because that determines whether you are producing content of temporary or evergreen value.
-
Hi James,
It may help your site's engagement numbers if your visitors use those documents, but the fact that the content is duplicated won't be lost on Google. If you think having it on your site will benefit visitors, you could canonicalize it so that your copy at least references the original.
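For an HTML page that reproduces a manufacturer's document, that canonical reference is a tag in the page head. A sketch, assuming a hypothetical manufacturer URL for the original:

```html
<!-- Hypothetical page republishing a manufacturer spec sheet.
     The canonical tag tells Google which URL is the original source. -->
<head>
  <title>Acme 500W Inverter - Manufacturer Specifications</title>
  <link rel="canonical"
        href="https://www.example-manufacturer.com/specs/500w.html">
</head>
```

A PDF has no head to put the tag in; for a PDF you host yourself, the equivalent is a rel="canonical" Link header in the HTTP response for that file.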