Problem with printer friendly version pages.
-
For one of our clients' sites, most of the backlinks are going to the printer friendly version pages. I recommended that he use the canonical tag on the printer friendly version, pointing to the main page.
Luckily, while searching I came across this post at - http://www.seomoz.org/q/solving-printer-friendly-version
The solution recommended was this -
<link type="text/css" rel="stylesheet" media="print" href="our-print-version.css">
My questions are -
1. What should I write in place of our-print-version.css?
Should it be print.css?
2. Where do I place this code? In which file?
-
Correct.
Often a site will refer to numerous CSS files. There are tools which will combine multiple CSS files into a single file and properly compress the files to optimize them for page speed.
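To illustrate (the file names below are only placeholders, not files from your site), a page that references several stylesheets like this -
<link rel="stylesheet" type="text/css" href="base.css">
<link rel="stylesheet" type="text/css" href="layout.css">
<link rel="stylesheet" type="text/css" href="typography.css">
<link rel="stylesheet" type="text/css" media="print" href="print.css">
could instead reference one combined, minified file for screen use, keeping the print stylesheet separate -
<link rel="stylesheet" type="text/css" href="combined.min.css">
<link rel="stylesheet" type="text/css" media="print" href="print.css">
The actual combining and compressing is done by a build tool or by your programmer; the HTML only needs to point at the resulting file.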
-
Thanks once again for the clarification.
The only question is whether changes need to be made to optimize the code from an SEO or page speed perspective.
You mean to say the CSS code must reside in an external file and be linked from the page to minimise code?
-
Do I need to 301 the printer friendly page?
No. Your site's visitors need to access the printer friendly page. If you add a 301, then no one will be able to view the print friendly page.
I should also clarify: if your site currently offers a print friendly page and it works, then your programmer has already taken care of the issue from a website functionality perspective. The only question is whether changes need to be made to optimize the code from an SEO or page speed perspective.
-
"It would need to be accessible and declared on the printer friendly version page"
That's what I was looking for. I will ask the designer to declare this file in the printer friendly version page. So, the solution will be -
We place this code in the printer friendly version page -
<link type="text/css" rel="stylesheet" media="print" href="print.css">
print.css will have the CSS code for the print format pages, and print.css will be a separate file.
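Just to sketch what might go inside it (the selectors below are hypothetical - your designer will use whatever element and class names your site actually has), print.css would typically hide screen-only elements and simplify the layout, for example -
/* print.css - applied only when the page is printed */
nav, .sidebar, .comments, .banner { display: none; }
body { background: #fff; color: #000; font-size: 12pt; }
a:after { content: " (" attr(href) ")"; }
The last rule prints the target URL after each link, which is handy on paper but optional.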
Do I need to 301 the printer friendly page?
-
How the CSS is presented is up to your web designer. It could be part of the site's main CSS, or in a separate file. It would need to be accessible and declared on the printer friendly version page.
As part of speed optimizations, all CSS files may be condensed into a single file.
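For example, instead of a separate print.css, the designer could keep the print rules inside the main stylesheet wrapped in an @media print block (the selectors here are only illustrative) -
/* inside the site's main stylesheet */
@media print {
  nav, .sidebar { display: none; }
  body { font-size: 12pt; }
}
In that case no extra <link> tag is needed, since the print rules travel with the main stylesheet and only take effect when the page is printed.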
-
Thanks a lot Ryan.
CSS declarations are made in the <head> of your HTML document
I was not sure, that's why I asked this. Should this declaration be made in the printer friendly version page?
-
Hi Atul.
I looked at the Q&A response link you offered. I will try to offer some clarifications:
Where do I place this code? In which file?
CSS declarations are made in the <head> of your HTML document.
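As a simplified sketch (the title and file names are only examples), the head of the printer friendly page would look something like this -
<head>
  <title>Example Article - Printer Friendly Version</title>
  <link rel="stylesheet" type="text/css" href="style.css">
  <link rel="stylesheet" type="text/css" media="print" href="print.css">
</head>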
What should I write in place of our-print-version.css?
The name of the file which contains the CSS code for your print format pages.
For one of our clients' sites, most of the backlinks are going to the printer friendly version pages. I recommended that he use the canonical tag on the printer friendly version, pointing to the main page.
Your recommendation is sound, and I agree with it.
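For anyone else reading this later, that canonical tag would sit in the <head> of the printer friendly page and point back to the standard version of the same content, along these lines (the URL is made up for the example) -
<link rel="canonical" href="http://www.example.com/article-name">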