Different HTML based on resolution
-
Is it acceptable, in terms of SEO, to display different HTML based on a user's screen resolution?
I feel I'm wasting space on my site catering for all the 1024 x 768 users.
-
SEO-wise I don't think there are any issues, though I wonder what resolution Googlebot reports itself as having. It's important, though, that you do it the way Chas Blackford states: if you have actual server-side code that changes a bunch of things around based on resolution, then you might get in trouble. This is an interesting article about using stylesheets to serve separate mobile layouts (it also mentions media queries, which are specific to the smarter/newer phones):
http://www.alistapart.com/articles/return-of-the-mobile-stylesheet
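The media-query approach the article mentions can be sketched like this (breakpoint and class names here are illustrative only, not from the article): one set of HTML, with the stylesheet adapting to the reported viewport width:

```css
/* Default (desktop) layout */
.sidebar { width: 300px; float: right; }

/* Phones: collapse the sidebar instead of serving different HTML */
@media screen and (max-width: 480px) {
  .sidebar { display: none; }
}
```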
There are some implementation issues, the most important of which is reliably getting the resolution from the agent; essentially, you can't guarantee it 100% of the time. From what I've read, though, a combination of user-agent string matching and resolution detection can probably get you most of the way.
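As a sketch of what client-side resolution detection can look like (the stylesheet names and breakpoints are hypothetical): read the width the browser reports, and fall back to the widest layout when detection fails, which also covers agents that don't report anything useful:

```javascript
// Map a detected viewport width to a layout stylesheet.
// Returns the widest layout when no width could be detected.
function stylesheetFor(width) {
  if (!width) return "wide.css";        // detection failed: assume desktop
  if (width <= 480) return "mobile.css";
  if (width <= 1024) return "narrow.css";
  return "wide.css";
}

// In a browser this would be fed by screen.width or
// document.documentElement.clientWidth.
```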
-
Yes, it's acceptable. The key is utilizing CSS, and using a DOCTYPE with a DTD, to present a different UX based on the device. Work with an experienced coder who can structure the page template to put the content first, minimize scripts, etc. You want to score high on the Google PageSpeed test (https://developers.google.com/pagespeed/).
Then test across all the devices you think 80%+ of your visitors will be using (check your Google Analytics to profile browsers, OS, devices and resolutions).
BTW, here's what Google has to say about SEOmoz (which scores 83/100):
High priority. These suggestions represent the largest potential performance wins for the least development effort. You should address this item first:
Leverage browser caching
Medium priority. These suggestions may represent smaller wins or much more work to implement. You should address these items next:
Minimize redirects, Optimize images
Low priority. These suggestions represent the smallest wins. You should only be concerned with these items after you've handled the higher-priority ones:
Inline Small CSS, Enable compression, Defer parsing of JavaScript, Minify CSS, Specify a cache validator, Minify JavaScript, Minify HTML, Specify a character set, Optimize the order of styles and scripts, Remove query strings from static resources, Specify a Vary: Accept-Encoding header
-
Hi Niall
Responsive design seems to be everywhere now, and your point above touches on this. From a UX perspective there really isn't a perfect design that caters for every single display and user. Google Analytics allows you to track screen resolution, so I suggest tracking this for a while for any particular site; if multiple resolutions are common, think about designing to cater for them.
However, even though I am in website design, I tend to head for the safety of the middle ground and have not yet fully dived into the HTML5/responsive area, as most customers are not demanding it.
However, as the tablet and smartphone become the default devices rather than the fun ones, it may become an issue.
hope this helps
Kieran
Related Questions
-
Very different SERP rankings for different countries
Hi! A client of mine has a global presence, so we track SERP rankings for multiple Google domains: google.com, google.ie and google.co.uk, for example. One page ranks quite well on google.ie (position 11 right now) while at the same time ranking extremely badly on both google.com and google.co.uk (position 100+). How can that be? I did do some optimization (title and description, for example) before the rankings on .com and .co.uk went really bad (they were higher before, around position 30-40). But it is the same page Google indexes, yet the results are very different. How can this be? What can be done? Thanks!
Technical SEO | | JHultqvist
-
Add trailing slash after removing .html extension
My website is non-www; it has WordPress in a subdirectory and some static web pages in the root and another subdirectory.
1. I want to remove the .html extension from the web pages in the root and the other static web pages in the subdirectory.
2. Add a slash at the end.
3. 301 redirect from the non-slash URL to the URL with a slash.
So http://ghadaalsaman.com/articles.html should become http://ghadaalsaman.com/articles/ and http://ghadaalsaman.com/en/poem-list.html should become http://ghadaalsaman.com/en/poem-list/. The code below instead 1. works with no slash at the end and 2. 301 redirects URLs with a slash to the non-slash version. Here's my .htaccess:

<IfModule mod_rewrite.c>
Options +FollowSymLinks -MultiViews
RewriteEngine On
RewriteBase /

# remove trailing slash
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)/$ $1 [R=301,L]

# www to non-www
RewriteCond %{HTTP_HOST} ^www\.(([a-z0-9_]+\.)?domain\.com)$ [NC]
RewriteRule .? http://%1%{REQUEST_URI} [R=301,L]

# serve .html files without the extension
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^([^.]+)$ $1.html [NC,L]

# index redirect
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /index\.html\ HTTP/
RewriteRule ^index\.html$ http://ghadaalsaman.com/ [R=301,L]

# strip .html from requested URLs
RewriteCond %{THE_REQUEST} \.html
RewriteRule ^(.*)\.html$ /$1 [R=301,L]
</IfModule>

PS: everything is OK with WordPress; the problems are with the static pages only. Thanks in advance.
Technical SEO | | Uber_
-
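For the goal stated in the question above (/articles.html 301-redirecting to /articles/, which then serves articles.html internally), one possible shape for the rules, offered as an untested sketch for Apache mod_rewrite rather than a drop-in fix:

```apache
RewriteEngine On

# 1. 301 redirect /foo.html (as originally requested) to /foo/
RewriteCond %{THE_REQUEST} \s/([^.\s?]+)\.html[\s?] [NC]
RewriteRule ^ /%1/ [R=301,L]

# 2. Internally rewrite /foo/ back to foo.html when that file exists
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{DOCUMENT_ROOT}/$1.html -f
RewriteRule ^(.+?)/$ $1.html [L]
```

Testing with redirect checking disabled in the browser cache is advisable, since 301s are cached aggressively.
-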
3 Different Websites but Same Keywords
One of my clients is targeting the same 5 keywords across 3 sites. The domain registrar and web hosting are the same for all 3 sites.
Site A - 50.72.134.29
Site B - 50.72.140.227
Site C - 50.72.19.70
Some time ago, rankings dropped, but I don't know if it was because of the above. Is it OK? What is the best way to target the same keywords across 3 different sites?
Technical SEO | | krishnaxz
-
Merge two different domains into one
Hi all SEO folks. We want to merge two different domains into one. The products are similar to each other: domain1 is the online product and domain2 is the offline version of the product (a newly launched product). Both domains have a domain authority of 47, but domain1 is optimized for the online product and most (branded) keywords are in the top 3, whereas domain2 was used for another concept that has now been closed down (beginning of May) and replaced with the offline product. To boost sales of the offline product, we want to move the content from domain1 (online product) to domain2 (offline product), and of course we want to keep the top-3 rankings. So this is what I know I have to do: 301 redirect from each old content URL to its new content URL, and contact external domains to ask them to link to the new domain. Furthermore, I read that it is important to place everything from the old site in exactly the same way on the new site (and keep the content the same), and that way preserve the internal URL structure/link juice. This is where my question comes in: if we move all the content to exactly the same locations on the new site, there will be far too many links on the front page, and the new offline product won't have the dominating position.
Do any of you have experience moving the content from domain1.com to domain2.com/online while still keeping the SERP rankings, or would you always recommend moving the content from root to root?
PS: Sorry about my English; it's not my main language. I hope you understand my question anyway :).
Technical SEO | | Bulpen
-
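The 301 step described above is usually done with a host-wide rule that preserves the path, so each old URL maps one-to-one onto its counterpart on the new domain. A minimal sketch for Apache, with placeholder domain names standing in for the real ones:

```apache
# Redirect every URL on domain1.com to the same path on domain2.com
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?domain1\.com$ [NC]
RewriteRule ^(.*)$ https://domain2.com/$1 [R=301,L]
```

Redirecting into a subfolder instead (domain2.com/online/...) would just mean adding that prefix to the substitution; root-to-root keeps the mapping simplest.
-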
HTML Encoding Error
Okay, so this is driving me nuts, because I should know how to find and fix this but for the life of me cannot. One of the sites I work on has a long-standing crawl error in Google WMT for the URL /a%3E, which appears on nearly every page of the site. I know that a%3E is an improperly encoded >, but I can't seem to find where exactly in the code it's coming from. So I keep putting it off and coming back to it every week or two, only to rack my brain and give up on it after about an hour (since it's not a priority and it's not really hurting anything). The site in question is https://www.deckanddockboxes.com/ and some of the pages it can be found on are /small-trash-can.html, /Dock-Step-Storage-Bin.html, and /Standard-Dock-Box-Maxi.html (among others). I figured it was about time to ask another set of eyes to look at this for me. Any help would be greatly appreciated. Thanks!
Technical SEO | | MikeRoberts
-
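One way to hunt this down, as a sketch: a crawled path like /a%3E usually means a literal ">" (often from a stray "</a>") ended up inside an href value somewhere in the markup or a template, and got URL-encoded. A small check along these lines can flag candidate lines when run over the site's HTML (the sample lines here are made up for illustration):

```javascript
// Flag href attributes whose value contains a stray ">" — when crawled,
// such values get URL-encoded, producing paths like /a%3E.
const badHref = /href="[^"]*>[^"]*"/;

const sampleLines = [
  '<a href="/small-trash-can.html">Trash Can</a>', // fine
  '<a href="</a>">broken link</a>',                // crawls as /a%3E
];

const flagged = sampleLines.filter((line) => badHref.test(line));
```

Running the same pattern over the page source and the templates of the affected pages should narrow down the offending tag.
-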
Can I optimize two different pages with very similar keywords without hurting SEO?
Hi there, I have often heard that you cannot have multiple pages rank for the same keyword. My question here is more about long-tail keywords that repeat the same keyword phrase on different pages. For example: I have two web pages with different content. I want one page (the homepage) to rank for the more generic term, such as "innovation management", and a supporting page to rank for "innovation management software". Will Google see these two web pages as competing? Should I avoid repeating the more general term in the phrase? Has anyone ever seen SEO results decline when doing this? I don't believe this is duplicate content, since the pages hold completely different copy and assets, but I am not sure whether the repeated phrase in the title tags will flag anything to the search engines.
Technical SEO | | Scratch_MM
-
Basic SEO HTML
Hello Everyone, One place I am weak is coding for SEO, and I need to get better. One question I do have: can anyone explain why it's important to place CSS and JavaScript in external files? How do you do this, and how do you know if it's already being done? If it hasn't been done on a site, is it hard to go back and do? I understand this is important as a site load-time issue. Thanks, Bill. P.S. Can anyone recommend a resource where I can learn proper HTML coding for SEO? Thank you!
Technical SEO | | wparlaman
-
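On the external-files question above, a sketch of the difference (file paths here are made up): inline styles and scripts are re-sent as part of every HTML response, while external files can be downloaded once and reused from the browser cache across pages, which is where the load-time benefit comes from. You can check whether a site already does this by viewing the page source: long <style> or <script> blocks mean inline; <link> and <script src> tags mean external.

```html
<!-- Inline: re-sent with every page that includes it -->
<style>
  h1 { color: #333; }
</style>

<!-- External: fetched once, then served from the browser cache -->
<link rel="stylesheet" href="/css/site.css">
<script src="/js/site.js"></script>
```

Moving existing inline code into external files is usually mechanical: cut the block contents into a .css/.js file and replace the block with a reference like the above.
-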
Different version of site for "users" who don't accept cookies considered cloaking?
Hi, I've got a client with lots of content that is hidden behind a registration form; if you don't fill it out, you cannot proceed to the content. As a result it is not being indexed. No surprises there. They are only doing this because they feel it is the best way of capturing email addresses, rather than because they need to "protect" the content. Currently, users arriving on the site are redirected to the form if they have not previously had a "this user is registered" cookie set. If the cookie is set, they aren't redirected and get to see the content. I am considering changing this logic to redirect users to the form only if they accept cookies but haven't got the "this user is registered" cookie. The idea is that search engines would then not be redirected and would index the full site, not the dead-end form. From the client's perspective, this would mean only the very few non-registered visitors who refuse cookies would "avoid" the form, yet search engines are arguably not being treated as a special case. So my question is: would this be considered cloaking and/or put the site at risk in any way? (They would prefer not to go down the First Click Free route, as this would lower their email sign-ups.) Thank you!
Technical SEO | | TimBarlow
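The proposed rule in the question above reduces to a tiny predicate, sketched here with hypothetical names: redirect to the form only when the visitor demonstrably accepts cookies but lacks the registered-user cookie. Crawlers generally don't persist cookies, so under this rule they fall through to the content rather than the form:

```javascript
// Hypothetical predicate for the proposed redirect logic.
function shouldRedirectToForm(acceptsCookies, hasRegisteredCookie) {
  // Crawlers and cookie-refusing visitors have acceptsCookies === false,
  // so they are never redirected and always see the content.
  return acceptsCookies && !hasRegisteredCookie;
}
```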