For a responsive site, what should be the lowest screen resolution for desktop?
-
Hello guys,
Can you please share in detail the screen resolutions I should define for my responsive site for desktop, tablet & mobile? Your inputs are very valuable to me.
Thanks!
Micey
-
I think it depends on what you wish to accommodate on the screen.
We have adopted a more mobile-based design from 300px to 767px, which makes certain items more prominent or stacks them vertically; then from 768px up to 1600px+ we serve our dedicated desktop version with all elements visible.
That way our desktop style generally works on the majority of tablets and larger screens, where more data is easier to absorb.
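As a rough illustration, a minimal CSS sketch of that split might look like the following. The 768px breakpoint comes from the post above; the class names and specific rules are hypothetical, not taken from any real stylesheet.

```css
/* Mobile-first base styles: applied from ~300px up */
.sidebar,
.secondary-nav {
  display: none;          /* hide non-essential elements on small screens */
}

.content {
  display: block;         /* stack content vertically */
  width: 100%;
}

/* Desktop/tablet styles: 768px and wider (up to 1600px+) */
@media (min-width: 768px) {
  .sidebar,
  .secondary-nav {
    display: block;       /* the "all elements visible" desktop version */
  }

  .content {
    display: flex;        /* lay content out in columns */
    max-width: 1600px;
    margin: 0 auto;
  }
}
```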
Hope that helps.
-
We have a minimum design width of 300px. Add 10 pixels to each side of that for margins, and we are at 320px.
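As a sketch of that arithmetic, assuming a 320px viewport (a common smallest phone width) and a hypothetical `.page` wrapper:

```css
/* 300px minimum content width plus 10px margins on each side = 320px total */
body {
  margin: 0;
}

.page {
  min-width: 300px;   /* minimum design width */
  margin: 0 10px;     /* 300 + 10 + 10 = 320px viewport */
}
```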
I'd like to know what people are using for a max desktop width.
Wikipedia will span the full width of any desktop monitor and is responsive down to at least 300px.
The Washington Post is about 1200px wide and is fully responsive down to about 300px, by abandoning columns.
The New York Times is about 1000px wide, but does not respond until the browser width gets down to about 500px, allowing content to be lost off the right side of the screen.
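For comparison, here is a hedged sketch of the approach described for the Washington Post: a layout capped at roughly 1200px that abandons its columns on narrow screens. The breakpoint, class names, and column proportions are assumptions for illustration, not taken from that site's actual stylesheet.

```css
/* Hypothetical multi-column layout capped at ~1200px */
.layout {
  max-width: 1200px;
  margin: 0 auto;
  display: grid;
  grid-template-columns: 2fr 1fr;   /* main content + sidebar */
  gap: 20px;
}

/* Below the breakpoint, abandon columns and stack everything */
@media (max-width: 767px) {
  .layout {
    grid-template-columns: 1fr;
  }
}
```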