How vital is it for a site to have a mobile site for mobile SEO?
-
With the exponential growth in mobile device sales and usage, and an expected 980% growth in mobile advertising next year, we at http://www.mobilewebsitegurus.com decided it was time to help companies create great-looking mobile websites that are user friendly and SEO friendly at affordable rates, with tons of features built in from the start. However, when selling our design, how important is it to have a GOOD mobile site, compared to a big one, for ranking on mobile devices? We heard that Google was thinking of only showing mobile sites on mobile devices. NOT TRUE. Then we read/heard that rankings were MUCH BETTER if you had a mobile site, but after a lot of research we found that NOT to be true either. On most sites there was NO difference. So what is the TRUTH about this? Or is it that different rankings for mobile and regular sites on mobile devices will happen, it just hasn't happened yet? ANY insight into this would be great, not only for us but for the entire SEO community. Thanks. ALSO, please add "Mobile SEO" to the "Topics" boxes below, since mobile SEO will grow in importance.
-
I'm not sure I understand the question, but along the way the original poster seems to be suggesting that a mobile strategy should always and everywhere be a high priority for every business. I'm not sure I agree with that premise.
I recently optimized for mobile the site of a client whose mobile traffic has doubled to 25% in the last few months. Much of the site's traffic is from affluent, educated 18-35 year old males who access the site daily for updated content. So it was kind of a no-brainer. We just rolled the mobile optimization into an overall site redesign. There is only one site.
But another client is business-to-business. Users access the site only from work, during business hours, Monday to Friday. It's a very tech-unsavvy user base, with over 65% on IE. Mobile traffic is so small it's hard to measure. The site is not optimized for mobile. We just did a site upgrade without optimizing for mobile; I recommended we wait another year.
That said, when building new sites from scratch these days, I would always optimize for mobile.
As to the question of whether you should build a .mobi version of a legacy site, my answer would mirror the one above: follow Google's recommendation and just have one site.
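Google's "one site" recommendation generally means responsive design: the same URL serves every device, and the layout adapts through CSS. A minimal sketch of the idea (the class names and 640px breakpoint are illustrative, not from any particular site):

```html
<!DOCTYPE html>
<html>
<head>
  <!-- Tell mobile browsers to use the device width instead of a zoomed-out desktop viewport -->
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <style>
    .nav     { float: left; width: 25%; }
    .content { float: left; width: 75%; }
    /* On narrow screens, stack the columns instead of floating them side by side */
    @media (max-width: 640px) {
      .nav, .content { float: none; width: 100%; }
    }
  </style>
</head>
<body>
  <div class="nav">Navigation</div>
  <div class="content">Content</div>
</body>
</html>
```

One URL, one set of content, one pool of links — nothing for the crawler to reconcile between a desktop and a mobile version.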
I'm trying to think of a situation where it would make sense to launch a mobile version of a legacy site with identical content....but I can't think of one.
-
If this is a legitimate question: proper tagging and display, fast page load times on mobile, and user engagement signals such as bounce rate will all be factors in helping Google decide to give one site better mobile rankings than another. Google is recommending one site for both web and mobile. So the mere fact that one site is purely mobile and another serves both web and mobile does not mean the mobile-only site should automatically rank better just because it's a .mobi or whatever.
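For sites that do keep a separate mobile URL (an m. subdomain, say) instead of one responsive site, the "proper tagging" Google documented is a pair of annotations linking the two versions so they are treated as one document rather than duplicates. Roughly (example.com is a placeholder):

```html
<!-- On the desktop page (http://www.example.com/page), point to the mobile version: -->
<link rel="alternate"
      media="only screen and (max-width: 640px)"
      href="http://m.example.com/page">

<!-- On the mobile page (http://m.example.com/page), point back to the desktop version: -->
<link rel="canonical" href="http://www.example.com/page">
```

With both annotations in place, ranking signals consolidate on the desktop URL while mobile users can still be served the mobile version.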