Domain and Sitemap Question
Hi - I am hoping you can help me with an issue we are currently trying to solve. We are hosting our mobile site's content on a different domain than the site's URL, though both are owned by the same company. In Google Webmaster Tools we have the mobile sitemap under "sitemaps.xyz.com", but the URL of the site is "m.xyz.com".
We have submitted 60MM pages in the mobile sitemap, but only 1MM pages have been indexed. Do you think this setup confuses the bots? Does it affect the crawlability of the site?
Any thoughts would be greatly appreciated.
Thank you!
Eva -
Having separate URLs/domains for your mobile site is OK, but not ideal. See https://developers.google.com/webmasters/smartphone-sites/details If possible, I would switch to a responsive design.
Otherwise, make sure you've set up the annotations as recommended at the above URL.
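For reference, the bidirectional annotations Google documents for separate mobile URLs look roughly like this (a sketch using a hypothetical page path on the domains from the question; the 640px breakpoint is Google's example value, not a requirement):

```html
<!-- On the desktop page, e.g. http://www.xyz.com/page-1 -->
<link rel="alternate"
      media="only screen and (max-width: 640px)"
      href="http://m.xyz.com/page-1">

<!-- On the corresponding mobile page, http://m.xyz.com/page-1 -->
<link rel="canonical" href="http://www.xyz.com/page-1">
```

The pairing tells Googlebot the two URLs are one piece of content, which is exactly what a split like sitemaps.xyz.com vs. m.xyz.com otherwise obscures.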
Related Questions
301 Question - issue
A while back we had a 'bleed' on one of our sites: one of our sites started to leak pages across to another, and that second site started to rank for the same pages, so now we have hundreds of pages ranking for URLs that do not exist. It's hard to explain, bear with me. If you click the cached view in Google for the ranked page, it shows you the main site, but if you click the result as usual, you are taken to the site and served a 404, because the intended page was not for that site. We believe we fixed the 'bleed' and have set up 301s for all the affected pages to go to the home page of the site it affected. But these pages have not been removed from Google, which we thought a 301 would do. So we still have hundreds of pages being ranked that just redirect to the home page. Why haven't these pages been removed?
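One thing worth noting: a 301 only takes effect per URL as Google recrawls each page, and a blanket redirect of many unrelated URLs to the home page can be treated much like a soft 404 rather than a clean consolidation. Assuming an Apache server (an assumption; the question doesn't say), the two styles would look like this in .htaccess:

```apache
# Per-URL 301 to the closest equivalent page (generally preferable)
Redirect 301 /old-page.html http://www.example.com/closest-match.html

# Blanket 301 of everything to the home page (the current setup)
RedirectMatch 301 ^/.*$ http://www.example.com/
```

The domain and paths here are hypothetical placeholders, not taken from the question.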
Intermediate & Advanced SEO | JH_OffLimits
This url is not allowed for a Sitemap at this location error using pro-sitemaps.com
Hey guys, We are using the pro-sitemaps.com tool to automate the sitemaps on our properties, but some of them give the error "This url is not allowed for a Sitemap at this location" for all of their URLs. The strange thing is that not all of the properties show the error, and most already have all their URLs indexed. Do you have any experience with the tool, and what is your opinion? Thanks
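For context, the sitemaps.org protocol has a location rule that commonly triggers this error: a sitemap may only list URLs on the same scheme and host, at or below the directory where the sitemap file itself lives. A minimal sketch of that check (my own illustration, not pro-sitemaps.com's actual validation):

```python
from urllib.parse import urlparse

def allowed_in_sitemap(sitemap_url: str, page_url: str) -> bool:
    """Apply the sitemaps.org location rule: a sitemap may only
    contain URLs on the same scheme/host, and only at or below
    the directory the sitemap file is served from."""
    sm, pg = urlparse(sitemap_url), urlparse(page_url)
    # Scheme and host (including subdomain and port) must match exactly
    if (sm.scheme, sm.netloc) != (pg.scheme, pg.netloc):
        return False
    # Directory containing the sitemap file, e.g. "/catalog/"
    base = sm.path.rsplit("/", 1)[0] + "/"
    return pg.path.startswith(base)
```

If the failing sitemaps are served from a subdirectory, or list URLs from a different subdomain, every URL in them would be rejected under this rule, which matches the "all the urls" symptom.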
Intermediate & Advanced SEO | lgrozeva
Migration Challenge Question
I work for a company that recently acquired another company, and we are in the process of merging the brands. Right now we have two websites; let's call them: www.parentcompanyalpha.com and www.acquiredcompanyalpha.com. We are working with a web development company that is designing our brand new site, which will launch at the end of September; we can call that www.parentacquired.com. Normally it would be simple enough to just 301 redirect all content from www.parentcompanyalpha.com and www.acquiredcompanyalpha.com to the mapped migrated content on www.parentacquired.com. But that would be too simple. The reality is that only 30% of www.acquiredcompanyalpha.com will be migrating over, as part of that acquired business is remaining independent of the merged brands and might be sold off. So someone over there mirrored www.acquiredcompanyalpha.com and created an exact duplicate at www.acquiredcompanybravo.com. So now we have duplicate content for that site out there (I was unaware they were doing this now; we thought they were waiting until our new site launched). Eventually we will want some of the content from acquiredcompanyalpha.com to redirect to acquiredcompanybravo.com and the remainder to parentacquired.com. What is the best interim solution to maintain as much of the domain value as possible? The new site won't launch until the end of September, and it could slip into October. I have two sites that are mirrors of each other, one with a domain value of 67 and the new one a lowly 17. I am concerned about the duplicate site dragging down that 67 score. I can ask them to use rel=canonical tags temporarily if both sites are going to remain until the Sept/Oct timeframe, but which way should they point? I am inclined to think the best result would be to have acquiredcompanybravo.com rel=canonical back to acquiredcompanyalpha.com for now, and when the new site launches, remove those and redirect as appropriate.
But will that have a long-term negative impact on acquiredcompanybravo.com? Sorry if this is convoluted; it is a little crazy, with people in different companies doing different things that are not coordinated.
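If the interim cross-domain canonical route is taken, the tag on each mirrored bravo page would point at its matching alpha URL, along these lines (the page path is a hypothetical placeholder):

```html
<!-- On http://www.acquiredcompanybravo.com/some-page/ -->
<link rel="canonical" href="http://www.acquiredcompanyalpha.com/some-page/">
```

Cross-domain rel=canonical is supported by Google and is a hint rather than a directive, so it can be swapped for 301s at launch without long-term commitment.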
Intermediate & Advanced SEO | Kenn_Gold
Page and Domain Authority
How much Page and Domain Authority do we need to look for to secure a backlink?
Intermediate & Advanced SEO | ross254sidney
Robots.txt Question
For our company website, faithology.com, we are attempting to block any URLs that contain a "?" to keep Google from seeing some pages as duplicates. Our robots.txt is as follows:
User-agent: *
Disallow: /*?
User-agent: rogerbot
Disallow: /community/
Is the above correct? We want them not to crawl any URL with a "?" in it, but we don't want to harm our own SEO. Thanks for your help!
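One way to sanity-check the rule: Python's stdlib robotparser does plain prefix matching and does not understand the `*` wildcard extension that Googlebot supports, so here is a minimal model of Googlebot-style path matching (a simplified illustration, not Google's actual implementation) applied to `Disallow: /*?`:

```python
import re

def rule_matches(rule: str, path: str) -> bool:
    """Minimal model of Googlebot-style robots.txt path matching:
    '*' matches any run of characters, a trailing '$' anchors the
    end of the URL, and everything else is a literal prefix."""
    anchored = rule.endswith("$")
    if anchored:
        rule = rule[:-1]
    # Escape literal characters, let '*' become '.*'
    pattern = ".*".join(re.escape(part) for part in rule.split("*"))
    pattern = "^" + pattern + ("$" if anchored else "")
    return re.search(pattern, path) is not None
```

Under this model `/*?` blocks `/page?id=1` but not `/page`, which is the intended behavior. One caveat worth checking separately: robots.txt blocks crawling, not indexing, so already-known "?" URLs may linger in the index; rel=canonical on the parameterized pages is often the safer duplicate-control tool.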
Intermediate & Advanced SEO | BMPIRE
Help needed for a domain
I have a small translation agency in Brazil (this website), totally dependent on SEM. We have been in business since 2007, and we were in the top position for many relevant keywords until the middle of 2011, when the rankings for the most important keywords started dropping. At that time, we believed we needed to redesign the old static website and replace it with a new, modern one, with fresh content and weekly updates, which we did; it's now hosted on Squarespace. I took care to keep the old links working with 301 redirections. When we made the transfer from the static site to Squarespace (Mar/2012, see the attachment), the ranking drop became even more serious. Today we have fewer than 50 unique visitors per day; a totally desperate situation! To make things worse, we received an alert from Google on 23/September/2012 about unnatural inbound links, but Google said that "As a result, for this specific incident we are taking very targeted action on the unnatural links instead of your site as a whole", so we thought we didn't need to worry about it. Google was correct: I had worked many hours to register our website in web directories, thinking there would be no problem since I was doing it manually. My conclusions are: Something happened prior to Mar/2012 that was making us lose territory; I just don't know what! The migration to Squarespace was a huge mistake: I lost control over the HTML, and Squarespace doesn't do a good job of optimizing pages for SEO. We were also hit by Penguin in September, but I believe this is not the main cause of the drop; we were already doing very badly by that time. My actions are: a) I generated a DTOX report and I'm trying to clean up the links marked as toxic. That's hard work! After that, I will submit a reconsideration request. b) I'm working on the site: improving internal link building for relevant keywords. I recently removed a "tag cloud" which I believe was hurting my SEO.
I also added some redirections that were missing. c) I'm trying to generate new content to improve link building to my site. d) I'm also considering no longer putting all my coins on this domain, and maybe starting a fresh new one. Yes, I'm desperate! 🙂 I would greatly appreciate hearing from you expert people! Thanks a lot
Intermediate & Advanced SEO | rodrigofreitas
Sitemaps: Alternate hreflang
Hi, some time ago I read that there is a limit of 50,000 URLs per sitemap file (so you need to create a sitemap index and separate files with 50,000 URLs each). [Source]. Now we are about to implement the hreflang link annotations in the sitemap [Source], and we don't know whether we have to count each alternate as a separate URL. We have 21 different, well-positioned domains (same name, different ccTLDs, slightly different content [varying currencies, taxes, some labels, etc.] depending on the target country), so the number of links per URL would be high. A) Shall we count each link alternate as a separate URL, or just the original ones? For example, if we have to count the link alternates, that would leave us 2,380 pages per sitemap, each with one original URL and 20 alternate links. (Always staying aware of the 50MB maximum file size.) B) We currently have one sitemap per domain. Keeping this, shall we generate one per domain using the matching domain as the original URL? Or would it be the same if we uploaded the same sitemap to every domain? Thanks
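For reference, a sitemap `<url>` entry carrying hreflang alternates uses the `xhtml:link` extension documented by Google; each entry lists the page itself plus every variant (the domains and path below are hypothetical placeholders, not the asker's):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.example.com/page/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="http://www.example.com/page/"/>
    <xhtml:link rel="alternate" hreflang="de" href="http://www.example.de/page/"/>
    <xhtml:link rel="alternate" hreflang="es" href="http://www.example.es/page/"/>
    <!-- ...one xhtml:link per ccTLD variant, 21 in this case... -->
  </url>
</urlset>
```

Note the 50,000 limit in the sitemaps.org protocol applies to `<url>` entries, while the alternates still count toward the uncompressed file-size cap, so both constraints need checking.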
Intermediate & Advanced SEO | marianoSoler98
Advanced Squidoo Question
Hi, I am looking for someone with a lot of experience building links to a money site using Squidoo. I have a ton of Squidoo lenses set up. I recently created backlink reports for a number of them to see if the Squidoo pages were appearing as links. They were not. Only one out of my 53 lenses is appearing. Tons of them are already featured lenses (not works in progress). What does it take for a Squidoo lens to become an active link in a link profile? Thanks, guys
Intermediate & Advanced SEO | danielblinman