App indexing with several subdomains
-
Hi,
We are currently in the process of getting our app developed. From an SEO point of view I want the app's content to get indexed, and I am currently trying to put together a step-by-step guide for it.
I've come across this helpful guide: https://moz.com/blog/how-to-get-your-app-content-indexed-by-google but need to go a bit deeper now.
The questions I have relate to the "apple-app-site-association" file.
As far as I could find out, the JSON in that file has the following syntax:
{
  "applinks": {
    "apps": [],
    "details": [
      {
        "appID": "9JA89QQLNQ.com.my.bundle.id",
        "paths": [ "/myPath1", "/myPath2/*" ]
      }
    ]
  }
}
My question is about the key "appID".
I'm wondering what the entry would look like if I have 10 country subdomains and want to make sure that a deep link clicked in the browser goes to the right subdomain page within the app.
Current website structure:
uk.example.com/english-path/
de.example.com/german-path/
Many thanks in advance.
-
As far as I am aware, the association file is just about which paths the app can handle - I think that if you register the root domain and handle all paths then it will handle all paths across all sub-domains (but you should test this).
When you are actually linking individual web pages to the app via universal links, the fully qualified URL, including protocol and subdomain, is used, so it should work fine (see this documentation).
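To make that concrete, here is a minimal sketch of how this could look with country subdomains. It is based on my understanding of how universal links generally behave rather than anything specific to your setup, so treat it as a starting point: each subdomain serves its own apple-app-site-association file, the appID (team ID plus bundle ID) stays the same everywhere, and only the paths differ. The paths below are illustrative, taken from the site structure in the question.
Served at https://uk.example.com/apple-app-site-association:
{
  "applinks": {
    "apps": [],
    "details": [
      { "appID": "9JA89QQLNQ.com.my.bundle.id", "paths": [ "/english-path/*" ] }
    ]
  }
}
Served at https://de.example.com/apple-app-site-association:
{
  "applinks": {
    "apps": [],
    "details": [
      { "appID": "9JA89QQLNQ.com.my.bundle.id", "paths": [ "/german-path/*" ] }
    ]
  }
}
In the app itself, each subdomain would then be listed in the Associated Domains entitlement (for example applinks:uk.example.com and applinks:de.example.com, or a single wildcard such as applinks:*.example.com to cover all ten), and the app decides which screen to open from the full URL, subdomain included, that iOS hands it when a link is tapped. As noted above, test this against Apple's current documentation before relying on it.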
Related Questions
-
Must internally linked pages from a different subdomain be well optimised?
Hi all, We have guide/help pages on a different subdomain (help.website.com). And we have linked these from 3rd hierarchy level pages of our website (website.com/folder1/topic2). But the help.website.com subdomain & pages are not well optimised. So, I am not sure whether linking to these subdomain pages from our website pages hurts our rankings? Thanks,
-
How to prevent development website subdomain from being indexed?
Hello awesome MOZ Community! Our development team uses a sub-domain "dev.example.com" for our SEO clients' websites. This allows changes to be made to the dev site (U/X changes, forms testing, etc.) for client approval and testing. An embarrassing discovery was made. Naturally, when you run a "site:example.com", the "dev.example.com" pages are being indexed. We don't want our clients' websites to get penalized or lose killer SERPs because of duplicate content. The solution that is being implemented is to edit the robots.txt file and block the dev site from being indexed by search engines. My question is, does anyone in the MOZ Community disagree with this solution? Can you recommend another solution? Would you advise against using the sub-domain "dev." for live and ongoing development websites? Thanks!
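For reference, a minimal sketch of the robots.txt approach described above, assuming it is served only from the dev subdomain while the production site keeps its own unrestricted file; the directives and subdomain are illustrative:
# https://dev.example.com/robots.txt
User-agent: *
Disallow: /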
-
Bing Indexation and handling of X-ROBOTS tag or AngularJS
Hi Moz Community, I have been tearing my hair out trying to figure out why Bing won't index a test site we're running. We're in the midst of upgrading one of our sites from archaic technology and infrastructure to a fully responsive version.
This new site is fully AngularJS driven. There are currently over 2 million pages, and as we develop the new site in the backend we would like to test out the tech with Google and Bing. We're looking at a pre-render option to create static HTML snapshots of the pages we care about the most, which will be available in the sitemap.xml.gz.
We set up 3 completely static HTML control pages: one with no robots metatag on the page, one with the robots NOINDEX metatag in the head section, and a third with a dynamic header (X-Robots-Tag) carrying the NOINDEX directive as well. We expected the one without the meta tag to at least get indexed along with the homepage of the test site. In addition to those 3 control pages, we had 3 further pages: an internal search results page with the dynamic NOINDEX header, a listing page with no such header, and the homepage with no such header.
With Google, the correct indexation occurred, with only 3 pages being indexed: the homepage, the listing page and the control page without the metatag. However, with Bing, there's nothing. No page indexed at all. Not even the flat static HTML page without any robots directive. I have a valid sitemap.xml file and a robots.txt open to all engines across all pages, yet nothing. I used the fetch as Bingbot tool, the SEO Analyzer tool and the Preview Page tool within Bing Webmaster Tools, and they all show a preview of the requested pages, including the ones with the dynamic header asking it not to index those pages.
I'm stumped. I don't know what to do next to understand whether Bing can accurately process dynamic headers or AngularJS content. Upon checking BWT, there's definitely been crawl activity, since it marked the XML sitemap as successful and put a 4 next to the number of crawled pages. Still no result when running a site: command, though. Google responded perfectly and understood exactly which pages to index and crawl. Has anyone else used dynamic headers or AngularJS who might be able to chime in, perhaps after running similar tests? Thanks in advance for your assistance.
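For context, a minimal sketch of what the dynamic noindex header described above typically looks like in the raw HTTP response, alongside the in-page metatag variant; the status line and content type are illustrative:
HTTP/1.1 200 OK
Content-Type: text/html; charset=UTF-8
X-Robots-Tag: noindex
In-page equivalent on the metatag control page:
<meta name="robots" content="noindex">
-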
Privacy Policy: index it? And where to place it?
Hi Everyone, Two questions. First: should you allow Google to index your privacy policy? Second: for a service-based site (not e-commerce, not selling anything), should you put the policy in the footer so it's site-wide, or just on the "contact us" form page? Best, Ruben
-
Subdomain or Start over?
I have a site that is currently set up with geo-targeted subdomains, so each geo is princeton.site.com. Each site has the same code and design, but the events listed are different. There are about 30 subdomains and they are looking to expand nationally. After the Panda update, it looks like they should stay as subdomains rather than redirecting everything or setting up new domains. Any thoughts on whether the subdomains should stay and expand exponentially? SEO strategy, etc.? I'm also worried about the possibility of duplicate content. Thanks!
-
Duplicate home page /index.asp /index.php etc
We recently moved www.devoted2vintage.co.uk to Shopify but seem to have multiple home page variants still in Google's index. I am concerned that these will be causing duplicate content. I have redirected the offending URLs below to www.devoted2vintage.co.uk/ and have set up a canonical URL, but need an expert to tell me whether I have taken the correct steps and, if not, exactly what I need to do.
www.devoted2vintage.co.uk/index.php
www.devoted2vintage.co.uk/index.htm
www.devoted2vintage.co.uk/index.html
www.devoted2vintage.co.uk/index.shtml
www.devoted2vintage.co.uk/index.aspx
www.devoted2vintage.co.uk/index.cfm
www.devoted2vintage.co.uk/index.pl
www.devoted2vintage.co.uk/index.asp
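For illustration, a minimal sketch of the two measures described above; the redirect mechanism itself depends on the platform, so the exchange below is just the generic shape of a 301, and the protocol and trailing slash are assumptions:
GET /index.php HTTP/1.1
Host: www.devoted2vintage.co.uk

HTTP/1.1 301 Moved Permanently
Location: http://www.devoted2vintage.co.uk/
Canonical tag in the head of the home page:
<link rel="canonical" href="http://www.devoted2vintage.co.uk/" />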
-
Google indexing Quickview popups
Hi guys, I can't seem to find any info on this. Maybe you can help. We are using xcart as our shopping cart. When you land on a product page you have the option to "Quickview" the item. Google is picking up the Quickview URLs and the vote URLs on products. I have added the following to the robots.txt file but am not sure if this will work. Any help on this would be great.
Disallow: /?popup=Y
Disallow: /?mode=add
Undesired URL examples:
http://www.funlove.com/store/6_Pack_Shooter_Beer_Belt/?mode=add_vote&vote=60
http://www.funlove.com/store/6_pack_shooter_beer_belt/?popup=Y
-
What would you do with the subdomains: keep them or remove them?
A client of ours has this website: http://www.losestores.com/. It is not very good in design, but it has good products.
He is having a problem with SEO: he is on page 2 for the keyword "estores". His domain authority is 27, which is higher than that of other domains ranking on page one. The question is the following: since the website went live, the programmer who created it has used a subdomain per section:
http://estores.losestores.com/
http://cortinas.losestores.com/
http://venecianas.losestores.com/
and so on. This site is not a very big one. Would you remove the subdomains and do the following instead?
www.losestores.com/estores
www.losestores.com/cortinas
What would be best? Thanks, Victoria