SEO & App
-
Hi Moz Community,
We have a client whose website currently ranks well in organic search, and we are nearing completion of an app for them.
A fair amount of content is the same across both the app and the website.
I have three questions:

1. Do we want the app to get indexed?
2. How do we avoid duplicate content across the website and the app?
3. Do the URLs need to be the same or different for the website and the app?

-
It depends on the nature of the app and its purpose. If the app provides unique content or functionality that adds value to search engine users, then having it indexed could be beneficial for attracting organic traffic. However, if the app simply mirrors the content of the website without offering anything substantially different, you may want to avoid indexing it to prevent duplicate content issues.
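If the app exposes its mirrored content through ordinary web URLs (for example, via web views) and you decide against indexing them, those pages can carry a standard robots meta tag. This is a generic snippet, not specific to any particular platform or CMS:

```html
<!-- Keeps a mirrored page out of search engine indexes -->
<meta name="robots" content="noindex" />
```

Content served only inside the native app (not over web URLs) isn't crawled the same way, so this applies only where the app's content is also reachable on the web.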
-
To avoid duplicate content issues, make sure the content on the website and in the app each offers unique value to users. If certain content must be shared between the two, implement canonical tags on the website to indicate the preferred source of that content to search engines. You can also add rel="alternate" links on the web pages that point to the corresponding app deep links, signalling to search engines that the same content is available in the app as well as on the web.
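As an illustration (the example.com URLs and the com.example.app package name are placeholders, not real values), a web page would carry both tags like this:

```html
<!-- On the website's page: declare the preferred web URL -->
<link rel="canonical" href="https://www.example.com/products/widget" />
<!-- Point to the equivalent screen in a hypothetical Android app -->
<link rel="alternate" href="android-app://com.example.app/https/www.example.com/products/widget" />
```

The android-app:// URI follows the pattern package-name / scheme / host-and-path, mapping the web URL to the matching in-app screen.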
-
It's generally recommended to have separate URLs for the website and the app, as they serve different purposes and may have different structures. However, you can establish a connection between the website and the app using deep linking. Deep linking allows you to link directly to specific content within the app from the website and vice versa, enhancing the user experience and improving cross-platform visibility.
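One common way to enable deep linking on Android is App Links, verified by hosting an assetlinks.json file at https://www.example.com/.well-known/assetlinks.json. This is a minimal sketch; the package name and certificate fingerprint are placeholders you would replace with your app's real values:

```json
[
  {
    "relation": ["delegate_permission/common.handle_all_urls"],
    "target": {
      "namespace": "android_app",
      "package_name": "com.example.app",
      "sha256_cert_fingerprints": ["AA:BB:CC:..."]
    }
  }
]
```

iOS offers the equivalent Universal Links mechanism via an apple-app-site-association file hosted on the same domain.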
In summary, consider the unique value proposition of the app, ensure content uniqueness, and establish a coherent cross-platform linking strategy to optimize SEO performance while avoiding duplicate content issues.
-