SEO & App
-
Hi Moz Community,
We have a client whose website is currently ranking well in organic search, and we are nearing completion of an app for them. A fair bit of the content is the same across both the app and the website.
I have three questions:
1. Do we want the app to get indexed?
2. How do we avoid duplicate content between the website and the app?
3. Do the URLs for the website and the app need to be the same, or should they be different?
-
It depends on the app's nature and purpose. If the app provides unique content or functionality that adds value for search engine users, having it indexed could help attract organic traffic. If the app simply mirrors the website's content without offering anything substantially different, you may want to keep it out of the index to prevent duplicate content issues.
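For instance, if the app's screens are also served as standalone web pages that merely mirror the main site, a robots meta tag on those mirrored pages keeps them out of the index. A minimal sketch (the scenario is hypothetical; adapt to how your app content is actually served):

    <!-- On an app-only page that duplicates existing website content -->
    <meta name="robots" content="noindex, follow">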
-
To avoid duplicate content issues, ensure that the content on both the website and the app is unique and offers value to users. Where content must be shared between the two, implement canonical tags on the website to tell search engines which URL is the preferred source. You can also add rel="alternate" annotations to your web pages that point to the corresponding app deep links, signalling to search engines that the same content is available in both formats.
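As a sketch of what those annotations might look like on a web page, using Google's app-indexing URI scheme (the domain, paths, Android package name, and iTunes ID below are all placeholders):

    <!-- On https://www.example.com/products/widget -->
    <link rel="canonical" href="https://www.example.com/products/widget" />
    <!-- Equivalent screens in the Android and iOS apps -->
    <link rel="alternate" href="android-app://com.example.app/https/www.example.com/products/widget" />
    <link rel="alternate" href="ios-app://123456789/https/www.example.com/products/widget" />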
-
It's generally recommended to have separate URLs for the website and the app, as they serve different purposes and may have different structures. However, you should establish a connection between them using deep linking: deep links let you send users directly to specific content within the app from the website (and vice versa), enhancing the user experience and improving cross-platform visibility.
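One concrete example of this on iOS: an Apple Smart App Banner meta tag on the web page offers users who have the app installed a one-tap route to the matching screen. The app ID and URL below are placeholders:

    <!-- Apple Smart App Banner; app-argument carries the deep-link target -->
    <meta name="apple-itunes-app" content="app-id=123456789, app-argument=https://www.example.com/products/widget">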
In summary, consider the unique value proposition of the app, ensure content uniqueness, and establish a coherent cross-platform linking strategy to optimize SEO performance while avoiding duplicate content issues.