Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Domain Masking SEO Impact
-
I hope I am explaining this correctly; if I need to provide any clarity, please feel free to ask. We currently use a domain mask on an external platform that points back to our site. We are a non-profit, and the external site allows users to create peer-to-peer fundraisers that benefit our ministry. Currently we get many meta issues related to this site, as well as broken links when fundraisers expire, etc. We do not have a need to rank for the information from this site. Is there a way to handle these pages so that they are not part of the search engine crawls of our site?
-
Glad to be of service!

-
Thank you for this response. This helps me a ton as I discuss with our web team the best way to set up the code for the site so that we are not registering errors but also not hurting the actual site in any way.
-
This is a good idea, but robots.txt stops pages from being crawled - it doesn't stop pages from being indexed. For that you need to fire a meta noindex directive on the affected URLs. If you can't edit their code, you can fire the same directive through the HTTP response header via X-Robots-Tag. If possible, you could also alter those URLs to serve status code 410 (Gone) so that Google knows those URLs aren't really on your site.
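As a minimal sketch only (assuming an Apache server and a hypothetical /fundraise/ path for the masked pages - adjust both to your setup), the noindex signal can be sent either in the page markup or via the X-Robots-Tag response header, and expired fundraiser URLs can be answered with a 410:

<!-- Option 1: meta noindex in the head of each affected page -->
<meta name="robots" content="noindex">

# Option 2 (.htaccess, Apache 2.4): send the same directive as an HTTP header
# for everything under the hypothetical /fundraise/ section
<If "%{REQUEST_URI} =~ m#^/fundraise/#">
    Header set X-Robots-Tag "noindex"
</If>

# Optional: answer expired fundraiser URLs with 410 Gone
RedirectMatch gone ^/fundraise/expired/

Either of the first two on its own is enough; the header route is mainly useful when you can't touch the page templates.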
Note that you'll need to make the changes on the 'affected' site, not the site which is the 'source' of the masked pages / data. If you make the changes on the source site instead, that site will have all of its Google traffic killed as well (and they'll probably want to punch you!).
I recommend that you lead with hard signals and directives which stop Google indexing the pages on the 'affected' site (the one receiving the masked URLs / content that you don't want to rank). Once the pages fall out of Google's index, you can then swoop in behind and add the robots.txt rules (sketched below) to stop them ever coming back.
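Once those URLs have actually dropped out of the index, a robots.txt rule along these lines (again assuming the hypothetical /fundraise/ path) keeps them from being recrawled:

# robots.txt on the affected site - add this only after the pages have been de-indexed
User-agent: *
Disallow: /fundraise/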
-
Depends on the CMS you use. With many CMSs you can click a quick setting to keep pages out of Google's index.
If that isn't an option, do it through a robots.txt file (you can test it in Google's webmaster tools).
Related Questions
-
URL Structure On Site - Currently it's domain/product-name, NOT domain/category/product-name. Is this bad?
I have an eCommerce site and the URL structure is domain/product-name rather than domain/product-category/product-name. Do you think this will have a negative impact SEO-wise? I have seen that some of my individual product pages do get better rankings than my categories.
Technical SEO | the-gate-films
-
Schema for Banks and SEO
I'm researching Schema opportunities for a bank, but besides the schema markup available today (like bankorcreditunion) and developments with FIBO, I can find no answer as to the effect of tagging interest rates and such in terms of SERP/CTR performance or visibility. Does anyone have a case study to share or some insight on the matter?
Technical SEO | Netsociety
-
How Does Dynamic Content for a Specific URL Impact SEO?
Example URL: http://www.sja.ca/English/Community-Services/Pages/Therapy Dog Services/default.aspx The above page is generated dynamically depending on what province the visitor visits from. For example, a visitor from BC would see something quite different than a visitor from Nova Scotia; the intent is that the information shown should be relevant to users in that province. How does this affect SEO? How (or from what location) does Googlebot decide to crawl the page? I have considered a subdirectory for each province, though that comes with its challenges as well. One such challenge is duplicate content, since different provinces may have the same information for some pages. Any suggestions for this?
Technical SEO | ey_sja
-
Correct linking to the /index of a site and subfolders: what's the best practice? Link to domain.com/ or domain.com/index.html?
Dear all, starting with my .htaccess file:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www.inlinear.com$ [NC]
RewriteRule ^(.*)$ http://inlinear.com/$1 [R=301,L]
RewriteCond %{THE_REQUEST} ^.*/index\.html
RewriteRule ^(.*)index\.html$ http://inlinear.com/ [R=301,L]
1. I redirect all URL requests with www. to the non-www version...
2. All requests with "index.html" will be redirected to "domain.com/"
My questions are:
A) When linking from a page to my frontpage (home), is the best practice to link to "http://domain.com/" and NOT to "http://domain.com/index.php"?
B) When linking to the index of a subfolder "http://domain.com/products/index.php", I should also link to "http://domain.com/products/" and not include the index.php, right?
C) When I define the canonical URL, should I define it just as "http://domain.com/products/", or should I in this case point to the actual file "http://domain.com/products/index.php"?
Are A) and B) the best practice? And C)? Thanks for all replies! 🙂
Holger
Technical SEO | inlinear
-
Does Bitly hurt your SEO?
I often use bit.ly or the Google URL shortener in links when other websites post my articles so I can track clicks. However, I am thinking this may HURT my SEO, given that it is taking away a backlink to my website. Is that logic correct? If so, what is a good way to track clicks when a website posts your article without jeopardizing the SEO value?
Technical SEO | StreetwiseReports
-
SEO Benefit from Redirecting New Exact Match Domains?
Hi, All! This is a question asked in the old Q&A section, but the answer was a little ambiguous and it was about 3 years ago, so I decided to repost and let the knowledgeable SEO public answer... From David LaFerney: "It's clear that it's much easier to get high rankings for a term if your domain is an exact match for the query. If you own several such domains that are very related such as – investmentrealestate.com, positivecashflow.com, and rentalproperty.com – would you be able to benefit from those by 301ing them to a single site, or would you have to maintain separate sites to help capture those targeted phrases? In a nutshell – SEO wise, is it worth owning multiple domains to exactly match valuable search phrases? Or do you lose the exact match benefit when you redirect?" To clarify: redirecting an old domain with lots of history and links to a new exact match domain seems to carry SEO benefit. (You get links + exact match domain, approximately.) But the other way around? Redirecting a new exact match domain to an older domain with links? Does that do anything for the ranking of the old domain for the exact match keyword? Or absolutely nothing? (My impression has been that it's nothing, but the question came up for a client and I just wanted to make sure I wasn't missing something.) Thanks in advance!
Technical SEO | debi_zyx
-
Using hyphenated sub-domains or non-hyphenated sub-domains? That is the question! Any takers?
For our corporate business-level domain, we are exploring using a hyphenated sub-domain for a project. Something like www.go-figure.extreme.com I thought from a user perspective it seems cluttered. The domain length might also be an issue with the new algorithm big G has launched in the recent past. I know from past experience that hyphenated domains usually take longer to index, as they are used by spammers more frequently and can take longer to get out of the supplementary index. Our company site has over 90 million viewers / year, so our brand is well established and traffic isn't an issue. This is for a corporate-level project and I didn't have the answer! Will this work? Anyone have any experience testing this? Any thoughts will help! Thanks, Rob
Technical SEO | RobMay
-
Multiple Domains, Same IP address, redirecting to preferred domain (301) - site is still indexed under wrong domains
Due to acquisitions over time and the merging of many microsites into one major site, we currently have 20+ TLDs pointing to the same IP address as our "preferred domain" for our consolidated website http://goo.gl/gH33w. They are all set up as 301 redirects on Apache - including both the www and non-www versions. When we launched this consolidated website (April 2010), we accidentally left the settings of our site open to accept any of our domains on the same IP. This was later fixed, but unfortunately Google indexed our site under several of these URLs (ignoring the redirects), using the same content from our main website but swapping out the domain. We added some additional redirects on Apache to redirect the individual pages indexed under the wrong domain to the same page under our main domain http://goo.gl/gH33w. This seemed to help resolve the issue and moved hundreds of pages off the index. However, in December of 2010 we made significant changes in our external DNS for our IP addresses, and since December we have seen pages indexed under these redirecting domains on the rise again. If you do a search query of site:laboratoryid.com you will see a few hundred examples of pages indexed under the wrong domain. When you click on the link, it does redirect to the same page but under the preferred domain. So the redirect is working and has been confirmed as a 301, but for some reason Google continues to crawl our site and index it under these incorrect domains. Why is this? Is there a setting we are missing? These domain-level and page-level redirects should be decreasing the pages being indexed under the wrong domain, but it appears to be doing the reverse. All of these old domains currently point to our production IP address, where our preferred domain is also pointing. Could this be the issue? None of the pages indexed today are from the old version of these sites. They only seem to be the new content from the new site, but not under the preferred domain. Any insight would be much appreciated because we have tried many things without success to get this resolved.
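For reference, the kind of host canonicalisation described above (every non-preferred hostname 301-redirecting to the single preferred domain) is typically expressed in Apache along these lines - the hostname below is a placeholder, not taken from the question:

# .htaccess sketch: send any request on a non-preferred hostname to the preferred domain
RewriteEngine On
RewriteCond %{HTTP_HOST} !^preferred-domain\.com$ [NC]
RewriteRule ^(.*)$ http://preferred-domain.com/$1 [R=301,L]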
Technical SEO | sboelter