Domain Masking SEO Impact
-
I hope I am explaining this correctly. If I need to provide any clarity please feel free to ask. We currently use a domain mask on an external platform that points back to our site. We are a non-profit, and the external site allows users to create peer-to-peer fundraisers that benefit our ministry. Currently we get many meta issues related to this site, as well as broken links when fundraisers expire, etc. We do not have a need to rank for the information from this site. Is there a way to handle these pages so that they are not part of the search engine crawls and index as it relates to our site?
-
Glad to be of service!
-
Thank you for this response. This helps me a ton as I discuss with our web team the best way to set up the code for the site so that we are not registering errors but also not hurting the actual site in any way.
-
This is a good idea, but robots.txt stops pages being crawled - it doesn't stop pages being indexed. For that you need to fire the meta robots noindex directive on the affected URLs. If you can't edit their code, you can fire the same directive through an HTTP header via X-Robots-Tag (on the linked post, you'll need to scroll down a little). If possible, you could also alter those URLs to serve status code 410 (Gone) so that Google knows those URLs aren't really on your site.
Note that you'll need to make the changes on the 'affected' site, not the site which is the 'source' of the masked pages / data. If you make the changes on the source site instead, that site will have all of its Google traffic killed as well (and they'll probably want to punch you!)
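To make that concrete, here's a minimal sketch of how those directives might be set on the affected site, assuming an Apache 2.4 server with mod_headers and mod_alias enabled; the /fundraisers/ path and the expired URL are hypothetical placeholders, so adapt them to wherever the masked pages actually live:

```apache
# Sketch only - assumes Apache 2.4; "/fundraisers/" and the expired URL are hypothetical.

# Send the noindex directive as an HTTP header for the masked pages
# (equivalent to <meta name="robots" content="noindex"> in the page <head>).
<IfModule mod_headers.c>
  <If "%{REQUEST_URI} =~ m#^/fundraisers/#">
    Header set X-Robots-Tag "noindex"
  </If>
</IfModule>

# Serve 410 (Gone) for fundraiser pages that have expired,
# so Google knows those URLs no longer exist on the site.
Redirect gone /fundraisers/expired-campaign-example
```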
I recommend that you lead with hard signals and directives which stop Google indexing the pages on the 'affected' site (the one receiving the masked URLs / content that doesn't want them to rank). Once the pages fall out of Google's index, you can then swoop in behind and put the robots.txt rules in place to stop them ever coming back.
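As a rough sketch of that second step, once the pages have dropped out of the index, the robots.txt on the affected site could then block crawling of the same hypothetical /fundraisers/ path (don't add this earlier, or Google will be blocked from ever seeing the noindex directive):

```
# robots.txt on the affected site - add only after the pages are de-indexed
User-agent: *
Disallow: /fundraisers/
```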
-
Depends on the CMS you use. Many CMSs have a quick setting you can click to keep pages out of Google search.
If that isn't an option, do it through a robots.txt file in webmaster tools.
Related Questions
-
Does having a sub-domain on a different server affect SEO?
I'm working with a company that has a hard-coded website on the root domain, and then a WordPress blog on a subdomain on a separate server. We're planning on implementing a hub and spoke model for their content, hosting the main hubs on the root domain and the linked articles on the blog. Is having the blog on a different server going to hinder our SEO efforts?
Technical SEO | KaraParlin
-
Old domain to new domain
Hi, A website on server A is no longer required. The owner has redirected some URLs of this website (via plugin) to his new website on server B - but not all URLs. So when I use the site: command on website A, I see a mixture of redirected and non-redirected URLs. Therefore two websites are still being indexed in some form, causing duplication. However, weirdly, when I crawl with Screaming Frog I only see one URL, which is 301 redirected to the new website. I would have thought I'd see lots of URLs which hadn't been redirected. How come it is different from using the site: command? Anyway, how do I move to the new website completely without the old one being indexed anymore? I thought I knew this but have read so many blogs I've confused myself! Should I: (a) redirect all URLs via the .htaccess file on the old website on server A? There are lots of pages indexed, so a lot of URLs - what if I miss some? Or (b) point the old domain via DNS to server B and do the redirects in website B's .htaccess file? This seems more sensible, but does this method still retain the website rankings? Thanks for any help
Technical SEO | AL123al
-
Redirect typo domains
Hi, What's the "correct" way of redirecting typo domains? The DNS A record goes to the same IP address as the correct domain name, then 301 redirects for each typo domain are set in the .htaccess. Should subdomains on typo URLs still redirect to www, or should they redirect to the matching subdomain on the correct URL in case that subdomain exists?
Technical SEO | kuchenchef
-
Tool to Generate All the URLs on a Domain
Hi all, I've been using xml-sitemaps.com for a while to generate a list of all the URLs that exist on a domain. However, this tool only works for websites with under 500 URLs on a domain. The paid tool doesn't offer what we are looking for either. I'm hoping someone can help with a recommendation. We're looking for a tool that can: crawl and list all the indexed URLs on a domain, including .pdf and .doc files (ideally in a .xls or .txt file), and crawl multiple domains with unlimited URLs (we have 5 websites with 500+ URLs on them). Seems pretty simple, but we haven't been able to find something that isn't tailored toward management of a single domain or that can crawl a huge volume of content.
Technical SEO | timfrick
-
Client's domain expired - rankings lost - repurchased domain - what next?
It's only been 10 days and I have repurchased / renewed the domain name. The WHOIS info, website and contact information are all still the same. However, we have lost all rankings and I am hoping that our top rankings come back. Does anyone have experience with such a crappy situation?
Technical SEO | waqid
-
How Does Dynamic Content for a Specific URL Impact SEO?
Example URL: http://www.sja.ca/English/Community-Services/Pages/Therapy Dog Services/default.aspx The above page is generated dynamically depending on which province the visitor visits from. For example, a visitor from BC would see something quite different than a visitor from Nova Scotia; the intent is that the information shown should be relevant to the user of that province. How does this affect SEO? How (or from what location) does Googlebot decide to crawl the page? I have considered a subdirectory for each province, though that comes with its challenges as well. One such challenge is duplicate content when different provinces may have the same information for some pages. Any suggestions for this?
Technical SEO | ey_sja
-
Does Bitly hurt your SEO?
I often use bit.ly or the Google URL shortener in links when other websites post my articles so I can track clicks. However, I am thinking this may HURT my SEO, given that it is taking away a backlink to my website. Is that logic correct? If so, what is a good way to track clicks if a website posts your article, without jeopardizing the SEO value?
Technical SEO | StreetwiseReports
-
Do Domain Extensions such as .com or .net affect SEO value?
In the beginning of SEO days, it was going around that .com is the best for SEO and that .net is not as good. Is there any truth to this, and what about .org or .edu? I always hear that .edu sites have high PR. Is there any rhyme or reason to this, or are they all equal? Thank you, Afshin Christian-Way.com
Technical SEO | applesofgold