We have a very large Knowledge center that is indexed. Is there any reason I should not exclude this subdomain from indexing?
Thank you
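For reference, if you do decide to exclude the subdomain, note that robots.txt alone only blocks crawling; it does not remove pages that are already indexed. A minimal sketch, assuming a hypothetical knowledge center at `kb.example.com`:

```
# robots.txt served at https://kb.example.com/robots.txt
# Blocks all crawlers from the entire subdomain
# (does not de-index pages that are already in the index)
User-agent: *
Disallow: /
```

To actually drop already-indexed pages, the usual approach is instead a `<meta name="robots" content="noindex">` tag (or `X-Robots-Tag: noindex` header) on each page, with crawling left open so the directive can be seen.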
A similar question: we have informational, content-rich brochures in PDF form on HubSpot. HubSpot does not allow PDFs to be added to the sitemap. Is there another way to get these indexed and searchable?
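In case it helps: Google will index PDFs listed in any standard XML sitemap, and that sitemap does not have to be generated by HubSpot; you can host one yourself and submit it in Search Console. A minimal sketch, with a hypothetical PDF URL standing in for your actual HubSpot file URLs:

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- hypothetical URL; replace with the real HubSpot-hosted PDF locations -->
  <url>
    <loc>https://www.example.com/hubfs/brochure.pdf</loc>
  </url>
</urlset>
```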
I am in the process of optimizing our website and I am having a hard time reconciling two best practices I have found on Moz.
1. You should avoid having multiple pages focus on the same keyword because you will lose some control of which result will show.
2. You should identify your core keywords and weave these keywords multiple times (naturally) throughout your site.
I have spent months identifying our top 7 keywords and am working through the site now. The first piece of advice keeps giving me pause. Can anyone weigh in with other considerations or advice on how I can reconcile these two strategies?
Thank you
Hi Everett,
Thank you again for the response. Do you have information on how to set up robots.txt rules in Confluence? I have been trying to find out how to block Moz from crawling these pages, but would definitely consider blocking more.
Thank you,
I will check on that now. I believe I set it up with the full URL, but I would be very grateful if that would fix part of my issue.
I am not surprised to hear you say subdomains are not so cut and dried... it seems like nothing in SEO is. If you have time, I would be very interested in hearing any additional insight you have about this. We have large, customer-facing knowledge center subdomains, but we also have most of our content on a HubSpot subdomain. Essentially, I don't want Google to pay attention to any of our knowledge centers, but I would love it if our HubSpot pages could help our root domain authority. Thank you!
Hello,
I am working to manage large Confluence/Atlassian subdomains as well, and I am curious whether you have any best practices to share. I am currently managing them on the assumption that these subdomains will not affect our root domain with metadata issues, but I do pay attention to critical crawler issues. It is my understanding that subdomains do not really help or hurt your root domain unless these errors are present; have you found that to be true?
I am also trying to get Rogerbot blocked from these subdomains because they are burying me in crawl errors. I can just ignore them, but that is time-consuming and masks the errors I really do need to focus on. Do you have any insight in this area?
Thank you!
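For what it's worth, Moz states that Rogerbot obeys robots.txt, so a user-agent-specific rule served from each subdomain's own robots.txt should stop those crawl errors without affecting other crawlers. A minimal sketch:

```
# robots.txt on the Confluence subdomain
# Blocks only Moz's crawler; all other user agents are unaffected
User-agent: rogerbot
Disallow: /
```

Note that each subdomain needs its own robots.txt; a rule on the root domain's file does not carry over.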
Can I ask why you use SEMrush vs. Moz to create your keyword baskets?
Thanks!
Our main website's white paper page has an image and a brief description of each white paper. When you click a white paper, you are redirected to a form to access the gated content. Once you complete that form, you are redirected to the white paper PDF, which is housed on a subdomain/HubSpot.
Because of this, I do not believe our website is getting "credit" for the keywords/content on these pages. Any suggestions on how we can allow search engines to crawl this content while still keeping it gated? As I understand it, a subdomain cannot help or hurt the main domain (aside from critical crawler issues).
Thank you
I am trying to determine whether my website was impacted by the March 2019 Core Update. Based on the various articles I have been reading, I do not believe my niche (software) was impacted. I see a very small uptick in Search Console and Google Analytics, but it is well within the normal range. Where else should I be looking to see if we were impacted?
Thank you!