Posts made by kcb8178
-
Does the URL for your homepage impact SEO?
Is there any harm to SEO in having a homepage URL that is not a clean one like www.domain.com? For example, Citi uses https://online.citi.com/US/login.do as its homepage. Does that matter in any way? Would a company like Citi benefit from changing to www.citi.com as their homepage?
-
Sub Domain Usage
I see that Gap uses gap.com, oldnavy.gap.com, and bananarepublic.gap.com. Wouldn't a better approach for SEO be to have oldnavy.com, bananarepublic.com, and gap.com as completely separate domains? Is there any benefit to the store1.parentcompany.com, store2.parentcompany.com approach? What are the pros and cons of each?
-
RE: Is there a limit to how many URLs you can put in a robots.txt file?
Great, thanks for the input. Per Kristen's post, I am worried that it could just block the URLs altogether so they never get purged from the index.
-
RE: Is there a limit to how many URLs you can put in a robots.txt file?
Yes, we have done that and are seeing traction on those URLs, but we can't get rid of these old URLs as fast as we would like.
Thanks for your input.
-
RE: Is there a limit to how many URLs you can put in a robots.txt file?
Thanks Kristen, that's what I was afraid I would do. Other than Fetch, is there a way to send Google these URLs en masse? There are over 100 million URLs, so Fetch is not scalable. They are picking them up slowly, but at the current pace it will take a few months, and I would like to find a way to make the purge go faster.
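To make the question concrete, by "send Google these URLs en masse" I mean something along the lines of writing the affected URLs out into standard XML sitemap files in batches and submitting those. A rough Python sketch of that idea is below; the input file name and output naming are placeholders, not our real setup.

from xml.sax.saxutils import escape

BATCH_SIZE = 50_000  # the sitemap protocol's per-file URL limit

def write_sitemaps(url_file, prefix="purge-sitemap"):
    # Stream one URL per line from a plain text file and write the URLs
    # out into numbered sitemap files, 50,000 URLs per file.
    out = None
    count = 0
    file_index = 0
    with open(url_file) as f:
        for line in f:
            url = line.strip()
            if not url:
                continue
            if count % BATCH_SIZE == 0:
                if out:
                    out.write("</urlset>\n")
                    out.close()
                file_index += 1
                out = open(f"{prefix}-{file_index}.xml", "w", encoding="utf-8")
                out.write('<?xml version="1.0" encoding="UTF-8"?>\n')
                out.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            # escape() handles the & characters in faceted URLs for XML.
            out.write(f"  <url><loc>{escape(url)}</loc></url>\n")
            count += 1
    if out:
        out.write("</urlset>\n")
        out.close()

write_sitemaps("urls_to_recrawl.txt")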
-
Is there a limit to how many URLs you can put in a robots.txt file?
We have a site that has way too many URLs, caused by our crawlable faceted navigation, and we are trying to purge 90% of our URLs from the indexes. We put noindex tags on the URL combinations that we do not want indexed anymore, but it is taking Google way too long to find the noindex tags. Meanwhile we are getting hit with excessive-URL warnings and have been hit by Panda.
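For reference, by "noindex tags" I mean the standard robots meta tag on each unwanted faceted page (or the equivalent X-Robots-Tag response header):

<meta name="robots" content="noindex">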
Would it help speed up the purge if we added the URLs to the robots.txt file? Could this cause any issues for us? Could it have the opposite effect and block the crawler from finding the URLs without ever purging them from the index? The list could be in excess of 100MM URLs.
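For illustration, the robots.txt entries would look something like the lines below; the paths and parameter names are placeholders, not our actual facets.

User-agent: *
# option 1: list the unwanted faceted URLs individually
Disallow: /shoes?color=red&size=10
# option 2: block the facet parameters with wildcard patterns
Disallow: /*?color=
Disallow: /*&size=

Option 2 would at least keep the file itself to a manageable size.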