Hi George,
I use WordPress for one and will look for a plugin. Thanks!
We use an XML CMS called Pubman for the other, with MarkLogic for improved search and delivery. Do you think just writing a robots.txt file would work best?
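For reference, here is a rough sketch of the kind of robots.txt rules I have in mind; the paths are placeholders for whichever auto-generated sections are creating the duplicates, not our real URLs:

# Block crawlers from the auto-generated sections (placeholder paths)
User-agent: *
Disallow: /search/
Disallow: /print/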
Regards,
David
Hi,
A crawl report from SEOmoz.org shows extremely high numbers of duplicate titles and descriptions (e.g., 33,000) on two of my sites. Both sites sit on top of CMSs, so the duplicate titles and descriptions come from auto-generated pages. What is the best way to address these problems?
Thanks!
David
Thanks for the response.
So, do you believe it is better to go deep with a few top domains (e.g., many articles each) rather than going broad across many high-authority domains? For instance, by publishing more press releases through PRWeb, I can get links back from many media providers.
I believe the sites I referred to are high-authority, high-PR domains (e.g., major publications) where I can write and submit articles. I have the opportunity to go deep with these domains (e.g., write several articles).
Thanks!
David
I understand it is important to get links from many quality domains. I currently have links from top domains (high PR and trust), and I can get more from high-ranking pages on those same domains. Would it be better to focus on expanding my reach (finding additional domains to get links from) or to continue building links from the domains I already have a connection with? Which is weighted more heavily? I realize doing both is important, but I'm trying to figure out how to best use my time.
Thanks!
David