We're seeing soft 404 errors appear in our Google Webmaster Tools account for pages that are blocked by robots.txt (our search result pages).
Should we be concerned? Is there anything we can do about this?
Choose one site. 301 redirect all content from the site you want to remove to the new site.
Make sure to redirect each old page to its corresponding new page (don't redirect all URLs to the home page).
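If it helps, here is a bare-bones sketch of what a per-page 301 map can look like, using Flask purely as an illustration; the old paths and the target domain below are made up, not taken from either site.

```python
# Hypothetical sketch: page-to-page 301 redirects from the retired site to the one you keep.
from flask import Flask, redirect

app = Flask(__name__)

# Map each old URL path to its corresponding page on the surviving site,
# instead of sending everything to the home page.
REDIRECT_MAP = {
    "/old-about": "https://www.example.com/about",
    "/old-services": "https://www.example.com/services",
    "/blog/old-post": "https://www.example.com/blog/new-post",
}

@app.route("/<path:old_path>")
def permanent_redirect(old_path):
    target = REDIRECT_MAP.get("/" + old_path)
    if target:
        return redirect(target, code=301)  # 301 = moved permanently
    # Unknown URLs return a 404 rather than being funneled to the home page.
    return "Not found", 404
```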
Has anyone ever done any testing on setting "priority" and "changefreq" in their sitemaps? What was the result? Does specifying priority or change frequency actually help much?
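For reference, those two fields are just per-URL elements in the sitemap. A minimal sketch (made-up URLs and values) that writes them with Python's standard library:

```python
# Minimal sketch of a sitemap that sets <changefreq> and <priority> per URL.
import xml.etree.ElementTree as ET

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")

pages = [
    ("https://www.example.com/", "daily", "1.0"),
    ("https://www.example.com/category/widgets", "weekly", "0.8"),
    ("https://www.example.com/archive/2010", "yearly", "0.3"),
]

for loc, changefreq, priority in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "changefreq").text = changefreq
    ET.SubElement(url, "priority").text = priority

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```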
Which of these services is the best? Does anyone have experience with all three?
SEOmoz is a bit behind GWT, especially for smaller sites. GWT is a more accurate measure of your backlinks.
For duplicate pages created by the "print" function,
SEOmoz says it's better to use noindex (http://www.seomoz.org/blog/complete-guide-to-rel-canonical-how-to-and-why-not)
and JohnMu says it's better to use a canonical: http://www.google.com/support/forum/p/Webmasters/thread?tid=6c18b666a552585d&hl=en
What do you think?
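Whichever you pick, both are easy to emit. Below is a rough, hypothetical sketch (Flask-style, made-up route) of the noindex option via an X-Robots-Tag header, with the canonical alternative noted in comments:

```python
# Hypothetical sketch: serving a "print" version of an article with a noindex signal.
from flask import Flask, make_response

app = Flask(__name__)

@app.route("/article/<slug>/print")
def print_version(slug):
    html = f"<html><body><h1>Print view of {slug}</h1></body></html>"
    response = make_response(html)
    # Option 1 (noindex): keep the print page out of the index entirely.
    response.headers["X-Robots-Tag"] = "noindex, follow"
    # Option 2 (canonical) would instead put this in the print page's <head>:
    #   <link rel="canonical" href="https://www.example.com/article/{slug}">
    # so the duplicate consolidates its signals to the main article instead of being dropped.
    return response
```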
What is the best research tool for finding search data specifically for Google Image search?
I've done quite a bit of searching, but can't seem to find a time-efficient way to accurately analyze keyword difficulty for large sets of keywords.
All of the keyword difficulty tools I've tried are either 1) inaccurate or 2) slow (SEOmoz's keyword difficulty tool, for example, only allows 5 entries at a time).
Can anyone recommend any shortcuts / tools / processes for analyzing keyword difficulty across large sets of keywords?
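I don't have a specific tool to point to, but if you end up scripting it against whatever difficulty source you do have access to, batching the list in parallel keeps big runs manageable. A rough sketch with a stand-in scorer (no real API is called here):

```python
# Rough sketch: run a large keyword list through a difficulty scorer in parallel.
from concurrent.futures import ThreadPoolExecutor

def score_keywords(keywords, scorer, max_workers=5):
    """Apply `scorer(keyword) -> difficulty` across the whole list concurrently."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return dict(zip(keywords, pool.map(scorer, keywords)))

if __name__ == "__main__":
    keywords = ["history maps", "antique maps", "old world maps"]  # placeholder list
    dummy_scorer = lambda kw: 0  # stand-in; swap in your real difficulty tool or API
    print(score_keywords(keywords, dummy_scorer))
```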
If we decide to put a variety of our images under a creative commons license, can we require that individuals who choose to utilize these images link back to us?
If you plan on doubling the size of your site with original, unique content, is it better to publish it all at once or over a period of time? Is there any penalty for publishing it all at once?
I've tried using Xenu, but this is a bit time-consuming because it only tells you that a link isn't found and doesn't tell you which pages link to the 404'd page.
Webmaster Tools seems a bit dated & unreliable. Several of the links it lists as broken aren't.
Does anyone have any other suggestions for compiling a list of broken links on a large site?
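For what it's worth, a small script is another option. The sketch below (assuming the requests and beautifulsoup4 packages, with a made-up start URL) crawls one site and records which page each broken link was found on, which is the piece Xenu's report was missing:

```python
# Rough sketch of a same-site crawler that records, for every broken link,
# which pages link to it. The start URL is a placeholder.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse
from collections import deque

START = "https://www.example.com/"
DOMAIN = urlparse(START).netloc

seen, queue = {START}, deque([START])
broken = {}        # broken URL -> set of pages that link to it
status_cache = {}  # avoid re-checking the same link on a large site

def status_of(url):
    if url not in status_cache:
        try:
            status_cache[url] = requests.head(url, allow_redirects=True, timeout=10).status_code
        except requests.RequestException:
            status_cache[url] = 599  # treat network errors as broken
    return status_cache[url]

while queue:
    page = queue.popleft()
    try:
        resp = requests.get(page, timeout=10)
    except requests.RequestException:
        continue
    if "text/html" not in resp.headers.get("Content-Type", ""):
        continue
    for a in BeautifulSoup(resp.text, "html.parser").find_all("a", href=True):
        link = urljoin(page, a["href"]).split("#")[0]
        if urlparse(link).netloc != DOMAIN:
            continue  # only audit internal links
        if status_of(link) >= 400:
            broken.setdefault(link, set()).add(page)  # remember the referring page
        elif link not in seen:
            seen.add(link)
            queue.append(link)

for url, referrers in sorted(broken.items()):
    print(url, "is broken; linked from:", ", ".join(sorted(referrers)))
```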
Google says these types of references are generated algorithmically and that users should include a table of contents & descriptive anchor link text. Is there anything else we should take into consideration?
Also, does anyone know how this works with pagination? Due to the design of our site, we can't make one really long article, but would need to divide it up into several 'pages'--even though it would all live on one URL (we'd use the # for pagination).
Thank you in advance for your feedback.
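For what it's worth, those in-SERP jump links can't be forced, but the structure they rely on is just headings with ids plus descriptive anchor links. A rough illustrative sketch (made-up HTML, using BeautifulSoup) of generating that kind of table of contents:

```python
# Rough sketch: build a table of contents of anchor links from h2 headings,
# so each section has an id that a "jump to" link (or a #-style page) can target.
from bs4 import BeautifulSoup

html = """
<article>
  <h2>Chapter One: Early Maps</h2><p>...</p>
  <h2>Chapter Two: The Age of Exploration</h2><p>...</p>
</article>
"""

soup = BeautifulSoup(html, "html.parser")
toc = soup.new_tag("nav")

for heading in soup.find_all("h2"):
    anchor_id = heading.get_text().lower().replace(" ", "-").replace(":", "")
    heading["id"] = anchor_id                        # target for #anchor_id
    link = soup.new_tag("a", href="#" + anchor_id)
    link.string = heading.get_text()                 # descriptive anchor text
    toc.append(link)

soup.article.insert(0, toc)
print(soup.prettify())
```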
I know that branded searches are a large component of whether sites were hit by Panda or not, and I wonder if moving forward, I should always include the name of my site (domain) in the name of the product.
For example, if I have a product with a unique name such as 'history maps' should I change the name to include my brand name, i.e '[domain] history maps'? Or, if users search for the unique product name, is that sufficient?
What's the best way to get video thumbs to appear in SERPs?
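I can't promise Google will show them, but one common route is a video sitemap that points at a thumbnail image. A rough sketch (made-up URLs, using the Google video sitemap namespace) of generating a single entry:

```python
# Rough sketch: one entry of a video sitemap, including the thumbnail Google can
# show in results. URLs, title, and description are placeholders.
import xml.etree.ElementTree as ET

ET.register_namespace("video", "http://www.google.com/schemas/sitemap-video/1.1")
NS = "{http://www.google.com/schemas/sitemap-video/1.1}"

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
url = ET.SubElement(urlset, "url")
ET.SubElement(url, "loc").text = "https://www.example.com/videos/history-maps"

video = ET.SubElement(url, NS + "video")
ET.SubElement(video, NS + "thumbnail_loc").text = "https://www.example.com/thumbs/history-maps.jpg"
ET.SubElement(video, NS + "title").text = "History Maps Overview"
ET.SubElement(video, NS + "description").text = "A short walkthrough of our historical map collection."
ET.SubElement(video, NS + "content_loc").text = "https://www.example.com/videos/history-maps.mp4"

ET.ElementTree(urlset).write("video-sitemap.xml", encoding="utf-8", xml_declaration=True)
```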
One of our clients wants to use about 200 of our articles on their site, and they're hoping to get some SEO benefit from using this content.
I know standard best practice is to canonicalize their pages to our pages, but then they wouldn't get any benefit--since a canonical tag will effectively de-index the content from their site.
Our thoughts so far:
add a paragraph of original content to our content
link to our site as the original source (to help mitigate the risk of our site getting hit by any penalties)
What are your thoughts on this? Do you think adding a paragraph of original content will matter much? Do you think our site will be free of penalty since we were the first place to publish the content and there will be a link back to our site?
They are really pushing for not using a canonical--so this isn't an option. What would you do?
In fact, in this post http://googlewebmastercentral.blogspot.com/2009/12/handling-legitimate-cross-domain.html they mention using a canonical when syndicating content if the content is similar enough--I'm not sure why they don't mention a canonical in the webmaster guidelines link I included above.
Can anyone provide some guidance on both how to submit your site to Yahoo News as well as some tips for how to get accepted into Yahoo News?
We want to move our author byline to the bottom of the page, but will this affect how authorship displays for the page?