Our title tags are dynamically generated, and some have over 140 characters.
Does having a large quantity of URLs with an excessive number of characters hurt you in any way?
We have a long list of URL parameters in our Google Webmasters account. Currently, the majority are set to 'let Googlebot decide.'
How important is it to specify exactly what Googlebot should do? Would you leave these at 'let Googlebot decide,' or would you specify how Googlebot should treat each parameter?
Is it possible to block all but one URL with robots.txt?
For example, with domain.com/subfolder/example.html: if we block the /subfolder/ directory, we want every URL inside it blocked except the exact-match URL domain.com/subfolder.
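A minimal robots.txt sketch of this setup (directory name assumed from the example above):

```
User-agent: *
# Block every URL inside the directory
Disallow: /subfolder/
# /subfolder (no trailing slash) does not match the rule above,
# but an explicit Allow with the "$" end-of-URL anchor makes the
# exception unambiguous for crawlers that support it
Allow: /subfolder$
```

Note that Allow and the $ wildcard are honored by Google but are not part of the original robots.txt standard, so other crawlers may behave differently.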
In Google's webmaster guidelines, they mention viewing your site in a text browser to ensure all text is visible. All of our text is visible, but it's very messy and jumbled on the page. I've noticed most sites' text-browser layouts are clean.
How important to SEO is it that the site renders cleanly in a text browser? Does anyone know of any feedback from Google engineers on this point?
In Google's SEO Guide, they say to avoid the use of drop-down menus (page 12): http://static.googleusercontent.com/external_content/untrusted_dlcp/www.google.com/en/us/webmasters/docs/search-engine-optimization-starter-guide.pdf
But is this always true? What if you create the drop-down purely with HTML & CSS? Is it fine to use a bit of JavaScript to create the drop-down menu, or should it only be HTML & CSS?
Daniweb is the poster child for sites that have recovered from Panda. I know one strategy she mentioned was de-indexing all of her tagged content, for example: http://www.daniweb.com/tags/database
Why do you think more Panda-affected sites aren't specifying 'googlebot' rather than 'robots,' so they can keep capturing traffic from Bing & Yahoo?
Is there any advantage to using "noindex, nofollow" over robots.txt? I've read that a page with a noindex, nofollow tag still accumulates PageRank, but if we don't care about accumulating PageRank, is there any other advantage to using noindex, nofollow over robots.txt?
We have a large quantity of URLs that we would like to de-index from Google (we are affected by Panda), but not Bing. What is the best way to go about doing this?
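One commonly suggested approach is a crawler-specific meta robots tag, since each engine only obeys the tag addressed to its own user agent. A sketch (engine behavior is an assumption worth verifying):

```html
<!-- Hide this page from Google only: Googlebot obeys a tag named
     "googlebot", while Bing's crawler ignores it and can still index
     the page. -->
<meta name="googlebot" content="noindex">
```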
Does a 301 re-direct pass social signals such as 'likes,' 'tweets,' and '+1s'?
Agreed, that is a better solution, but I'm still wondering: if you block something with robots.txt, will that lead to a decrease in traffic? What if we have some duplicate content that is highly trafficked? If we block it with robots.txt, will the traffic numbers change?
I've tried using Xenu, but this is a bit time-consuming because it only tells you if the link isn't found and doesn't tell you which pages link to the 404'd page.
Webmaster Tools seems a bit dated & unreliable. Several of the links it lists as broken aren't.
Does anyone have any other suggestions for compiling a list of broken links on a large site?
I know you can use robots.txt to tell search engines not to spend their resources crawling certain pages.
So, if you have a section of your website that is good content but is never updated, and you want the search engines to index new content faster, would it work to block the good, unchanged content with robots.txt? Would this content lose any search traffic if it were blocked by robots.txt? Does anyone have any available case studies?
If we have content on our site that is found on another site, what is the best way to know which site Google views as the original source?
If you search for a line of the content such as "xyz abc etc" and the other site shows before yours in search results, does that mean that Google views that site as the original source?
Everyone knows subdomains worked for Hubpages to recover from Panda.
Does anyone know of other examples of sites that have recovered from Panda using subdomains?
By "complaining" a lot, do you mean posting in Google Webmaster forums or something else?
Does anyone know how you can try to make authorship profiles show in SERPs, other than just making sure the profiles were installed correctly?
If we have one page on our site that is only linked to by one other page, what is the best way to block crawler access to that page?
I know we could set the link to "nofollow" and that would prevent the crawler from passing any authority, and we can set the page to "noindex" to prevent it from appearing in search results, but what is the best way to prevent the crawler from accessing that one link?
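For reference, the two tag-based options mentioned above might look like this (the page path is hypothetical):

```html
<!-- On the linking page: don't pass authority through this link -->
<a href="/example-page.html" rel="nofollow">Example page</a>

<!-- In the <head> of /example-page.html: keep it out of search results -->
<meta name="robots" content="noindex">
```

Note that neither tag actually stops a crawler from fetching the page; only a robots.txt rule (e.g. Disallow: /example-page.html) prevents crawling itself.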
Search results on our own domain
If we have pages on our site that link to search results is that a bad thing? Should we set the links to "nofollow"?
In "links to your site" how does Google Webmasters determine the order of the URLs? By influence? Quality?
We have multiple freelance writers who work for us. Should we require all of them to create a Google+ profile with authorship?
What is the best site / tool to use to view referral traffic on a third party site?
Here, Google mentions that article URLs should contain a 3-digit number:
http://www.google.com/support/news_pub/bin/answer.py?hl=en&answer=68323
Why is this?
Also, do you think it is good to have both a news sitemap and 3 digits at the end of a URL? Or, will one do?
I read this article on SEJ:
http://www.searchenginejournal.com/scrapers-and-the-panda-update/34192/
And I'm a bit confused as to how a scraper site can be successful post-Panda. Didn't Panda specifically target sites that have duplicate content, and shouldn't scraper sites actually be suffering?
Here are a couple that show fairly decent search volume on Wordtracker & 0 on the AdWords Keyword Tool:
multiple sclerosis links with bipolar disorder
ank3 and bipolar disorder
depression and bipolar link
Thanks!
When I find keyword opportunities in Wordtracker, I'll sometimes run them through the AdWords Keyword Tool only to find that Google says these keywords have 0 search volume. Would you use these keywords even though Google says users aren't searching for them?
If you want to add both "noindex, follow" and "noodp," should you add two meta robots tags, or is there a way to combine both into one?
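For what it's worth, meta robots directives can be comma-separated in a single tag, so a combined version would look like:

```html
<meta name="robots" content="noindex, follow, noodp">
```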
Can you send the link? I see http://www.seomoz.org/article/search-ranking-factors#predictions but don't see a breakdown of Facebook vs. Twitter.
If we host the video with a third party & use their player (such as Brightcove), will we get credit for the video (i.e., will it show in SERPs on our domain)? Or do we need to host it on our own site?
If we host a video on a third party site and use an iframe to display it on our site, when the video is indexed in SERPs will it show on our site or on the third party site?
If another site hosts the video and you show it on your site in an iframe, would the video show up in SERPs on your URL? I believe it would show up on their URL.
Do you need to host the video on your own server in order for it to show up on SERPs?
If you could choose between a tweet from an influencer or a post from a Facebook fan "influencer," which would you choose?
Has anyone ever done any testing on setting "priority" and "frequency" in their sitemaps? What was the result? Does specifying priority or frequency help quite a bit?
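For context, both values are optional hints in the sitemaps.org protocol; a single entry with them set might look like this (URL and values are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/page.html</loc>
    <!-- changefreq is a hint, not a command to the crawler -->
    <changefreq>weekly</changefreq>
    <!-- priority is relative to your own pages only; the default is 0.5 -->
    <priority>0.8</priority>
  </url>
</urlset>
```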
We are looking for a way to quickly find blogs for a particular keyword that have a high number of subscribers / followers. Is there a tool that can do this?
A blog equivalent of followerwonk / wefollow would be ideal.
Are there any tools that will let you see your competitor's bounce rates?
So, basically, we are thinking of putting all content related to one keyword in a subdirectory. Currently, all content is in one main subdirectory; for example, moving from url.com/un-related-subdirectory/tons-of-content to url.com/keyword/content-related-to-this-keyword.
I recently started working on a site that is 8 years old, and the current URLs / site structure are not SEO-friendly. We are concerned that in re-structuring the site, we may lose our rankings.
Has anyone ever completely re-structured their site? Was it worth it?
We are thinking of restructuring the URLs on our site and I was wondering if there is a penalty associated with setting up so many 301 re-directs.
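For reference, on an Apache server each 301 is a one-line rule; a sketch with hypothetical paths:

```
# .htaccess: permanently redirect an old URL to its new location
Redirect 301 /old-subdirectory/page.html /keyword/page.html
```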