Hi Tom, thanks for the response but that doesn't work.
There is no link to a Google+ profile on this page; the author, though, is verified via the domain name, and the page includes "by", which seems to be what is causing this.
Any other thoughts?
Do you let the old URLs 404 or do you redirect them?
So after a while there would be thousands of 301 redirects - would you eventually delete these, say after a year or two?
Suppose you have a website with a blog on it, and you show a few recent blog posts on the homepage. Google sees the headline plus "by Author Name" and associates that author's Google+ profile with the page.
This is great for the actual blog posts, but how do you prevent it from happening on the homepage or other blog-roll pages?
For a real estate website, when a house is sold or taken off the market, what should happen to the listing? 301 redirect it to the grouping (such as zip code or city) in which that listing resides? 404 it?
You are fine; don't worry about that. In the instance you provide, the link back to the website is a nofollow anyhow. This is not a problem at all - things like Sitejabber.com, digsitevalue.net, websitelooker.com, dawhois.com, and all the rest may publish non-editorial links to pages, and it's really not an issue. You can't control who links to you, and not all links are editorial in nature.
If someone paid for links, then they tried to control who links to the site, and you are now in the process of removing that manipulation/control.
I'm looking for a way to track change in search volume for dozens or hundreds of keywords but Google Trends doesn't have an API as far as I can tell.
Does anyone know of a good tool or method of extracting large amounts of data from Google Trends?
This is a good response but doesn't really answer the question.
How would KD/Volume compare with typical KEI or KOI?
Hello all,
What do you think about using Keyword Difficulty divided by Search Volume as an alternative to keyword efficiency indexes?
ETA: Obviously this wouldn't be a hard and fast metric, but a general indicator to be taken into account along with other data.
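To make the idea concrete, here is a toy sketch (all keywords and numbers invented, not from any real data) of ranking terms by that KD/Volume ratio, where a lower score means an easier keyword relative to its demand:

```javascript
// Toy illustration (invented numbers): score keywords by
// Keyword Difficulty divided by monthly search volume -- lower is better.
var keywords = [
  { term: 'pizza delivery', difficulty: 70, volume: 50000 },
  { term: 'wood fired pizza oven kit', difficulty: 30, volume: 2000 },
  { term: 'pizza', difficulty: 95, volume: 500000 }
];

keywords.forEach(function (kw) {
  kw.score = kw.difficulty / kw.volume; // the proposed KD/Volume indicator
});

// Sort ascending: best apparent opportunities first.
keywords.sort(function (a, b) { return a.score - b.score; });
```

One quirk worth noting: because volume sits in the denominator, very high-volume head terms can score "well" even with a high difficulty, which is another reason to treat it as a rough indicator alongside other data, as you say.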
Matt Cutts recently addressed this: http://www.stonetemple.com/matt-cutts-and-eric-talk-about-what-makes-a-quality-site/
Eric Enge: Let’s switch gears a bit. Let’s talk about a pizza business with stores in 60 cities. When they build their site, they create pages for each city.
Matt Cutts: Where people get into trouble here is that they fill these pages with the exact same content on each page. “Our handcrafted pizza is lovingly made with the same methods we have been using for more than 50 years …”, and they’ll repeat the same information for 6 or 7 paragraphs, and it’s not necessary. That information would be great on a top-level page somewhere on the site, but repeating it on all those pages does not look good. If users see this on multiple pages on the site they aren’t likely to like it either.
Eric Enge: I think what site owners may argue is that if someone comes in from a search engine and lands on the Chicago page, and that is the only page they see on the site, they want to make their best pitch on that page. That user is also unlikely to go visit the site’s Austin pizza page.
Matt Cutts: It is still not a good idea to repeat a ton of content over and over again.
Eric Enge: What should they put on those pages then?
Matt Cutts: In addition to address and contact information, 2 or 3 sentences about what is unique to that location and they should be fine.
Eric Enge: That won’t be seen as thin content?
Matt Cutts: No, something like that should be fine. In a related situation, I had a writer approach me recently and ask me a question. He has this series of articles he provides to gyms that own websites. He wanted to know if there was a limit to how many times he could provide the same content to different gyms, yet still have it be useful from a search perspective for his customers. Would it be helpful, for example, if he kept on rewriting it in various ways?
It gets back to your frog site example. The value add disappears. Imagine 4 gyms in the same small city all offering exactly the same advice. Even before you get to what search engines think, users aren’t going to understand what the difference is between these 4 places. As a user, after reading your content, why would I pick one over the other? For search engines, it’s the same challenge.
Find a way to differentiate and stand out, so that people want to try your product or service and see what they think. When they try it, give them something outstanding and earn yourself a customer.
Is there ever a circumstance to request a reconsideration through Google Webmaster Tools without having first received some type of warning message from Google?
For instance, suppose it's discovered that the site had inadvertently been violating Google's policy for a long time; the problem is fixed now, and you just want to let them know. Or is it a case of "don't wake the beast": never communicate with Google unless they initiate the conversation?
What you want is for "Page 2", "Page 3", etc. to automatically show up in the page title, giving each page a distinct title.
Depending on what CMS and plugins you are using, this may be easy to add. Otherwise, a PHP script could detect the page number and append it to the title automatically.
As an example from your site....are you concerned with, for instance, this duplicate content:
http://dailyfantasybaseball.org/category/daily-fantasy-baseball-picks/
http://dailyfantasybaseball.org/2012/06/27/june-27-2012/
I suggest you set WP to only show an excerpt of the content on pages that aren't the actual content page.
I've never purchased fans, likes, followers, or anything else. Buying Facebook fans will cause a drop in EdgeRank as nobody is interacting with the posts. I don't see any negative repercussions for purchasing Twitter followers other than being called out like this publicly.
Check out this article.....
USA Today “Handicapper” Danny Sheridan Bought Thousands of Twitter Followers: http://www.wagerminds.com/blog/danny-sheridan/usa-today-handicapper-danny-sheridan-bought-thousands-of-twitter-followers-6096/
Seems pretty clear Sheridan, who is not a particularly compelling celebrity, is purchasing Twitter followers. I remember when he first launched his Twitter account, he offered (on live radio spots) to donate $1 to charity for every follower he received for a few months.
From the article: "Buying followers is, if you think about it, the height of narcissism. You’re buying followers to dupe casual Twitter users into thinking you’re more popular and influential than you really are. In turn, you’re hoping the phony popularity will help translate into actual popularity as legitimate users flock to a seemingly ‘popular’ Twitter user."
Do you believe this is becoming a problem on Twitter, and do you think it actually can help him or anyone else?
First, Google Analytics reporting does not, to my knowledge, influence SERP rankings. Altering the data collected through Google Analytics should not affect SEO indicators.
Second, this is from here: http://briancray.com/posts/time-on-site-bounce-rate-get-the-real-numbers-in-google-analytics/
Once this code is installed, your site will update Google Analytics every 10 seconds under the Event Category "Time", the Event Action "Log", and the Event Value will be based on the pattern of 0:10, 0:20, 0:30, 0:40, 0:50, 1:00, 1:10, etc.
The script does not change your bounce rate, it just gives you additional information.
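For reference, the approach that post describes can be sketched roughly like this. This is a hypothetical reconstruction using the classic ga.js `_gaq` queue (the helper name `formatInterval` is mine, not from the article); a modern analytics.js or gtag setup would use a different call:

```javascript
// Rough sketch of the timing beacon the linked post describes,
// assuming the classic ga.js _gaq queue API.

// Format elapsed seconds as the "0:10", "1:00", "1:10" labels described above.
function formatInterval(totalSeconds) {
  var minutes = Math.floor(totalSeconds / 60);
  var seconds = totalSeconds % 60;
  return minutes + ':' + (seconds < 10 ? '0' : '') + seconds;
}

// In a browser, fire an event every 10 seconds:
// category "Time", action "Log", value based on the elapsed-time pattern.
if (typeof window !== 'undefined') {
  var elapsed = 0;
  window._gaq = window._gaq || [];
  setInterval(function () {
    elapsed += 10;
    window._gaq.push(['_trackEvent', 'Time', 'Log', formatInterval(elapsed)]);
  }, 10000);
}
```

Since `_gaq` is just a plain array until ga.js loads and replaces it, the pushes queue up safely even if this snippet runs before the tracker does.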
I think Google wants authors only, based on the first of the two ways of implementing author information in search.
This suggests that Google wants authors, not brands. Google doesn't appear to scan for a company name in a by-line, but I guess it might be possible.
Having never used this tool, I will go out on a limb and say that any software or service that includes the phrase "Top 10 Rank Guarantee" in its title tag is blackhat.
Also, any time you see that or a similar phrase like "be ready to pile money", just don't. Obvious scam is obvious.
Based on this video: http://www.youtube.com/watch?v=r1lVPrYoBkA&feature=player_embedded
I believe there is nothing wrong with redirecting entire directories. Matt seems to suggest one could even redirect multiple pages to a homepage, and while this isn't ideal, it's a legitimate method recognized by Google.
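For example, with Apache's mod_alias a whole retired directory can be sent to the new homepage in one rule (the directory name and domain below are placeholders, not from your site):

```apache
# Hypothetical example: 301 everything under a retired directory to the new homepage.
RedirectMatch 301 ^/old-directory/ http://www.example.com/
```

Where you do know the matching new page for a particular old URL, a specific `Redirect 301` line for that URL, placed above the catch-all, is still the better option for the reasons Matt gives.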
So I was browsing Monster.com and realized that they have a pop-up floating AdSense ad when you scroll halfway down the page....check it out: http://jobsearch.monster.com/search/Full-Time_8?jt=142&q=Search-Engine-Optimization-__28SEO__29&sort=dt.rv.di
How are they allowed to do this?
As I understand, this is not allowed. https://support.google.com/adsense/bin/answer.py?hl=en&answer=48182
In order to ensure a good experience for users and advertisers, publishers participating in the AdSense program may not:
Sounds like a pretty clear case for a 301 redirect.
I'm working on a site with tons of great, useful content....the owners of the site implemented a new layout and design (a complete overhaul), but they were lacking basics such as meta descriptions and 301 redirects, and, shockingly, they had the same title tag for every single page on a site with thousands of unique how-to articles.
Unsurprisingly, their traffic dropped to roughly a quarter of what it was (a "300% drop," as they put it).
They generate most of their traffic from people learning how to build stairs, how to install crown molding, and other related matters.
Beginning last Thursday I've been performing basic on-site SEO - things like giving each page a unique title. From the Thursday I began through yesterday (Wednesday), Google traffic was down 29.73% (17,715 visits vs. 25,210).
I believe this is a normal part of the "Google Shuffle" -- does anyone have a Matt Cutts link or similar proof that this is a normal part of the process?
That's how I see it....the old homepage was at ..../af/index.asp - this will definitely get 301ed; it's just a question of the other 800+ random URLs.
There is no penalty for redirecting a page that could otherwise have a 404? Or, once GoogleBot views a 404 on a particular page, is it too late at that point to 301 it? With the "index.asp" page, it still makes sense to redirect because there are links around the web pointing to that location, and those visitors need the redirect.
I'm helping on a site that has tons of content and recently moved from a 10-year-old .ASP structure to WordPress. There are ~800 404s, with 99% of them in the same directory, which is no longer used at all. The old URL structure offers no indication of what the old pages' contents were.
So, there is basically no way to manually redirect page by page to the new site at this point.....is there any reason not to redirect that entire old directory to the new homepage?
Matt Cutts seems to think it's OK to point an entire old directory to a new homepage, but it's not as good as 1:1 redirects: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=93633
Any thoughts?