I think that "nofollow" should be used when links are "paid for" or the destination of the link is "not trusted" or "potentially not trusted".
I would link to my social accounts with followed links.
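To make the distinction concrete, here is what the two link styles look like in HTML (the URLs are just placeholders):

```html
<!-- paid or untrusted destination: tell search engines not to pass credit -->
<a href="https://example.com/sponsor" rel="nofollow">Our sponsor</a>

<!-- trusted destination such as your own social profile: a normal, followed link -->
<a href="https://twitter.com/yourhandle">Follow us on Twitter</a>
```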
Keyword domains used to be a lot more powerful than they are right now. Based on watching a lot of their rankings, I believe that Google turned down their value in early 2011, shortly after Matt Cutts said that it would likely happen. Maybe they will do it again, I don't know.
I agree that some of these sites rank well with very little content and poor user experience. That generally occurs where competition is rather low in Google or where it is low to moderate in Bing. Where competition is high, a domain only makes a fractional contribution to the rankings. So when you see them ranking well for highly competitive terms, they are doing the same type of SEO as any other site, despite their poor onsite assets.
I own several keyword domains. Some of them have top rankings for their exact match query and some of them don't. There is no special formula for beating them. Just compete against them as you would any other domain.
If you ask me, the bigger problem is Google giving easy top rankings to weak content on eHow, About, Wikipedia and other powerful sites.
How hard is it to get these very annoyingly favoured domains off 1st?
This really seems to bother you. But if you turn that around you would consider them to be a huge asset. So, maybe you should just go out and buy one. Find the guy who owns one and ask "what will it take for you to sell it to me?"... or hire a pro to do that for you. I've done it a few times and am happy with most of the results. They seem to produce a higher conversion rate too.
I link out a lot. Every page that I link to is better than my own site in some way. I don't worry about linking out and have zero hesitation about linking to six, eight, ten great sources of info on a single page.
I put everything that I have into the page.
Six nice photos, two videos, a couple of charts and 3000 words - plus ads.
Be sure that you optimize the photos and have a good server with caching.
Nobody knows for sure but I am betting that diminishing returns begin at about the second link.
However, if you are getting links from the Pope's site then you can probably get an awful lot of them before the value depletes to a really low level.
That is a good idea, Alan.
So far I have not been using schema... but Google does grab some of my tabled content for display in the SERPs.
I have not used schema because I honestly don't want to figure it out, and I procrastinate that job by writing content instead.
I wake up in the morning and look at my job list and say... "I should do schema today." .... then.... say... "I don't want to do that, I'll work on an article instead".
I really should do it... thanks for the push.
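For anyone less inclined to procrastinate than I am, the markup is smaller than it sounds. A minimal JSON-LD block dropped into a page's head looks roughly like this (every value here is a made-up placeholder, not from my site):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Hypothetical article title",
  "author": { "@type": "Person", "name": "Anonymous" },
  "datePublished": "2012-01-15"
}
</script>
```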
Yes.... we place a lightly colored box under each image and use that as a space to give a generous keyword-rich description. We also use that space to attribute the image to source or creator - sometimes with a link.
A typical article might have 2000 words, six images and several hundred words of image descriptions.
We also love to include data tables in our articles. These could be locations, numbers, names, etc.... whatever small data summaries might add interest to the article.
Yes. We have a few niche retail sites and an info site that are all competitive in their niches.
It is not about quantity. It is all about quality. When we put up content it is usually the best page on the web for the keyword that we are targeting. We post it onto the site, it usually does not rank very well at first, but a year or so later many of the pages that we have made are at the top of Google. Some are not. Put up the pages and don't worry about them. If they are good their rankings usually rise.
If I were going after local terms I would be using local search optimization methods. Most of the queries that we are going after have nothing to do with a local service or product.
Post everything that you do on your homepage. Send it to everybody who knows you (and has opted in). We have a couple of blogs that each get several posts per day. All of those posts are automatically displayed on relevant pages across our site. They go out to Twitter and FB.... we also have email and RSS feed subscribers. We use AddThis buttons all over the site to make it very easy for visitors to share.
A few times per year we have a topic that we are excited about and will submit it to sites like reddit, digg, slashdot. That usually doesn't work - but when it does the traffic is great.
I only work on my own sites.
We have a lot of short articles on our site that were first posted several years ago. We are enhancing them with much more substantive text, more images and captions. All of this information is on the same topic - just greater detail.
As we add this new information we see an immediate increase in long tail traffic as search incorporates the new words that appear on the page and new images get into image search. We also usually see improved rankings.
My question is... how much can this really help you to rank for 'insurance london ontario' if all you're doing is building links to that blog article, and not the main page?
Using Wikipedia as an example.... When they place a new page on their site it will often rank well in competitive SERPs based almost totally on links that hit other parts of their website. Domain level factors such as authority and trust make these new pages rank well.
I know it helps the overall domain authority, but is it enough to get you ranking for your goal phrase, or is it just a supporting method?
Is it enough to get you ranking? This method is like trying to sink a boat, but instead of dropping bombs that will blast a hole in the boat you instead drop bombs beside the boat hoping to splash enough water in to sink it. Not very effective by itself.
However, I think that it is a good supporting method, because every one of those links helps, and a website that has 100% of its links pointed at its homepage will not look like as good a resource as a website that has a diversity of domains linking to pages all over the site.
This is the method that I am using on one of my sites. I don't do link building. Instead, I simply produce content that slowly accumulates links and causes the entire site to slowly rise in the rankings. It's not a fast way to get your homepage ranking but I am slowly saturating the SERPs for all of the minor queries. The long-term results work well.
I would make the development of some unique marketing statements that fit the community of the franchisee a required part of the franchise application.
This will force the franchisee to learn about the business, think about how it will work, how it should be perceived by the public and do that in the unique context of their community location.
Get this information and effort out of them while they are still hungry for the opportunity.
Give them a template to make it easy. Then congratulate them on developing all of the information needed for their new website.
This will benefit both you and the franchisee.
I would be demanding about them doing a great job on this.
For the question that you asked, yes, to rank for a term you need a good relevant page of content with very strong social signals (links, likes, etc.) to promote it.
Thanks Matt! That makes sense.
Right... those resources would be content and an ability to get social attention such as links, mentions and likes. Those links could be earned on the merits of the content, attracted by the popularity of the brand or purchased.
Some companies spend a hideous amount of money on links.
After your pages are optimized (which requires smarts) then you need to devise a way of demonstrating the importance of your site to the search engines, which requires smarts to plan and resources to achieve.
Matt,
What do you think about humorous video? That would be great for getting links and attention... but would that attract customers?
Sun Tzu says that you can "know how to conquer without being able to do it".
Even if you have an advanced knowledge of the algo, in an extremely competitive space the competition is a "battle of resources".
"I am working with a pest control company that is looking to branch out into social media. Their primary goal is to connect with their existing customers to improve retention."
I don't think that the average person is looking to link up with and stay connected to their bug guy about bed bugs, rat droppings and flea problems. They want a bug guy who will fix that problem silently and get out of their space ASAP. If bug guy can do that then they will call him if they ever have another problem.
My message to people would be that we fix your problem quickly and with minimal visibility.
Posting about bugs on a daily basis gets pretty boring.
ABSOLUTELY!
I would go with informative content that people can share when conversations arise.
Reading your list of possible topics.....
bicycle maintenance
wildlife photography
tennis racket re-stringing
brewing beer
windsurfing holidays
I would register a domain like FunHog.com and attack.
You need two things.... First you need smarts. Then you need resources. If you fail in either of those then you will be ineffective.
Two possible exceptions to the above would be replacing smarts with "dumb luck" and replacing resources with something "incredibly viral" - both of which have a very low probability of happening.
I didn't know that google was complaining about your links.
If you or someone you hired bought links, traded links or made arrangements to have links created for your site then you might try contacting those webmasters, tell them that google doesn't like the links and ask them to remove them.
Good luck.
Lots of these types of links are simply part of the content on spammy websites. They are often produced by a robot that scrapes websites and republishes snips of their content, including links. The more popular your website becomes the more of these types of links you will acquire. In some industries there are enormous numbers of spam websites. My site has a million of these links and I am not going to do anything about it. It would cost thousands of dollars in labor just to make "an attempt" at contacting these people and I bet that the response rate will be really really low. Google knows that these sites are trash. I don't think that those links are helping or hurting very much at all.
Just a couple of questions....
What work have you done with these clients to understand their business and determine where their products and services should be visible in search? Perhaps they need a marketing plan before linkbuilding?
Also, for each query where they hope to be visible they will need a page of attractive, informative, and compelling content. Perhaps they need content development so that each built link will hit a page that effectively promotes their business?
The above might be part of the linkbuilding proposal. It will help educate them and identify the targets and content needed for the linkbuilding to be successful.
If this is an important page on the site that has many links into it from pages across your site then I would not worry about exceeding the 100 links.
Watch Matt Cutts here... http://www.youtube.com/watch?v=l6g5hoBYlf0
Philipp makes good points that ads can divert attention from your brand and your sales products. I agree with him.
However, my site is still selling a lot of merchandise. I don't have ads in people's face on merchandise pages. If there is an ad on a merchandise page it is at the bottom - most don't have ads. My ads focus on article pages.
Finally, you can block certain types of ads and also ads from competing domains. AdSense, Tribal Fusion and most other ad networks have a variety of ad-blocking methods.
What is a good amount of pages to have?
Your long-term goal can be to have pages for all of the important keywords of your niche. I have a site that I have worked on every day for years with employees and there is still an ocean of content that we can produce. Endless job.
I have another site in a much smaller niche that we are producing content to the point of keyword cannibalization. The goal is to absolutely saturate the keyword range of the niche.
Lots of retail sites have extensive article libraries that attract traffic, likes, links and make the site popular. These articles often describe how the products are used and are especially valuable on sites in do-it-yourself, personal improvement and hobby niches. I have a retail site with a lot of how-to-do-it, historical and review content and those articles account for about 1/2 of the traffic. They also produce some sales. In addition, I monetize them with ads.
On a more powerful scale is an information site with a store. These can be really popular and be monetized with house ads that funnel traffic into the store and third party ads that produce income. I have one of these that is supported by ad revenue and a store that sees revenue growth in proportion to the traffic - as most of the purchases are impulse. In addition, your sales will be tied to the effectiveness of your ads and their placement - experimentation is essential if you want to get the most out of them.
I found a great writer by going out and looking at blogs in my niche. This allowed me to see the quality of the writing and the productivity of the person over time.
Also, if you need general writing, you might be able to hire a person who graduated with honors from an English writing program. I ran an ad on Craigslist - made the qualifications really steep - and was very surprised at the quality of the people who applied.
"indexing time stamp"
I can say with absolute confidence that an "indexing time stamp" does not exist with Google - or - it does not work at least 1/2 of the time.
I thought Panda was supposed to enhance that ability rather than do just the opposite and punish the origins of the content.
From what I have seen, Panda is a domain-level throttle that impacts sites that trigger an invisible trip wire. The presence of duplicate content (you duped somebody or somebody duped you) and slapping visitors' faces with ads are possible locations for the wire.
Google has no reliable way to know the originator of content unless author pages and reciprocal rel="me" links are in place - and those can be forged by others.
Thank you Alan. I think that is a really good idea.
On the site with the long articles, all of the content that I have written appears there without attribution. I have hundreds of articles on the site and enjoy them being anonymous.
However, I know that your suggestion might fix or reduce the problem. I've thought about doing it in the past. I need to put some thought into claiming the content. It would probably increase my income.
Thanks for making me think about this again. You deserve more than one thumbs up for your reply.
I have some pages that once performed well and suddenly started to decline. They were pages with product descriptions that I had written a few years ago. I looked for dupe content and found that lots of domains featuring products that are "made in China" had copied my text. I don't know if that was the cause, but maybe.
I also have a couple articles of a couple thousand words on a topic that currently is getting a lot of search. I noticed that my long tail traffic for these pages was declining. I found that a lot of spam sites (dozens and dozens) had grabbed a few sentences from my site and a few from several other sites, slapped them together and were now ranking for very long tail queries.
I see.
Everything after the ? is a parameter.
You can use rel=canonical to pass the credit for these links to the base url on your website. Or, you can add code to your .htaccess to 301 these URLs to a genuine address on your website.
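As a sketch, the two options look like this (the /widgets/ path and example.com domain are placeholders for your own URLs):

```html
<!-- Option 1: on the page served at the parameterized URL -->
<link rel="canonical" href="http://www.example.com/widgets/" />
```

```apache
# Option 2: in .htaccess, 301 any query string on /widgets/ to the clean URL
RewriteEngine On
RewriteCond %{QUERY_STRING} .
RewriteRule ^widgets/$ /widgets/? [R=301,L]
```

The trailing ? on the substitution strips the query string, and the RewriteCond keeps the rule from looping once the query string is gone.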
I see a lot of URLs with garbage characters in my link reports.
My guess is that these are links generated by spammer programs that contain errors.
Anybody agree with that?
I visited random pages on the site and grabbed a random block of text, placed it between quotations in the google search box and submitted the query. I did this five times and four of them returned duplicate content on other domains. These grabs of content were not Bible quotes. They were information that I would assume to be unique to this site as they were unattributed.
A site that has a lot of duplicate content - even if it is used with permission, under license, or plagiarized by others - can suffer from Panda.
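The quoted-snippet check above is easy to semi-automate. Here is a small Python sketch that builds the exact-phrase query URL for a snippet of page text; you would still paste the URL into a browser, since scraping Google's results programmatically is against their terms (the sample snippet is made up):

```python
from urllib.parse import quote_plus

def phrase_query_url(snippet):
    """Build a Google search URL for an exact-phrase (quoted) query."""
    return "https://www.google.com/search?q=" + quote_plus('"' + snippet.strip() + '"')

# Example with a made-up block of page text:
url = phrase_query_url("a random block of text grabbed from the page")
```

If the results page shows the same sentence on domains other than yours, you have found duplicate content.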
There is no standard answer to your question. In addition to Mcarle's great answer, you can see what the creator of an image has to select from here... http://creativecommons.org/choose/
If you simply move a site from Server A to Server B or from Hosting Company A to Hosting Company B there should be no change in your rankings.
Sometimes slowness is not the fault of the server or the host. There are lots of things that you can do to speed up your site. These include: optimizing images, removing code bloat, caching pages on the server, reducing the number of files called per page, and much more.
We cut average load time by nearly two seconds per page by doing the above.
If you are calling in widgets, images or data from other domains they can really add to your load time.
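One of the cheapest wins in the list above is browser caching for static files. Assuming an Apache server with mod_expires enabled, a few lines in .htaccess are enough (the types and lifetimes here are just illustrative):

```apache
# Cache static assets in the visitor's browser (requires mod_expires)
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png  "access plus 1 month"
  ExpiresByType text/css   "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```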
I have 301 redirects on my sites. Every one that I have done is still out there in htaccess. I am not taking chances on how search engines handle these.
Other than deleting useless pages, I rarely change URLs. If I don't change URLs I don't have to worry about this stuff.
I have not emailed other sites to change the URL in their links. I have only changed the URLs on my own sites. I would worry that asking someone to edit a link might result in a loss... so I am happy with the redirected link.
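For reference, the redirects I keep in htaccess are nothing fancy - just one line per retired URL (these paths and the domain are hypothetical):

```apache
# One permanent redirect per retired URL
Redirect 301 /old-article.html http://www.example.com/new-article/
Redirect 301 /deleted-category/ http://www.example.com/
```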
Your redesign guy could have used enormous-size images in the design. He could have added code that takes time to execute. There could be calls from other websites for ads or widgets or images that are slow.
Let's say that my website has ten links to your old URLs that deliver hundreds of visitors per day.
What will happen to all of those visitors if you remove the 301 redirects?
With a name like that, even the PhDs at Google will be able to tell that they are trying to manipulate SERPs.
I am wondering why these companies pick names that paint targets on their butts.
What are your objectives?
If you have "a message to get out" then you should be sharing your content everywhere with everybody.
If you are trying to make money and develop a reputation for your site then ask yourself these questions....
- what will happen to my income if ten other sites have each of my articles in the Google index?
- who will get the social sharing for all of my hard work?
- why are these people asking to use my content instead of linking to it or tweeting about it?
The only option that you list that I would do is... "Only let people cite the introductory paragraph sentence, with a link allowing them to read more at our site (with good anchor text)"
Only let others use the exact content well after we've been crawled, so we'll be seen as the first publisher.
You publish first and then an authoritative site publishes.... they will outrank you. They will receive your traffic. They will receive your links, likes, tweets, etc. Thanks!
I used to have lots of little sites. Lots of them.
Then I built a big site and it beat all of the little sites and almost every one of their competitors. Traffic went up. Sales went up. Average shopping cart size went up. Profits were higher.
When people land on a little site that sells brass widgets they might like what you have. But if they land on a big site that has that same information about 50 different widgets, it is a lot more impressive. They really know that you are in the business. They don't get that with a hotdog-stand site.
Lots of people believe that they should have a large number of sites because they believe that it will produce a "linking scheme" that will give them an advantage. It doesn't. If you can get 500 links it is better to have all 500 pointing at a big site than 10 pointing to each of fifty sites.
That's my two cents.
If I had a project like this I would make one big asskicking site.
In my opinion, that will be better for SEO, better for cross-selling and better for efficient use of your time.
There is a website... http://whichtestwon.com/
They show the results of lots of testing. There you can look at two different versions of a page and find out which performed better.
If you view lots of their tests you will slowly start to get an idea of what types of change can be effective, although there will always be results that surprise you.
Here is the question... is there an acceptable ratio between the percentage growth in visits and the percentage decrease in conversions?
I have found that ratio to be highly variable. For example, you start out with a retail site that tightly targets the products that you sell. Then you branch into articles that inform people how to use those products. This free content can attract a lot of visitors who are looking for information. However, these people may have already purchased or are considering the purchase. Perhaps they are even long tail keyword visitors with queries that are unrelated to a purchase.
So, the conversion rate can really fall.
What I try to do is develop a different type of conversion, such as identifying pages that pull lots of nonconverting traffic and placing AdSense ads and house ads for your own products on those pages.
If you sell at a good discount this can really be effective because the visitor sees your kickass price... then visits your advertising competitor who sells higher... so the visitor comes back to you for the purchase. That way you make the sale and take a piece of the competitor's advertising budget.
Ads don't hurt my sales much at all and are easier money.
If this solution works you have a lot of potential customers.
Here is a thought... Since Google is demoting sites that have lots of duplicate content, if you use <meta name="robots" content="noindex, follow" /> on all of the duplicate pages, then the pages that remain in the index will have a better chance of ranking.
There are also other ways to keep them out of the index.
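For example, if the duplicate pages share a URL pattern, you can send the same noindex signal as an HTTP header from .htaccess instead of editing every template. This is a sketch assuming Apache with mod_headers, and the filename pattern is hypothetical - adjust it to match your duplicate URLs:

```apache
# Alternative: send a noindex header for a whole class of duplicate pages
<IfModule mod_headers.c>
  <FilesMatch "^printer-friendly-.*\.html$">
    Header set X-Robots-Tag "noindex, follow"
  </FilesMatch>
</IfModule>
```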
2. Given the massive product database (35,000) and retailers (2,500) it is not feasible to re-write 87,500,000 pages of content to sate unique content needs. Is there any way to prevent the duplicate content penalty?
Everybody everywhere is asking this question. "I have twenty-five websites that sell the same product and I use the same product description, photos, captions, title tags, etc. on every one of them. Is there any way to fool Google into believing that these are unique?"
Your problem is 100 times larger.
Looking at the history... Google has been killing "instant storefront" websites for the past several years.
If you figure out a way to do this you will be able to make a lot more money selling the solution than you are going to make from your 35,000 products.
I think that is the money making opportunity.