That refers to the number of inbound links, linking domains, and linking C blocks that point to a specific page.
Basically, getting more links to your page about blue widgets will increase that page's rankings.
Our site is doing the same. EZWatch-Security-Cameras.com
Restructuring allows you to organize your content into "Silos" and eliminate some of the unnecessary links through JSON objects, iframes, and nofollows (some debate there on which is optimal).
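For reference, the nofollow option mentioned above is just a rel attribute on the link itself; the URL here is a placeholder:

```html
<!-- A normal link passes value; rel="nofollow" asks engines not to count it -->
<a href="/accessories/" rel="nofollow">Accessories</a>
```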
The Downfalls: There will probably be a 30-90 day dip in traffic, but if you do page-to-page 301s it won't hurt nearly as much. Anytime you change URLs you are likely to see some inbound links drop off, but that happens even when the pages don't change, so it should be minimal. Sure, 301s might drop some of your PageRank during the transfer, but only a minimal amount according to Matt Cutts.
The Benefits: Organizing your content into silos and pruning cross-category links will allow you to control the flow of PageRank and anchor text much more effectively. Bruce Clay has a huge amount of resources on this, and they even cover it on their blog.
Will it be painful? Yes.
Will it be worth it? Yes.
If you use the asynchronous tracking code, it shouldn't cause any noticeable lag in page load times.
Placing the code inside the body tag won't validate through W3C. There's really no reason I know of to place it anywhere else.
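For reference, this is Google's asynchronous ga.js snippet as documented at the time; it goes just before the closing head tag, and UA-XXXXXX-X is a placeholder for your own property ID:

```html
<script type="text/javascript">
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXXX-X']);  // replace with your property ID
  _gaq.push(['_trackPageview']);

  (function() {
    // Inject ga.js asynchronously so it doesn't block page rendering
    var ga = document.createElement('script');
    ga.type = 'text/javascript';
    ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);
  })();
</script>
```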
Using CSS Absolute Positioning.
You need to add a hook to the P tag, such as a class or ID. Then make room for the paragraph at the bottom of the page by placing some sort of spacer there, like an empty div with fixed width and height.
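A minimal sketch of that approach; the id, class, and dimensions below are made-up examples:

```html
<style>
  body { position: relative; }   /* containing block for the absolute child */
  #seo-text {                    /* the "hook" added to the paragraph */
    position: absolute;
    bottom: 0;                   /* pin the paragraph to the bottom */
    width: 100%;
  }
  .spacer { height: 80px; }      /* empty div that reserves room at the bottom */
</style>

<p id="seo-text">This paragraph appears early in the source but renders at the bottom.</p>
<!-- placed last in the layout so the positioned paragraph has room -->
<div class="spacer"></div>
```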
I don't think that Google would penalize one site because it shares a common owner with another site of poor value. However, it's likely that the common owner utilized the same tactics across all sites, and something THEY have done could have caused the penalty.
As far as the ban on AdWords, again, I don't think it would negatively affect a site right out of the gate.
If you've been banned from AdWords and you have multiple sites suffering a penalty, it sounds like there's an issue with the overall approach to SEO.
Google says they don't count nofollowed links, but of course they say a lot of things. If it's a link you really expect absolutely zero traffic from, don't waste the time working for it. That being said, if you're building links that will never drive traffic, maybe you should rethink your link strategy.
Recently Google has really downplayed the importance of exact match domain names. If the domain name is newly registered it probably hasn't earned any links yet, and therefore wouldn't pass any SEO value.
Copyright notices are really just a signal for users, a way of saying "don't steal our stuff" in a somewhat non-forceful way. People can generate the year automatically, but sometimes they just forget to include it.
Is it a big deal to have an older date? Probably not. Google has a number of other ways they could check a page's freshness aside from a completely arbitrary copyright date.
An interesting note about copyright, you don't really have to display a copyright notice in the first place. As soon as you create something, it's already considered "copyrighted."
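Generating the year automatically, as mentioned above, is a one-liner in most languages. A minimal JavaScript sketch ("Example Company" is a placeholder):

```javascript
// Build a copyright notice with the current year so it never goes stale.
const year = new Date().getFullYear();
const notice = '© ' + year + ' Example Company';
console.log(notice);
```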
The most common services we see through AddThis are actually Facebook and email.
zharriet has a good point. You could really do the same thing with a few icons and skip the javascript.
I've seen some minimal numbers, probably just a few +1's on my company's homepage. I think as Google+ becomes more popular (which I think it will), people will start actively using the +1 button.
What is the size of a retargeting pixel, and who places it on the site? Is it the retargeting company? Can we place it ourselves? Does a code have to be added to the site?
The retargeting pixel is simply a way to record a cookie on the user's computer. When you set up the retargeting campaign, either through Google or some other vendor, you should be given a code snippet to add to your site.
What do you mean by burn pixel?
When a user converts and you no longer wish to target them, you can send another pixel (cookie) to identify which users have already converted, somewhat disabling the previous cookie.
How do we come to know that a burn pixel has fired?
The burn pixel is placed on the user's computer in the same way the original pixel was: they load the page containing the "burn pixel," and the cookie is stored on their computer.
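Mechanically, both tags are usually just an invisible 1x1 image request that lets the ad server read or set its cookie. The vendor domain and parameters below are placeholders; your retargeting provider supplies the real tag:

```html
<!-- Retargeting pixel: fires on the pages you want to tag visitors from -->
<img src="https://ads.vendor-example.com/pixel?list=prospects"
     width="1" height="1" style="display:none" alt="" />

<!-- Burn pixel: fires only on the conversion/thank-you page, so converted
     users are moved out of the targeting list -->
<img src="https://ads.vendor-example.com/pixel?list=converted"
     width="1" height="1" style="display:none" alt="" />
```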
Is there a reason you have 25 different pages? Could you consolidate everything into one page or does each page offer a separate download item?
If they are different tasks, they should have different titles. If they are the same, then there should probably only be one page.
I would request that they create a special email address for you, such as nomad(at)site(dot)com and use that for the facebook business admin.
As mentioned above, fan pages have a lot more useful tools. You can see some cool examples at facebook.com/redbull and facebook.com/cocacola
Not specifically, but with the rel=canonical tag in place, Google should ignore them.
Google views the canonical tag as a guideline, not necessarily a command. They do this to prevent people from shooting themselves in the foot by misusing or improperly applying the tag. It may be useful to still have pages crawled regardless of the canonical tag.
I suggest that you ignore the warnings for canonical pages and focus on the others. Perhaps SEOMoz could build some logic into their crawls to handle this in the future.
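For anyone unfamiliar, the canonical tag itself is a single line in the head of each duplicate page, pointing at the preferred URL (example.com is a placeholder):

```html
<head>
  <link rel="canonical" href="http://www.example.com/preferred-page/" />
</head>
```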
The search engine spiders will look at a page to determine if something has changed. If there's only a small change in the body content (say 10%), it might not trigger a re-index. If the heading tags change, it probably deserves a refresh.
I think your link will be indexed eventually, but if nothing else on the page changes, it's going to take longer.
In that case, I would probably focus my efforts on a parent page. The keywords you are probably targeting won't have a lot of competition, and a high ranking top level landing page will also contribute to the rankings of lower level pages.
In addition, the concept of "Evergreen" comes to mind. When you cycle through model numbers or inventory, you would lose the value you've built into the lower level pages. If, instead, you focus on your top level "Evergreen" pages, you can update those pages without losing rankings.
1) Best way to redirect all new URLs sitewide
Page-to-page redirects are the best solution, even though they're a major pain. I would take a look at your traffic data for at least the past 90 days, possibly even further, and create a 301 map from every page with inbound traffic to the corresponding page on the new site.
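If the site runs on Apache, the page-to-page map can be a list of one-line mod_alias directives in .htaccess; the paths here are made-up examples:

```apache
# One permanent (301) redirect per old URL, mapped to its new home
Redirect 301 /old-products/widget-123.html /products/widget-123/
Redirect 301 /old-products/widget-456.html /products/widget-456/
```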
2) The prudence of heavily editing product listings at the same time of redirecting the URL (i.e. updating product descriptions)
If you're up to it. However, if rankings drop, you won't be able to trace it back to a particular change.
3) Site structure: Should I strive to keep the new site link structure as similar to the old as possible?
If the current site structure converts well, then maybe you should stick with it. The nice thing about switching to a new platform is it's a great opportunity to reinvent your site if you feel like it needs it.
4) Resources or guides on transitioning a site from a SEO perspective
This article might be a little dated, but it should point you in the right direction. http://www.seomoz.org/blog/10-things-relaunch-your-website
Good luck!
I'm thinking that by linking out to Mobile Casinos and Polish Rock Bands, he's probably losing credibility.
A great article. I'd suggest you read through that post completely.
If you're in position one, you probably want to hold onto it. I don't think it would hurt to do something like that but I wouldn't worry if you're fairly competitive already.
I think the real benefit might be in picking up some long-tail keywords if that's what you're going for.
PDFs are crawlable; see this article from Webmaster Central.
With that in mind, it's also a good idea to have the content available online as HTML. It's accessible from a wider variety of devices and doesn't require you to update your Adobe Reader.
It sounds like you want to find some way of integrating the videos with their current e-commerce site, but you don't want to push the category links below the fold and ruin the user experience. Is that correct?
I think the best way to do that is to have a separate page for the video and transcription, as well as the sharing icons, etc. Then, create a thumbnail image to place on the relevant landing pages and include a short description so that it fits nicely within the site flow but doesn't move the content too far down.
Another option is to have a link to the video page using a header banner (468x60) or similar with a clean call to action. That would only push the page content down 70 or so pixels.
I think the description and alt tags may count towards the keyword count/distribution, but Google ignores the keywords tag for most search purposes.
Were you building inbound links to those pages?
If it was only down for a few days, it's likely that you won't lose any of the inbound links. If the pages are still down, you should probably create 301 redirects so that visitors don't hit a 404 page.
If there weren't any inbound links to those pages then you should be okay link-wise.
Yahoo and Bing both use the same index; that's the portion Yahoo has licensed from Bing.
As far as the ranking algorithm is concerned, Yahoo uses a large amount of Bing's algorithm, but they have built (or are going to build) a customization layer on top of it to change how different metrics are weighted.
Yahoo's results could therefore differ drastically from Bing's, simply because they weight metrics differently.
The SEO Toolset from Bruce Clay is $29.95/mo and it includes an on-demand ranking checker - http://www.bruceclay.com/seo/tools.htm
I'm not sure about the PDF export, but it definitely does Excel.
I think that the number of links you're gaining might cause a problem, not to mention the quality of those links. Google assumes a natural pattern in link acquisition, so when you suddenly jump from 5 links per week to 500, that triggers a red flag because it doesn't appear natural.
How many hours are you spending on this link directory? Perhaps you could take the same amount of time to write a well-researched guest post, or some other type of op-ed piece.
That type of content is much more likely to earn the type of high quality targeted links you're looking for, without junking up your link profile.
Open Site Explorer should display the Page Authority and Domain Authority scores for the inbound links listed. Those scores are out of 100, and the higher they are, the better. A quality link could have a comparatively high Page Authority, Domain Authority, or both.
Tagging posts provides a good user experience, allowing readers to scan through a number of related posts. It does create some duplicate content issues, though.
To solve this on my own WordPress sites, I've added a noindex tag to each of the "archive" pages, which includes tags, categories, authors, and dates. This effectively cuts out the duplicate content areas, and I haven't seen any unusual fluctuations in search traffic.
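The noindex in question is just a robots meta tag emitted in the head of each archive template; "follow" keeps the links on the page passing value (most WordPress SEO plugins can add this for you):

```html
<meta name="robots" content="noindex, follow" />
```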