Quite correct. I unfortunately assumed that everyone already knew this bit of information.
FrankWickers
@FrankWickers
Job Title: President & CEO
Company: Digizle
Favorite Thing about SEO
Ever evolving challenge.
Latest posts made by FrankWickers
-
RE: Redirecting multiple websites to a single website
You should be able to mass redirect all of the sites at one time and be fine. I've done this for a client once before, though in that case we redirected only 11 domains at once. I can't fully vouch that it will play out the same with 30 domains... hopefully someone else can lend their experience with a higher number like that.
Something else you really want to take into consideration here: do you want to do a domain-wide redirect, or a page-based redirect? The latter entails far more work, but depending on the circumstances it can also offer greater benefit.
With a site-wide redirect, you will of course be funneling everything from the old site onto whatever target page you choose on the main site (likely the home page, I'd guess). When Google notices that a domain has a site-wide 301 on it, that domain will quickly start to fall out of the index, and the link power it had can also start to degrade surprisingly quickly. Pretty straightforward stuff overall.
If you were to use a page-by-page redirect method, you could custom-tailor which new pages you'd like the old link juice to flow to. Say one affiliate site has a hub page for custom truck grilles with a large number of links coming into it. You could specifically 301 that affiliate page to its "sister" page on the main site, which would boost the target page more than a blanket site redirect would.
One more situation to take into account... if the content across your affiliate sites and the main site is all similar, you could use that second, page-by-page technique with rel=canonical tags rather than 301 redirects.
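If it helps, here's a rough Python sketch (using the requests library) of how you could sanity-check a page-by-page mapping once it's live. Just note the domains and paths below are made-up placeholders, not anyone's real URLs.
```python
# Rough sketch: verify that each old affiliate page 301-redirects to its
# intended "sister" page on the main site. Domains/paths are placeholders.
import requests

REDIRECT_MAP = {
    "http://affiliate-example.com/custom-truck-grilles/": "http://main-example.com/truck-grilles/",
    "http://affiliate-example.com/floor-mats/": "http://main-example.com/floor-mats/",
}

def check_redirects(mapping):
    for old_url, expected_target in mapping.items():
        # Don't follow the redirect; inspect the very first response.
        resp = requests.get(old_url, allow_redirects=False, timeout=10)
        location = resp.headers.get("Location", "")
        ok = resp.status_code == 301 and location.rstrip("/") == expected_target.rstrip("/")
        print(f"{old_url} -> {resp.status_code} {location} {'OK' if ok else 'CHECK THIS'}")

if __name__ == "__main__":
    check_redirects(REDIRECT_MAP)
```
Running something like that against your full old-to-new mapping makes it easy to catch pages that accidentally 302 or point at the wrong target.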
-
RE: Front page dropped to PR1 - thoughts?
Another thing you need to examine... the company hasn't bought any links, or become involved with anything that might be considered 'Black Hat' SEO, has it?
I only ask because if any black-hat SEO occurred and Google even partially caught on to it, they might have intentionally lowered your toolbar PageRank to send you a message before taking any sort of direct action.
-
RE: Need solid, no jargon definitions
Linking root domains count links from any page on a given domain; they all roll up to that one root domain.
I, personally, would be somewhat careful about the foreign-sites idea, especially sites in other languages. My view is that 1) your linking tactics might start to look manipulative to the search engines, and 2) you might unintentionally start doing a bit of "international SEO," which could have consequences that, while not necessarily bad, wouldn't be the best for the issues you actually need to solve.
That's my own view on the subject, though, so certainly don't take it as Gospel.
-
RE: Linkscape update tracking
Hi Vinny,
In general Linkscape updates every 30 or so days. So if you gain a new link or three partway through Linkscape's cycle, it's very possible it won't be included in the most recent update. Also, as awesome as it would be if Linkscape's index were as large as Google's, it isn't. I'd guess that's mostly due to technology constraints and not having dozens of ENORMOUS data centers like Google. Because of this, there is a chance that some links won't be picked up (especially if they're on really new sites, or maybe rather spammy ones). So it really turns into a waiting game to see whether the links you know exist are picked up on the next Linkscape cycle or not.
Finally, unless I've managed to totally miss it this whole time, the index doesn't keep any sort of history. The amount of data storage required to maintain a history, on top of how often they currently update the index, would be crazy. To name-drop one of SEOMoz's competitors (sorry guys XD ), majesticseo.com does offer historic index data. I don't know how in-depth it is when you're fully subscribed, as I've never been a paying member of their service, but it may be worth checking out.
And if someone comes into this question and corrects me about SEOMoz having historic index access, definitely do that instead! I know I will =P.
-
RE: Need solid, no jargon definitions
As Craig stated, it's a good idea to read the Beginner's Guide if you haven't already.
However, if you're just trying to make sense of the metrics you put up I'll see if I can help.
Total Links - This is the total number of links coming into a given page or domain (depending on which view you've selected: page-based or domain-based).
Ext. Followed Links - The number of links pointing to a page that come from outside your own domain (i.e., not internal links) and do not have the rel="nofollow" attribute on them.
Linking Root Domains - Really, it's as simple as it says: the number of individual, unique root domains linking to a site. Two different root domains would be something like site1awesome.com and thisisawebsite.com.
Followed Linking Root Domains - Same principle as "Linking Root Domains" above, but this one excludes links from domains that have added the rel="nofollow" attribute to those links. It only counts followed links from root domains.
Linking C-Blocks - Sort of difficult to explain. I'd recommend reading the following thread on the Search Engine Watch forums, paying particular attention to the first reply. It should hopefully sum up C-blocks for you.
Hopefully I've helped you understand the metrics better than before reading all of my chicken scratch.
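If a concrete example helps, here's a rough Python sketch of how those counts relate to each other, computed from a completely made-up list of inbound links. The root-domain and C-block logic is deliberately naive (real tools are smarter about things like the public suffix list), and the IP addresses are just part of the invented sample data.
```python
# Hypothetical illustration of the metrics above, computed from a made-up
# list of inbound links: (source_url, source_ip, nofollow_flag).
from urllib.parse import urlparse

OWN_DOMAIN = "example.com"  # assumed: the site whose metrics we're counting

LINKS = [
    ("http://site1awesome.com/page-a", "203.0.113.10", False),
    ("http://site1awesome.com/page-b", "203.0.113.10", False),
    ("http://thisisawebsite.com/post", "198.51.100.7", True),
    ("http://example.com/internal",    "192.0.2.1",    False),
]

def root_domain(url):
    # Naive: keep the last two host labels. Real tools use the public suffix list.
    host = urlparse(url).hostname or ""
    return ".".join(host.split(".")[-2:])

def c_block(ip):
    # "C-block" here = the first three octets of the IPv4 address.
    return ".".join(ip.split(".")[:3])

external = [(u, ip, nf) for u, ip, nf in LINKS if root_domain(u) != OWN_DOMAIN]

print("Total Links:", len(LINKS))
print("Ext. Followed Links:", len([lnk for lnk in external if not lnk[2]]))
print("Linking Root Domains:", len({root_domain(u) for u, _, _ in external}))
print("Followed Linking Root Domains:", len({root_domain(u) for u, _, nf in external if not nf}))
print("Linking C-Blocks:", len({c_block(ip) for _, ip, _ in external}))
```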
-
RE: Link Acquisition Assistant
That goes hand in hand with what's discussed in the 2 videos I linked in the answer above (which I edited in afterwards, so you may have missed them before). I'd definitely recommend watching both videos. I believe they'll help answer a good number of your questions.
-
RE: Link Acquisition Assistant
Hi Peter,
The answer would be both yes, and no.
If the directory is ranking high on the Google SERP then it stands to reason that the page and domain in question likely have good ranking metrics and signals. If that is the case then there is a good chance it could be a strong link to help your site.
However, here is where the "no" part of the answer comes into play. It depends where the link will actually be placed on the site. If the link isn't going to be on the exact page you land on from the SERP, then you aren't getting the entire story. You'd have to do some digging to understand which page the link will actually be placed on, and what the metrics of that page are.
Depending on where the link would end up, the #6 result on the SERP could actually provide a stronger link for you than the #1 result. Other things to take into consideration, of course, are the topical relation between your page and the linking page, and the total number of links on any given linking page.
I'd suggest watching the following recent Whiteboard Friday videos, if you haven't already. They can help you better evaluate the potential link value from individual pages and/or sites.
1) http://www.seomoz.org/blog/which-link-metrics-should-i-use-part-1-of-2-whiteboard-friday
2) http://www.seomoz.org/blog/which-link-metrics-should-i-use-part-2-of-2-whiteboard-friday
-
RE: Should I try to optimize for SEO a site that lives only for 5 days
If the site is only going to be live for 5 days, I wouldn't put much effort into heavy optimization. I would only do basic on-page stuff that takes all of 5 minutes of your time - title tags and such.
The reason being there's a good chance the Google bots won't even come by and index the site within the 5-day period.
-
RE: Viewing Specific Links in OSE - For a Specific Month
I actually don't know if OSE gives that option (and if it does, I've been missing it this entire time), but I've used majesticseo.com for a while now to do that sort of tracking/research. Their tool gives a pretty accurate report of link discovery, including how many links were discovered in a given month.
Best posts made by FrankWickers
-
RE: SEO on a mature site - diminishing returns?
Are there diminishing returns for optimizing a site? No. At least not as far as being penalized for organically achieving new rankings.
Now then, you say you're getting good high rankings for various search terms, but you're not seeing increased traffic. Let's figure out why that could be.
First of all, when you're checking your rankings, make sure you're not receiving personalized results from having clicked on your own website too many times. Make sure you are logged out of any Google accounts you may have, and add the parameter &pws=0 to the end of the URL for the searches you do.
If personalization is not affecting you, and you are achieving those ranks, then you likely have some other problems. There are 2 things most likely affecting you.
1st) Are you using the right keywords? If you're a company that offers a 'credit card fraud investigation service', you don't really want to be ranking for terms like 'credit card fraud'. That term's intent isn't specifically related to what you need. Someone searching for credit card fraud is probably looking for information about, can you guess it, credit card fraud! They're probably not looking for people who investigate credit card fraud.
2nd) Your search description snippets might not be working well for you. Make sure you have a meta description declared on every page that is ranking, or is otherwise important, that clearly describes the content on that page. If you don't have meta descriptions, when someone sees your site in a search result they could be seeing a snippet that looks something like "We have been in service since...... keyword is what we do if..... we don't think that way". Basically, they're seeing something that makes no sense in relation to what they searched for.
If you declare targeted meta descriptions, you're likely to have a better click-through rate, and thus more traffic.
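If you want a quick way to audit that, here's a rough Python sketch (requests plus the standard library) that flags pages with no meta description. The URLs are placeholders - swap in your own ranking/important pages.
```python
# Hypothetical sketch: flag pages that have no meta description declared.
# URLs below are placeholders for your own important/ranking pages.
from html.parser import HTMLParser
import requests

PAGES = [
    "http://www.example.com/",
    "http://www.example.com/credit-card-fraud-investigation/",
]

class MetaDescriptionFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.description = attrs.get("content") or ""

for url in PAGES:
    html = requests.get(url, timeout=10).text
    finder = MetaDescriptionFinder()
    finder.feed(html)
    if finder.description:
        print(f"{url}: OK ({len(finder.description)} chars)")
    else:
        print(f"{url}: MISSING meta description")
```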
-
RE: How do I get Google Places to pick up my reviews from other sites?
The Google bots automatically crawl sites such as Yelp, Insiderpages, and Citysearch, so in time they will manage to reach your profile on those sites. Once they do reach the profile, they will usually automatically scan the reviews there, and add them to your Google Places account's reviews.
Basically it's all automatically done. You just have to sit back and wait. It can take up to 3 months from personal experience, and even potentially longer. If nothing else you could use the time to find a nice bar on Yelp, and go hang out there until the Google Bots decide to get their gears in motion.
Also, make sure that the information listed on your Yelp/Insider/Citysearch profiles is pretty much the same as the information on your Google Places listing. Things like the business name and address should match, so that Google can be sure the profile it is looking at on Yelp belongs to the same company it has in its own Google Places database.
-
RE: Why are we not seeing similar traffic patterns in a new market?
First, are you being outranked by competitors? Try doing some long-tail searches similar to the ones that work for areas like Atlanta.
If you seem to be ranking for those searches just as well as you normally would, it could be that areas like Nashville simply have fewer heavy internet users. Keep in mind that a good ranking is likely to bring more traffic in NYC than in the boondocks of Oregon... simply because of the number of people in the area searching for the item in question.
-
RE: Best approach to launch a new site with new urls - same domain
Let me make sure I have this straight... you're not going to be directing the new site format to a subdomain permanently, right? You were only using the subdomain for beta purposes?
The way I see it, when I go to Sierra Trading Post's site now I can make out what look like 2 different architecture structures. You have one link on the page pointing to Men's clothing which resolves to a single defined .htm file. Then you have "Men's Classics" (still general men's clothing?) which points to a directory, which I'm guessing is your new site. Correct me if I'm wrong on this, or if I'm right but have the old vs. new reversed.
If that is the case, your best bet to minimize any ranking impact would be to 301 redirect pages from the old catalog architecture to the new one. That way you could remove the old site files completely and let the server take care of the redirection.
If you need to leave the old site up for throttling purposes like you said, you could use canonicalization tags to point the old pages to the new ones. That, along with 301 redirects where possible, would help the search engines understand what you're doing.
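To make the 301 part concrete, here's a rough Python sketch that turns a simple old-to-new URL mapping into Apache "Redirect 301" lines you could drop into an .htaccess file. The paths and domain are invented for illustration, so adjust them to your actual setup and server.
```python
# Hypothetical sketch: generate Apache "Redirect 301" rules from a mapping of
# old catalog pages to their new-architecture equivalents. Paths are invented.
URL_MAP = {
    "/mens-clothing.htm": "/mens/clothing/",
    "/mens-classics.htm": "/mens/classics/",
}

def htaccess_rules(mapping, new_domain="http://www.example.com"):
    lines = []
    for old_path, new_path in mapping.items():
        # mod_alias syntax: Redirect 301 <old-path> <absolute-new-url>
        lines.append(f"Redirect 301 {old_path} {new_domain}{new_path}")
    return "\n".join(lines)

if __name__ == "__main__":
    print(htaccess_rules(URL_MAP))
```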
I'm sorry if I didn't answer your question as you needed. I'm still not sure if I understood your issue as intended. =P
-
RE: Best SEO structure for blog
Well, the problems you're trying to overcome are the exact reasons why a good CMS blog system that pulls and stores posts from a database is so effective.
Doing things your way, all static HTML/CSS with no database, it would definitely make sense to only list the most recent posts on any given page/category, and then come up with an archive system for the rest.
You should also have a search feature on your site so people can easily pull up older, buried posts. I don't personally have any experience with it, but you could try Google's Custom Search Engine to see if it accomplishes what you need.
As far as the hierarchy of the domain levels goes, I would never go deeper than 4 levels with your categories/posts. You can almost never have too flat a hierarchy... look at SEOMoz's structure, for example. They are a massive blog with a large number of posts, and yet almost all of the posts live at a URL structure of seomoz.org/blog/post-name. In theory that creates a very broad and flat structure, yet they don't seem to have many, if any, indexing issues.
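If you do go the static route, here's a rough Python sketch of one way to keep that kind of flat structure: it turns post titles into /blog/post-name slugs and writes a bare-bones archive index listing every post. The titles and filenames are made up for the example.
```python
# Rough sketch: build flat /blog/post-name URLs from post titles and write a
# plain-HTML archive index for a static (no-database) blog. Titles are made up.
import re

POSTS = [
    "10 Ways to Structure a Static Blog",
    "Why Flat URL Hierarchies Crawl Well",
]

def slugify(title):
    # Lowercase, strip non-alphanumerics, join words with hyphens.
    words = re.findall(r"[a-z0-9]+", title.lower())
    return "-".join(words)

def archive_index(posts):
    items = [f'  <li><a href="/blog/{slugify(t)}">{t}</a></li>' for t in posts]
    return "<ul>\n" + "\n".join(items) + "\n</ul>"

if __name__ == "__main__":
    for title in POSTS:
        print(f"/blog/{slugify(title)}")
    with open("archive.html", "w") as f:
        f.write(archive_index(POSTS))
```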
-
RE: Why are we not seeing similar traffic patterns in a new market?
All things being equal, the age of a domain accounts for a very tiny amount of ranking weight, if any at all. It's not a reason I would ever give for why a website is or is not ranking for any given set of terms.
-
RE: Best approach to launch a new site with new urls - same domain
First of all - I love the new design. It looks great!
The absolute best way to go about it, in my opinion, would be to simply have the new site ready and then launch it fully under the base domain (no subdomain), while 301 redirecting important old pages on the site to their related new versions. That way the search engine has the easiest time discovering and indexing the new site, while proper 301s make sure you don't lose anything in the transition.
I can't say it would provide a massive benefit to set up a way for the search engines to start crawling the new site now, as you're just going to be moving all of those URLs off of the subdomain in the near future anyway - at which point they'll need to be recrawled on the parent domain as if they were brand new.
If the traffic diverter you have set up automatically 301s requests for old site pages to their new beta URLs, then you might as well let those new versions be indexed for the time being. Just make sure that when you transfer the beta site to the parent domain, you 301 the old beta URLs to their new permanent home.
Like every single person in this industry, I'm constantly working to improve my SEO & inbound marketing abilities. I have a background in doing traditional (boring!) marketing work.