Quite correct. I unfortunately assumed that everyone already knows this bit of information.
Posts made by FrankWickers
-
RE: Redirecting multiple websites to a single website
You should be able to mass redirect all of the sites at once and be fine. I've done this for a client before, though in that case we redirected only 11 domains at a time. I can't fully vouch that it will be the same with 30 domains... hopefully someone else can lend their experience with a higher number like that.
Something else you really want to take into consideration here is whether you want a domain-wide redirect or a page-based redirect. One entails far more work than the other, but depending on the circumstances it might also offer greater benefit.
With a site-wide redirect, you will of course be funneling everything from the old site onto whatever target page you choose on the main site (likely the home page, I'd guess). When Google notices that a domain has a site-wide 301 on it, that domain will quickly start to fall out of the index, and the link power it had can also degrade surprisingly quickly. Pretty straightforward stuff overall.
If you were to use a page-by-page redirect method, you could custom-tailor which new pages you'd like the old link juice to flow to. Say one affiliate site has a hub page for custom truck grilles with a large number of links coming into it. You could 301 that affiliate page specifically to its "sister" page on the main site, which would boost the target page by a greater amount than a blanket site-wide redirect would.
One more situation to take into account... if the content across your affiliate sites and main site is all similar, you could use technique #2 with rel=canonical tags rather than 301 redirects.
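For the page-by-page route, the redirect rules can be generated rather than hand-written. Here's a minimal Python sketch that emits Apache-style `Redirect 301` directives you could paste into an .htaccess file — the old-path/new-URL pairs and domain name below are made up for illustration:

```python
# Hypothetical mapping of old affiliate-site paths to their "sister" pages
# on the main site. Substitute your real old -> new pairs.
redirect_map = {
    "/custom-truck-grilles": "http://www.main-site.example/truck-grilles",
    "/contact": "http://www.main-site.example/contact",
}

def build_htaccess_rules(mapping):
    """Emit one Apache 'Redirect 301' directive per old path."""
    return [f"Redirect 301 {old} {new}" for old, new in sorted(mapping.items())]

for rule in build_htaccess_rules(redirect_map):
    print(rule)
```

With 30 domains, keeping the mapping in one place like this also makes it much easier to audit which old pages point where.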
-
RE: Front page dropped to PR1 - thoughts?
Another idea that you need to examine... the company hasn't bought any links, or become involved with what might be considered 'Black Hat' SEO, has it?
I say this only because if any Black Hat SEO occurred and Google caught on to it, they might have intentionally lowered your toolbar PageRank to send you a message before taking any sort of direct action.
-
RE: Need solid, no jargon definitions
Linking root domains refer to unique domains linking to you; a link from any page on a given domain counts toward that one root domain.
I, personally, would be somewhat careful about the foreign-sites thing, especially sites in different languages. My view on it is that 1) you might begin to look manipulative with your linking tactics to the search engines, and 2) you might unintentionally start to pull off a bit of "international SEO", which could have consequences that, while not necessarily bad, wouldn't be the best for the issues at hand.
That's my own view on the subject, though, so certainly don't take it as Gospel.
-
RE: Linkscape update tracking
Hi Vinny,
In general Linkscape updates every 30 or so days. So if you gain a new link or three partway through Linkscape's cycle, it's very possible it will not be included in the most recent update. Also, as awesome as it would be if Linkscape's index were as large as Google's, it isn't. I'd guess this is mostly due to technology constraints, and to not having dozens of enormous data centers like Google does. Because of this there is a chance that some links will not be picked up (especially links on very new sites, or rather spammy ones). So it turns into a waiting game to see whether the links you know exist are picked up on the next Linkscape cycle.
Finally, unless I've managed to totally miss it this whole time, the index doesn't keep any sort of history. The amount of data storage required to maintain a history, given how often they currently update the index, would be crazy. To name-drop one of SEOmoz's competitors (sorry guys XD), majesticseo.com does offer historic indexing services. I don't know how in-depth they are when you're fully subscribed, as I've never been a paying member of their services. It may be something worth checking out.
And if someone comes into this question and corrects me about SEOMoz having historic index access, definitely do that instead! I know I will =P.
-
RE: Need solid, no jargon definitions
As Craig stated, it's a good idea to read the Beginner's Guide if you haven't already.
However, if you're just trying to make sense of the metrics you put up I'll see if I can help.
Total Links - This is the total number of links coming into a given page or domain (depending on which view you've selected: page-based or domain-based).
Ext. Followed Links - The number of links pointing to a page that are not from your own domain (i.e., not internal links) and do not have the rel="nofollow" attribute on them.
Linking Root Domains - Really it's as simple as it says. It's the number of the individual, unique root domains that are linking to a site. 2 different root domains would be something like site1awesome.com and thisisawebsite.com.
Followed Linking Root Domains - Same principle as "Linking Root Domains" above, but this one excludes links carrying the rel="nofollow" attribute. It only counts root domains that link to you with at least one followed link.
Linking C-Blocks - Sort of difficult to explain. I'd recommend reading the following thread on the Search Engine Watch forums, paying particular attention to the first reply. It should hopefully sum up C-blocks for you.
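To make the definitions above concrete, here's a small Python sketch that computes each metric from a made-up backlink sample (the URLs and IP addresses are invented, and Linkscape's real pipeline is obviously far more involved — a C-block is just the first three octets of the linking server's IPv4 address):

```python
from urllib.parse import urlparse

# Invented backlink sample: (source URL, source server IPv4, has rel="nofollow")
links = [
    ("http://site1awesome.com/page-a", "203.0.113.5", False),
    ("http://site1awesome.com/page-b", "203.0.113.5", True),
    ("http://thisisawebsite.com/post", "198.51.100.9", False),
]

total_links = len(links)  # every incoming link, followed or not
followed = [(url, ip) for url, ip, nofollow in links if not nofollow]
ext_followed_links = len(followed)  # this sample assumes all links are external

linking_root_domains = {urlparse(url).hostname for url, _, _ in links}
followed_root_domains = {urlparse(url).hostname for url, _ in followed}
# C-block: first three octets of the linking server's IPv4 address.
linking_c_blocks = {ip.rsplit(".", 1)[0] for _, ip, _ in links}

print(total_links, ext_followed_links, len(linking_root_domains),
      len(followed_root_domains), len(linking_c_blocks))
```

Note how the two links from site1awesome.com count as 2 total links but only 1 linking root domain — that's the whole point of the root-domain metrics.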
Hopefully I've helped you understand the metrics better than before reading all of my chicken scratch.
-
RE: Link Acquisition Assistant
That would go hand in hand with what is discussed in the 2 videos I just linked in the above answer (which I edited in afterwards, so you may have missed them before). Definitely I'd recommend watching both of the videos. I believe they'll help answer a good number of your questions.
-
RE: Link Acquisition Assistant
Hi Peter,
The answer would be both yes, and no.
If the directory is ranking high on the Google SERP then it stands to reason that the page and domain in question likely have good ranking metrics and signals. If that is the case then there is a good chance it could be a strong link to help your site.
However, here is where the "no" part of the answer comes into play. It depends where the link will actually be placed on the site. If the link is not going to be on the exact page you land on from the SERP, then you aren't getting the whole story. You would have to do some digging to fully understand which page the link will be placed on, and what that page's metrics are.
Depending on where the link would end up, the #6 result on the SERP could actually provide a stronger link for you than the #1 result. Other things to take into consideration of course are topic relation between your page and the linking page, and the amount of total links on any given linking page.
I'd suggest watching the following recent Whiteboard Friday videos, if you haven't already. They can help you better evaluate the potential link value from individual pages and/or sites.
1) http://www.seomoz.org/blog/which-link-metrics-should-i-use-part-1-of-2-whiteboard-friday
2) http://www.seomoz.org/blog/which-link-metrics-should-i-use-part-2-of-2-whiteboard-friday
-
RE: Should I try to optimize for SEO a site that lives only for 5 days
If the site is only going to be live for 5 days, I wouldn't put much effort into heavy optimization. I would only do basic on-page stuff that requires all of 5 minutes of your time - title tags and such.
Reason being there is a high chance the Google bots won't even come by and index the site within the 5 day period.
-
RE: Viewing Specific Links in OSE - For a Specific Month
I actually don't know if OSE offers that option (if it does, I've been missing it this entire time), but I've used majesticseo.com for a while now to do that sort of tracking/research. Their tool gives a pretty accurate report of link discovery, including how many links were discovered in a given month.
-
RE: How do you rank in the "brands for:" section in Google's search results ?
First you need Google to see you as a brand. Then you need to get Google as familiar as possible with your brand, what it does, and what it offers.
If you are an SEOMoz Pro member I strongly recommend watching this webinar on becoming a brand in Google's eyes. If you're not an SEOMoz Pro member, I recommend you become one, and then watch the webinar I've linked.
-
RE: Why are we not seeing similar traffic patterns in a new market?
All things being equal, the age of a domain accounts for a very tiny amount of ranking weight, if any at all. It's not really ever a reason I would give for why a website is or is not ranking for any given set of terms.
-
RE: Why are we not seeing similar traffic patterns in a new market?
First, are you being outranked by competitors? Try doing some long-tail searches similar to what you get for areas like Atlanta.
If you seem to be ranking for those searches just as well as you normally would, it could be that areas like Nashville have fewer heavy internet users. Keep in mind that a well-ranking result based in NYC is likely to see more traffic than one in the boondocks of Oregon... simply because of the number of people in the area searching for the item in question.
-
RE: SEO on a mature site - diminishing returns?
Research has shown that holding a top ranking for both the organic and paid versions of a search term results in a higher click-through rate. The lift is usually anywhere between 1% and 20%. Nothing dramatic, but it helps.
People are more likely to click on an organic result over a paid one, though; searchers are generally slightly distrustful of paid ad results. So I'd say your problem may lie with points 1 and 2 of my advice.
Check your AdWords account to be sure of the search terms your ads are displaying for. Are they displaying for very broad searches rather than the exact keywords you specified? Are they being clicked on and converting for those broader searches? If so, you might want to change up the exact keywords you use for both AdWords and your organic SEO.
-
RE: SEO on a mature site - diminishing returns?
Are there diminishing returns for optimizing a site? No. At least not as far as being penalized for organically achieving new rankings.
Now then, you say you're getting good high rankings for various search terms, but you're not seeing increased traffic. Let's figure out why that could be.
First of all, when you're checking your rankings, make sure you're not receiving personalized results from having clicked on your own website too many times. Make sure you are logged out of any Google accounts you may have, and append &pws=0 to the end of the search URL.
If personalization is not affecting you, and you are achieving those ranks, then you likely have some other problems. There are 2 things most likely affecting you.
1) Are you using the right keywords? If you're a company offering a 'credit card fraud investigation service', you don't really want to be ranking for terms like 'credit card fraud'. That term's intent isn't specifically related to what you offer. Someone searching for credit card fraud is probably looking for information about, you guessed it, credit card fraud! They're probably not looking for people who investigate credit card fraud.
2) Your search description snippets might not be working well for you. Make sure you have meta descriptions declared on every page that is ranking, or is of importance, clearly describing the content on that page. If you don't have meta descriptions, when someone sees your site in a search result they could be seeing a snippet that looks something like "We have been in service since...... keyword is what we do if..... we don't think that way" - something that makes no sense in relation to what they searched for.
If you declare targeted meta descriptions, you're likely to get a better click-through rate, and thus more traffic.
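The depersonalization trick from the first point can be scripted so you don't forget it. A quick Python sketch that appends the pws=0 parameter to a Google search URL (the query here is just an example):

```python
from urllib.parse import urlencode, urlparse, parse_qsl, urlunparse

def depersonalize(search_url):
    """Return the search URL with pws=0 added to suppress personalized results."""
    parts = urlparse(search_url)
    query = dict(parse_qsl(parts.query))
    query["pws"] = "0"
    return urlunparse(parts._replace(query=urlencode(query)))

print(depersonalize("http://www.google.com/search?q=credit+card+fraud+investigation"))
```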
-
RE: How do I get Google Places to pick up my reviews from other sites?
The Google bots automatically crawl sites such as Yelp, Insiderpages, and Citysearch, so in time they will manage to reach your profile on those sites. Once they do reach the profile, they will usually automatically scan the reviews there, and add them to your Google Places account's reviews.
Basically it's all automatically done. You just have to sit back and wait. It can take up to 3 months from personal experience, and even potentially longer. If nothing else you could use the time to find a nice bar on Yelp, and go hang out there until the Google Bots decide to get their gears in motion.
Also, make sure the information listed on your Yelp/Insiderpages/Citysearch profiles pretty much matches the information on your Google Places listing. Things like the business title/name and address should be the same, so that Google can be sure the profile it is looking at on Yelp belongs to the same company it has in its own Google Places database.
-
RE: Interlink suggestions while writing?
There are many different plug-ins the WordPress community has developed to do "related post searching" for you.
I'd run a search through the WordPress plugin directory to see if you can find anything that matches your needs before running off and developing your own PHP to do the same thing. One plugin I know of is Yet Another Related Posts Plugin (YARPP). See if that happens to fit your needs.
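If nothing fits and you do end up rolling your own, the core idea is simple tag/keyword overlap. A rough Python sketch (the post titles and tags are made up, and a real related-posts plugin scores on more signals than this):

```python
# Existing posts mapped to their tag sets -- all invented for illustration.
posts = {
    "Choosing custom truck grilles": {"trucks", "grilles", "accessories"},
    "Winter tire buying guide": {"tires", "winter", "trucks"},
    "Best floor mats of the year": {"mats", "accessories"},
}

def related_posts(draft_tags, posts, limit=2):
    """Return titles of posts sharing the most tags with the draft, best first."""
    scored = [(len(draft_tags & tags), title) for title, tags in posts.items()]
    scored = [(score, title) for score, title in scored if score > 0]
    scored.sort(reverse=True)
    return [title for _, title in scored[:limit]]

print(related_posts({"trucks", "accessories"}, posts))
```

You'd run this against the draft you're writing and surface the top matches as interlink suggestions.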
-
RE: Best approach to launch a new site with new urls - same domain
Ah ok. I understand now. I wasn't picking up on what you were saying before.
If with the soft launch you are already putting the "new" version of the site on its intended final URLs, then yes, you can let the engines start crawling those URLs. For each new URL you let the search engines crawl, make sure to 301 its corresponding old URL (the old site) to the new version to minimize any duplicate content issues.
If for whatever reason you can't 301 the old URLs yet (e.g., if you still need instant access to reroute traffic back to them), you could use rel=canonical on the old pages pointing to their new counterparts - but only if the main content on each pair of pages is almost exactly the same. You don't want Google to think you're manipulating them with rel=canonical.
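That decision - 301 when you can, canonical only when the content closely matches - boils down to a few lines. A toy Python sketch, with hypothetical URLs and the content comparison reduced to a boolean:

```python
def migration_hint(old_url, new_url, can_redirect, content_matches):
    """Suggest how to hand off an old URL to its new version."""
    if can_redirect:
        return f"301: {old_url} -> {new_url}"
    if content_matches:
        # Tag to place in the <head> of the old page.
        return f'<link rel="canonical" href="{new_url}" />  <!-- on {old_url} -->'
    return "leave as-is: canonical would look manipulative without matching content"

print(migration_hint("/old/mens", "http://example.com/mens",
                     can_redirect=False, content_matches=True))
```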
-
RE: Best approach to launch a new site with new urls - same domain
As I'd said, there really isn't a reason to let them get a head start. The URLs will be changing when you transition the new site out of the subdomain (e.g. beta.sierratradingpost.com/mens vs. sierratradingpost.com/mens - those are considered 2 completely different URLs), and the engines will have to recrawl all of the new pages at that point anyway.
-
RE: Best approach to launch a new site with new urls - same domain
And when you drop the sub domain you definitely want to 301 all of the old site structure's URLs to their corresponding new page's URLs. That way nothing gets lost in the transition.
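Since the beta URLs keep the same paths on the parent domain (per the beta.sierratradingpost.com/mens vs. sierratradingpost.com/mens example above), that 301 map can be generated mechanically rather than by hand. A minimal Python sketch, assuming a straight host swap:

```python
from urllib.parse import urlparse, urlunparse

def beta_to_parent(url, beta_host="beta.sierratradingpost.com",
                   parent_host="sierratradingpost.com"):
    """Map a beta-subdomain URL to its parent-domain equivalent for a 301 rule."""
    parts = urlparse(url)
    if parts.hostname != beta_host:
        raise ValueError("not a beta URL")
    return urlunparse(parts._replace(netloc=parent_host))

old = "http://beta.sierratradingpost.com/mens"
print(f"Redirect 301 {urlparse(old).path} {beta_to_parent(old)}")
```

Feeding your crawl or sitemap URL list through something like this gives you the full redirect map in one pass, so nothing gets missed in the transition.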
-
RE: Best approach to launch a new site with new urls - same domain
Yeah, just refer to our conversation above as I think it will pertain better to your situation.
-
RE: Best approach to launch a new site with new urls - same domain
The only issue to keep in mind is that Google/Bing define pages on the internet by their URLs, not their content. The content only describes the pages.
So if you let the engines pre-crawl the pages before dropping the subdomain - simply to give them a "sneak peek" - you won't really be doing yourself much of a favor, as the engines will just recrawl the content on the non-subdomain URLs as if it were brand new anyway.
The reason to pre-crawl would be if you're already building backlinks to the new beta pages. Then it could make sense to let the engines index those pages and 301 them to their new non-subdomain versions later. In my opinion the benefit from this route would outweigh any potential duplicate content issues.
-
RE: Am I missing something?
Good catch. Or you could try adding &pws=0 to the end of the URL string for your search, and that should eliminate the personalized results from being delivered to you.
-
RE: Am I missing something?
First, you might want to post the non-shortened versions of the URLs. Many people do not trust shortened links from sources they don't know well enough.
Your competitor may be buying links en masse with customized anchor text. I see it daily with the competitors of my clients' sites. If you aren't in an overly competitive keyword space, a few thousand bought, anchor-text-optimized, low-quality links can easily be enough to get someone ranking for terms even when the rest of their SEO is abysmal.
-
RE: Best approach to launch a new site with new urls - same domain
What YesBaby is talking about is something like Google's Website Optimizer. When someone goes to sierratradingpost.com/mens-stuff, for example, it will serve 50% of visitors the old version of that page and the other 50% the new version. It eliminates any duplicate content issues, since the 2 page variations are still attached to the same exact URL.
Definitely a viable option if it fits with your game plan of how you want to do things.
-
RE: Best approach to launch a new site with new urls - same domain
First of all - I love the new design. It looks great!
The absolute best way to go about it, in my opinion, would be to simply have the new site ready and launch it fully under the base domain (no subdomain), while 301 redirecting important old pages to their related new versions. That way the search engines will have the easiest time discovering and indexing the new site, and proper 301'ing ensures you don't lose anything in the transition.
I can't say it would provide a massive benefit to let the search engines start crawling the new site now, as you're just going to be moving all of those URLs off the subdomain in the near future anyway - where they will then need to be recrawled on the parent domain as if they were brand new.
If the traffic diverter you have set up automatically 301s requests for old site pages to their new beta URL versions, then you might as well let those new versions be indexed for the time being. Just make sure that when you transfer the beta site to the parent domain, you 301 the old beta URLs to their new permanent home.
-
RE: Best approach to launch a new site with new urls - same domain
Let me make sure I have this straight... you're not going to be directing the new site format to a subdomain permanently, right? You were only using the sub domain for beta purposes?
The way I see it, when I go to Sierra Trading Post's site now, I can make out what look like 2 different architecture structures. You have one link on the page pointing to men's clothing which resolves to a single defined .htm file. Then there's "Men's Classics" (still general men's clothing?) which points to a directory that I'm guessing is your new site. Correct me if I'm wrong on this, or if I'm right but have the old vs. new reversed.
If that is the case, your best bet to minimize any ranking impact would be to 301 redirect pages from the old catalog architecture to the new. That way you could remove the old site files completely and let the server take care of the redirection.
If you need to leave the old site up for throttling purposes like you said, you could use rel=canonical tags to point the old pages at the new ones. That, along with 301 redirects where possible, would help the search engines understand what you're doing.
I'm sorry if I didn't answer your question as you needed. I'm still not sure if I understood your issue as intended. =P
-
RE: What are the best paid directories today?
I understand. In that case as I stated in my edit I don't think most of the "well known" directories of that sort have a whole lot of their link value passed through by Google. I would say the money could definitely be invested in another tactic that will provide a better ROI, but that is of course an opinion which could be incorrect.
-
RE: Best SEO structure for blog
Well, the problems you're trying to overcome are the exact reasons why a good CMS blog system pulling and storing posts from a database is extremely effective.
Doing things your way - all static HTML/CSS with no database - it would definitely make sense to list only the most recent posts on any given page/category, and then come up with an archive system for the rest.
You should have a search feature you can put on your site so as to let people easily pull up older buried posts. I don't personally have any experience with it, but you could try Google's Custom Search Engine to see if it could accomplish what you need.
As far as the hierarchy of the domain levels goes, I would never go deeper than 4 levels with your categories/posts. You can almost never have too flat a hierarchy... look at SEOmoz's structure, for example. They are a massive blog with a large number of posts, and yet almost all of the posts trace to a URL structure of seomoz.org/blog/post-name. In theory that creates a very broad, flat structure, yet they don't seem to have many, if any, indexing issues.
-
RE: Did Google's Farmer Update Positively/Negatively Affect Your Search Traffic?
On many of our client sites we've actually seen positive impacts from the Farmer Update.
Previously, some of our clients were being beat out for the top few positions by some content farms for various long tail search terms.
After the Farmer Update we've seen those content farm results drop off into oblivion, and our client sites have stepped right up into their positions. Our clients have gained dozens of new 1st, 2nd, and 3rd position SERP results on long-tail keywords.
So in summary - none of the websites we manage/SEO have been hurt by the Farmer Update. We have, instead, been rewarded by it.
-
RE: What are the best paid directories today?
Hi David,
First of all you need to keep in mind that paid directories may be able to take your money, but they can never take YOUR FREEEDOOMMMMM!!!!!
Now that I've got that out of the way (and hopefully delivered a mild chuckle at least) on to the business at hand.
There is a fundamental problem with "paid" directories: they have 2 options when it comes to the links in their directory listings. They either need to nofollow the links so that Google doesn't penalize them as a link-selling service, or they need to hide the fact that they are indeed basically just a link-selling service... otherwise Google will likely penalize them.
You need to look at a paid directory from Google's point of view. If the directory openly advertises that you can pay for inclusion, and thus get a link from their site for the money, then Google has just about no choice but to flag the directory as a link seller.
If the directory already links to your site for free as a service to its users, Google can consider it a legitimate service and not penalize it. Many directories do this and then offer premium services that businesses can pay for without incurring any Google penalty (Yelp and even the BBB being great examples). In this case, though, you're not really receiving any extra "link juice" for the service you're buying. You're usually just receiving extra visibility or an endorsement of some sort.
So in short, if you're trying to find a paid directory for the purposes of acquiring links I would strongly suggest you rethink your strategy. If you are, however, simply looking to "buy links" then disregard this advice.
EDIT - Something I realized I need to add. You stated you recognize that BotW rarely shows up in backlink reports. I don't think Google and Bing pass much link juice through sites such as BotW and others (JoeAnt and so on). The search engines recognize that these sites offer paid review services, and while it's certainly not the same as link buying - there are editors supposedly quality-checking the URLs submitted - I would guess the search engines still do not pass full link value on to websites.
I could be very wrong of course - I've just never seen any noticeable impact from going through directories such as those.