Glad to help. Curious to learn how it unfolds over time too though.
Posts made by AlanBleiweiss
-
RE: We are changing ?page= dynamic URLs to /page/ static URLs. Will this hurt the progress we have made with the pages using dynamic addresses?
Not sure what you mean about hurting the progress you've made so far. If you implement 301 Redirects, if your new method is based on SEO best practices (in other words, not going TOO far with keyword stuffing), and you've got sound content organization, there should be no problem whatsoever with the switch.
-
RE: Non-www home page indexed, but www for rest of site
Only thing I can figure is it's a lag in their update due to the switch in March. You're doing everything you need to / can / should for this type of situation, and I see you even have the rel=canonical set.
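For anyone else reading along, the rel=canonical hint mentioned here is just a single tag in the page head; the domain below is a placeholder, not the poster's actual site:

```html
<!-- placed in the <head> of every page; example.com is a placeholder domain -->
<link rel="canonical" href="http://www.example.com/" />
```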
-
RE: Single domain or Multiple Keyword Domains
While consolidation is good, too many 301 Redirects in too short a period of time can be a hassle. I'd suggest building out other content initially, and letting the floor mats site build up some value over several months before considering a migration of that. Jumping too soon on a recently redirected site could cause too much loss of original site value as it passes through two hops.
-
RE: Matt Cutts on Article Marketing
True. And it IS fun watching the 90% out there with their fishing gear, their custom fishing trawlers, all set to catch fish, then throwing nets into the water where the holes in those nets are bigger than almost all the fish in the ocean.
-
RE: Matt Cutts on Article Marketing
The only value article marketing ever really had from a long-term perspective was to boost article marketing sites. They're the only ones who could ever possibly win. Except even Panda has slapped most of them now as well.
It's a quality win all around.
Of course there are those that would argue against such thinking. Which is fine by me. Let them continue living in the illusion. The rest of us will continue to drive quality and relevance.
-
RE: Google counting numbers of products on category pages - what about pagination ?
In addition to increasing the number of products as EGOL suggests, I'd also include the number of products directly in the Meta Description field for the primary landing page. Pagination is important for user experience. Using Javascript could cause problems with search engines properly discovering all the products.
-
RE: Single domain or Multiple Keyword Domains
I agree with wildner-akademie - become THE authority site on all things related to truck parts. With proper content organization (sub-folders), the long-term value will be worth the effort.
One great example of how this works so well - Real Estate - for nearly every city in the country, Trulia and Zillow consistently come up in the top five results. It's all in sub-folders. And neither one has "real estate" in the domain name.
-
RE: I want to create a new web site, I have to choose between:....
A very common myth exists related to the age of the domain. If the new site is going to have different content, the age of the domain is not going to be helpful in the long run. If you have any content on the old domain that you can use on the new site, you can make use of that value. How much value you get and then keep over time depends on how much of the content you can keep, and how many of the links pointing to that content you can retain.
The further away from the older domain's topical focus you go, the less value it has as well.
-
RE: Impact of removing category sidebar with keywords?
There's no one right answer here. So many factors to consider.
1. Real World Testing
For example, have you ever done heat maps or click tracking with a solution like CrazyEgg? I ask because it could turn out that the "recent free reports" box gets too few clicks to warrant its being kept compared to the "Browse Topics" links.
2. Existing On Page Optimization
From a different perspective, you've only got "fair" optimization on the individual topic landing pages. For example, when I go to the link for the Debt Collection topic, it sends me to the URL http://www.insidearm.com/browse-topics/?topicid=2515
When I look at this page I see the following flaws in your page level SEO:
You've got a weak page Title
No URL optimization
The overall keyword saturation across the entire page (including the header area, main content area, sidebar and footer areas) causes significant dilution.
3. Section Level Sub-Navigation
When I'm in this section, the sub-category links do show up within the sidebar Topic link area for this section, yet the whole Topic nav box is way below the fold, which already causes a user experience problem. Taking the further action of removing that box entirely would cause even more topical and user confusion.
4. Crawlability
At the code level, the page I reviewed has 385 links. (65 of these point purely to archives, and they exist only at the code level, not seen by visitors, yet because they're there, search indexing crawlers do see them.) With so many links on a page, it's quite possible that search engines are not crawling your site as efficiently as they could be, which itself could be further harming your site's overall SEO in a way that needs to be factored in when deciding which links on a page should or shouldn't be removed.
5. Perceived Duplicate Content Factor
All those links in the sidebar and footer - for the most part they're the same across the entire site, correct? If so, that's causing not only a topical dilution issue but also contributing to weakness in perceived unique page level content.
6. Sectional Navigation
SEO best practices DO call for a section level sub-navigation system up top or on the sidebar, though it should at least mostly be limited to links within that section.
As you can see, there are many factors to consider before just randomly ripping out one thing because you want the visual space for something you or someone else in the organization perceives to be important. It's not something to take lightly, because of that very impact you express concern over.
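As an aside on point 4, you don't have to count a page's links by hand - a short script can tally them from the source. This is just a rough sketch using Python's standard-library HTML parser; it counts `<a>` tags that carry an href, which approximates what an indexing crawler sees:

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Counts <a> tags that carry an href attribute."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs; only real hyperlinks count
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

def count_links(html):
    parser = LinkCounter()
    parser.feed(html)
    return parser.count
```

Feed it the raw page source and compare the total against what's visible to visitors; the gap is your code-level-only link load.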
-
RE: Registering a domain for multiple years
Quite often. At least once a year I get a call or email from someone saying "where'd my site go?" and it turns out the domain expired, they didn't see or get the email notifications, and some scraper stole the domain. 80% or more of those turn out to be CRS
-
RE: Registering a domain for multiple years
Seriously - what EGOL said. I've never once seen a case proven where length of registration helped or harmed a site. It's a false flag factor.
-
RE: Google, Links and Javascript
It's definitely a judgment call. There is legitimate reasoning to have a link to the widget author/creator if it's a single link. And providing the option to not have the link be there helps. Yet it's not 100% clear that Google does or does not penalize for this.
-
RE: Google, Links and Javascript
The data you see where AddThis is at position #19 is not a signal that they're being ranked based on all those links, only that they have that many links. It's heavily debatable whether Google discounts all those links, though it is confirmation that Google can see them.
Now as to whether it's wise to do so or not - that too is debatable. I've personally had a brief discussion with Matt Cutts a while back asking about a specific type of link pattern that had to do with people offering "free services" that ultimately embedded "hidden" or otherwise "questionable" links on pages of sites, fairly similar to the scenario you describe. At the time, he said that those particular links were not a significant factor in the ranking of the sites where those links were pointed to.
Here's the bottom line I take on all of this - creating a service, plug-in or component that embeds links where the purpose of those links is purely SEO is risky at best, and potentially harmful in the long run. It may be of some SEO value today, yet it's a primary candidate for being hammered eventually. So personally I always advise clients to not even play that game.
-
RE: Is "last modified" time in XML Sitemaps important?
Glad to be of help, Sha.
-
RE: Is "last modified" time in XML Sitemaps important?
Sitemap.xml files are one of many "hints" search engines use to evaluate, classify and otherwise associate relevance, importance and freshness of individual pages, and in turn, an entire site.
When the entire file flags every page with the same date/time it can have a negative impact, purely from the single-point signal perspective. If the actual pages themselves have different date/time stamps at the HTML code level, those would counter the sitemap.xml file reporting, and either resolve it or just cause confusion.
Any time search engines have a potential conflict that needs to be resolved, the potential for less than maximum value exists.
Because of these combined potential problems, SEO best practices dictate that this issue be resolved, so as to ensure it does not, in fact, lead to problems, however minor they might be on a per-page basis. If resolving the issue would take an extensive amount of time, weigh how important the issue is to overall SEO. At a certain point, you cross into the realm of diminishing returns.
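To illustrate, the goal is for each entry in sitemap.xml to carry its own honest lastmod value rather than one shared date stamped across the whole file. The URLs and dates below are placeholders:

```xml
<url>
  <loc>http://www.example.com/services/</loc>
  <lastmod>2011-05-10</lastmod>
</url>
<url>
  <loc>http://www.example.com/contact/</loc>
  <lastmod>2011-02-14</lastmod>
</url>
```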
-
RE: Is z-indexing a black-hat trick?
You're good then. It's the negative z index implementations that have issues.
-
RE: Ecommerce SEO with 130 keywords
Just my experience here - unless you've got an already existing, highly ranked site with massive amounts of content, your best bet is to limit yourself to assigning two or at most three highly related keywords to any single page. You can then include two or at most three secondary phrases on that page, mentioned only in passing - this all goes to support the primary phrases.
And you'll only be successful even with that work if all other optimization factors are implemented properly.
Another concept - don't worry about long tail phrases - if you write quality content unique to each page well enough, long tail phrases will take care of themselves. But at the same time, don't shoot initially for 1st page Google for the most competitive phrases in your market if your competitors have hundreds or thousands of well established pages. Find the balance.
-
RE: Is z-indexing a black-hat trick?
You say it's for a floating bar that scrolls vertically. Is it hidden from a site visitor or visible? If it's a negative z-index, it could be "perceived" as a problem by Google's automated system. They claim that your intent counts, except I've never seen Google guarantee that they'll assign a human to review every single site that is flagged with potential problems. This is why you "should probably be okay if you're not using it to fool Google", yet they consistently communicate "we have a right to do what we want to do" - so when in doubt, weigh the risk/reward to make your decision.
-
RE: Shall Google index a search result?
Georg,
What I communicate as "best practices" to clients is to noindex/follow all on-site search. However, you can look at your analytics to determine whether you're getting enough value out of the current system or whether wiping that out will help with long-term refinement of your content. The concept behind not having those indexed is that fewer indexed pages would each be given much more strength and weight long-term. Unfortunately there's no exact method to determine if this will be the case in your unique situation.
Given that your search results provide depth of content (not just a bunch of links, but actual, relevant text linking to detail pages), I'd be curious to see if any of your site, or specific phrases you were previously found for that led to those search pages, were hit by Panda.
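For reference, the noindex/follow treatment is a single tag in the head of each search results template - this is the standard robots meta tag, nothing site-specific:

```html
<!-- tells engines: don't index this page, but do follow its links -->
<meta name="robots" content="noindex, follow" />
```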
-
RE: Is there a way to see all SEOMoz questions, comments, posts, responses that I've given a "thumbs up" to?
I'm on board for wanting such features.
-
RE: Google Docs Paranoia
Welcome to the tin foil hat crowd. I've often suspected Google of looking at all the data, including gTalk discussions even. Yet at the end of the day, with so many other things needing our attention, and until someone comes along and shows real data on how they actually used such info from any of their services to penalize someone's SEO, I personally choose to store such thoughts deep down in the far reaches of the "maybe" file bin. Just my own take on it.
-
RE: 4xx Client Error
then the only other time I've seen this happen was from inbound links that Google found on other sites. You can try searching Google itself, however you may or may not be able to discover where they exist.
-
RE: 4xx Client Error
Google picks up links several ways. If you are 100% sure these links are not in your own site somewhere (even on a single page, either intentionally or accidentally), check your sitemap.xml file, but be aware that they also pick up links that come from 3rd party web sites.
If you can't find the source, consider setting up a 301 redirect to resolve both of them, or alternately don't worry about them if you only have 2 listed and your site is otherwise doing well.
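If you go the 301 route on an Apache server, each stray URL can be mapped in .htaccess; the paths and domain below are hypothetical stand-ins for your two URLs:

```apache
# map each stray URL Google found to the page it should resolve to
Redirect 301 /old-page.html http://www.example.com/current-page/
Redirect 301 /other-old-page.html http://www.example.com/current-page/
```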
-
RE: Ashton Kutcher talks about SEO?
somebody probably tipped him off along the way about all the "evil people who scam the search engines with SEO"
-
RE: Does Google group similar phrases together as the same search phrase?
You're welcome Alan. Getting past the basics and understanding the nuances can make all the difference over the long term.
-
RE: Ashton Kutcher talks about SEO?
He apparently tweeted it at some point, then it appears he ultimately deleted it from his twitter stream. You can also see the tweet here and here. And here's a reference from two days ago with more info
-
RE: Does Google group similar phrases together as the same search phrase?
Google does group phrases to a certain extent. The search results are supposed to show exact matches first, then cascade down from there through "closest but not exact" matches.
Since that criteria then gets re-sorted based on Google's individual site authority and value criteria, the results can get jumbled.
This is why it's vital to consider the most likely way your unique market thinks about your services and how THEY would be most likely to conduct a search. Because it's better to go with the variation(s) you think most resemble or match that thinking.
You gave a difficult example when it comes to evaluating that issue, because to me, all but the first one are viable as real world people actually looking to hire web designers. The 1st one may be good, but seems a bit less likely. Only real data studied over time can narrow down the best choice in terms of conversion potential. Yet it's important to consider.
This is a great case for PPC ad testing: see which ads get the most clicks to a landing page, which of those visitors then actually fill out a form, and which of those forms come from someone ready to purchase rather than just exploring.
-
RE: 301 redirect to 1 of 3 locations based on browser languge? Is this ok?
If you feel you need to do this regardless of language, run a PHP if/then script that checks the browser language. Based on the language, you can then send the 301 right in the script:
<?php
// choose the destination folder via your if/then check of $_SERVER['HTTP_ACCEPT_LANGUAGE']
header( "HTTP/1.1 301 Moved Permanently" );
header( "Location: http://www.matchware.com/yourlanguagefolder/" );
exit;
?>
My question though is why you would not have at least one language be the default, and have that at the root home level (www.matchware.com/).
-
RE: Meta tags - better NOT to have?
In 2011, true SEO professionals couldn't care less what you put in your meta Keywords field, since that field has not been used as an SEO signal in a long, long time. All they need do is look at your page Titles. Or they can run a bot to scrape your page Titles, h1 tags, link URLs, internal and inbound link anchor text, and image alt attributes to generate a neat spreadsheet. All automatically.
-
RE: Pre-Launch SEO?
Advance market competitive analysis work is critical. Check out my SEO audit article series, especially part 4, the SEO Sweet Spot. It will give you an indication as to how much effort you'll need in the long haul and in what areas you need to focus on most.
-
RE: Does Google Places reward unique icons for multi location businesses?
It's important to have every physical location represented in Google Places. Ideally you should have a corresponding page on your site for each of those locations as well, that can be navigated to from your site home page through standard HTML links internal to the site (not through javascript or AJAX, etc).
You should submit a separate Google Places entry for each - use the bulk submission process. Each should have its own local phone number and each should be given maximum Google Places optimization treatment. The title for each should match your business name though - don't try to get a unique title into each individual location's Places entry.
The more actual locations you do this with, the more likely you will come up in exponentially more local searches.
Don't forget to follow this up with submitting them all to YP.com, SuperPages.com, Yahoo Local, Bing Local, etc.
-
RE: Why are search results different for 'Yahoo Search' and Powered by Yahoo search?
I believe they are, yes.
Here's another concept. Yahoo's own search results come from Bing. What would be the point of Yahoo displaying the exact same results as Bing though? So it's a known fact that Yahoo takes Bing's data and changes it for the main Yahoo search.
So in that same way, I am saying I believe Yahoo then goes even further and provides yet a different result set for "powered by Yahoo".
-
RE: Why are search results different for 'Yahoo Search' and Powered by Yahoo search?
It's kind of like having a different set of product descriptions for your own site vs. product descriptions you provide to shopping aggregation sites.
Why would you want to give away your best, most current, stuff? If you did, what would motivate people to go to your site? They don't TELL people that, but that's the reality of business.
-
RE: Expanding into different territories
Dylan,
Building up depth of unique quality localized content, combined with links and citations from sites that themselves are devoted to each location / area pointing to that localized content is vital. How much you'll need depends on the competition in each local market. Some might be easier to break into than others.
It's also easier to go after regions as opposed to individual cities or towns. So here where I live/work, it's "Bay area" and "San Francisco Bay Area", instead of just San Francisco or just Marin County.
-
RE: SEO sites were blasted by Panda
I'd like to say "you can only trust numbers if you have access to an individual site's analytics." Except every analytics program is suspect.
Oddly, because they're all suspect, I only use the data for spotting trends. Which brings us full circle to Google Trends. Apparently Google can't even spot accurate trends.
-
RE: SEO sites were blasted by Panda
I've never trusted compete.com data - every test I've done with their data has ranged from "wildly under-reported" to "wildly over-reported", with the occasional "lucky guesstimate" thrown in.
-
RE: On-Page Keyword Optimization Question
Yes, they give the content area the most weight in regard to individual page topical focus, however search engines do evaluate every word on a page, including content within the source view that visitors don't see. This is why having too much content in header, sidebar and footer areas, or too much code at the source view level causes topical confusion / topical dilution and is considered during the duplicate content evaluation process as well.
The best I can offer in regard to how often a phrase should appear on a page is to ask "does it feel like I repeat this phrase too much?" If you've got the same phrase repeated fifteen times just in the content area, there should be a valid reason other than SEO for that. And a LOT of text around those repetitions.
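There's no magic number, but if you want a rough gut-check rather than a feeling, a few lines of code can count how often a phrase repeats and what share of the copy it eats up. This is a hypothetical sketch, not any official formula:

```python
def phrase_density(text, phrase):
    """Return (occurrence count, percent of words the phrase accounts for).

    A crude whitespace-based check - punctuation handling and stemming
    are deliberately ignored to keep the sketch short.
    """
    words = text.lower().split()
    target = phrase.lower().split()
    n = len(target)
    # slide a window of the phrase's length across the word list
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == target)
    total = max(len(words), 1)
    return hits, round(100.0 * hits * n / total, 1)
```

If the percentage is high and the copy reads like it, that's your answer - no tool required beyond your own ear.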
-
RE: Are In bound links to a Custom URL ineffective
If you have multiple variations of www.domain.com/index.aspx linked, each one would, in my understanding of your situation, have one inbound link. One link pointing to a designated "unique" page is very rarely going to be able to provide any high value ranking.
In addition, each of those "pages" would be indexed and seen as a unique page by search engines. Since the links would only come from 3rd party sites and NOT from within your site, they'd be even weaker.
So if you prefer to focus on SEO for your site, then links should point to your home page without the index.aspx page name in the URL.
-
RE: How deep should I go with a directory site?
For the past couple of years, industry thinking had it that architecture should be as flat as possible. I never held that belief, and all my clients who employed the "2 - 3 layers, 4 at most" policy sailed through the Panda update.
The issue is as much about deep internal linking as it is about topical organization. A major signal about topical relationships is folder keyword seeding, along with breadcrumb navigation, supported by inbound links pointing to those 2nd and 3rd layers.
-
RE: URL Length
Glad to share what I've found. I'm blessed in that I get to experiment with methods on several sites that get millions or tens of millions of visits each month.
-
RE: URL Length
Yes from that perspective, you're correct. Links outweigh folder relationships as signals because links are the "almighty final say".
Of course, if you've got them all in the same folder, then you should also have a sub-nav unique to that folder where each of those pages has the links to each of the other pages.
Then throw in microformat encoded breadcrumbs and that combination is the most potent.
-
RE: What should I set my domain setting to?
It's based on whether you want your site to be associated with the www version or the non www version. So if you routinely promote your site with it, focus on inbound links with it, and have that as your default user experience (such as with setting up a redirect from the non-www version) then that's what you should set in GWT.
It's essentially a way for Google to help properly assign the value of non www inbound links to the www version if that's what you designated.
Personally I no longer use www versions and recommend clients skip them as well, because that's four fewer characters to have to fit into ads, on business cards, and to type...
The most important factor though is to ensure you've got 301 Redirect functionality set up from the one you don't want to have used. GWT just adds a bit more value if you set it there.
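On Apache with mod_rewrite enabled, that 301 from the unwanted hostname is a short .htaccess rule; example.com is a placeholder, and this version sends non-www to www (swap the pattern and target to go the other way):

```apache
RewriteEngine On
# send non-www requests to the www version with a 301
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```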
-
RE: URL Length
Link structure vs. folder structure? Not sure I understand - links point to content. Content is located in folders (or just the root folder). Can you clarify?
-
RE: SEO sites were blasted by Panda
yeah unless we can get input from site owners, I am not convinced those are reflective of actual traffic. Just not sure.
-
RE: URL Length
Let me add the reason for the answers you got.
A) Distinct organization of content is critical to communicating topical relationships and topical separation to search engines
B) URLs of this type provide 1 of the many ways Users can understand clear relationships of content
C) if you go too far down the rabbit hole, it becomes too diluted for search engine value
D) if you go too far down, it crosses the line into confusion for users
-
RE: SEO sites were blasted by Panda
And so much for BlackHats having an upper hand in the game
http://trends.google.com/websites?q=blackhatworld.com&geo=all&date=2011
Or those who believe it's all about backlinks
http://trends.google.com/websites?q=backlinksforum.com&geo=all&date=2011
-
RE: Google not providing all competitor site's external incoming links?
Google has communicated that the list of links shown during a link: operation are a sampling only, and in fact just as likely to show low value links as high value links.
As for Yahoo, I've compared their link: operator to OpenSiteExplorer.org and the Yahoo result is consistently much smaller than what I get from OSE.
Then again - even with OSE, it's not 100% perfect either.
Just as in any analytics you work with, best practices state that you should a) use at least two sets of data for your evaluations and b) use data mostly to observe and spot trends rather than trust data as being 100% accurate.
-
RE: Targeting Keywords from the homepage
Garry,
In my experience maximum opportunities come from assigning no more than two or at most three primary phrases per page, then including up to three or at most four very closely related additional phrases within the body. Those additional phrases go to support the primary phrases mostly, however the total combination eventually leads to exponential long-tail opportunities that you don't need to figure out on your own if the content is really well written for the user.
Trying to get more than that in there almost always leads to topical dilution.
-
RE: How would you deal with eCommerce sorts?
I'd expand on Ryan's suggestion just to say that you need to be aware that Google continues to work on figuring out how to tap into AJAX-accessed content. They may be bad at it now (they are), yet one day they could eventually figure it out, at which point the duplicate content issue comes back.
It might not be in the near future - maybe never, yet it's something to be aware of.