referral or affiliate, you set yourself up to be penalized if the links and anchor text aren't natural. You're actually better off with the 302'd skimlinks than hundreds of straight links with the same anchor text. I know this isn't what you're looking for but read through this from Yoast: http://yoast.com/cloak-affiliate-links/
Posts made by Chris.Menke
-
RE: Organic Links and Skimlinks Affiliate Program
-
RE: Organic Links and Skimlinks Affiliate Program
What is it that you want to get out of the non-skimlink links, and what is it that the skimlinks are doing that you don't like?
-
RE: Organic Links and Skimlinks Affiliate Program
You won't be getting any link juice through those links but you shouldn't be looking for any from your affiliates either, as best practice for aff links is that they are not followed links.
-
RE: How is this possible? A 200 response and 'nothing' to be seen? Need help!
I got the page, but it was slow to load. Maybe it's timing out for you, which could be an issue with the quality of your host.
-
RE: Meta tags
Google will crawl your site at an interval it sees fit based on the authority of your site and the dynamics of your content. Sites with high authority and constant changes like this one are crawled constantly, while new sites may only be crawled once every month or more--even if you submit them to Google. The crawl rate choice in GWT pertains to how quickly Google will go through your site at each crawl--if the crawl rate is increased, it can bog down some web servers, but it will not decrease the interval between crawls as you're hoping.
At this point, be sure you have the meta tags as you want them and continue work on your other on-page issues and work on building your authority by social networking, creating content, authorship, and back links. There's not much that happens quickly in SEO.
-
RE: SERP position confusion
Rand did a Whiteboard Friday on Why competitors may rank higher when you have higher moz scores than they do.
Essentially, it pointed to these factors:
* Poor search result snippet
* Perceived lower value brand
* Lower design/user experience value
* Citations (links, social shares, mentions): quality, quantity, variety, acceleration rate
* Usefulness or quality of content: searcher intent fully addressed, unique value of content
* Results biasing: local, mobile, verticals
But the 15,000 back links that you mention smack of manipulation, and such a page may have a hard time getting up in the results. It's not the number of links your page has but the quality of those links. Have you investigated what kind of links those are?
-
RE: Punctuation at the Start of Page Titles
Normally, I would say don't do it because wasting character space in the title area is a pet peeve of mine but maybe it helps you with click through--maybe not. At number 10, it doesn't seem to be hurting your rankings but maybe it is-- have you tried it without the exclamation point to see if your result moves up?
-
RE: Too many page links warning... but each link has canonical back to main page? Is my page OK?
Webjobz,
The crawl diagnostics summary warning for too many links occurs at 100 links on the page and is based on this info: http://moz.com/blog/how-many-links-is-too-many. You're not likely to be penalized but there is science behind the number so you should look to be more frugal with the number of followed links on your page.
-
RE: Why won't my sub-domain blog rank for my brand name in Google?
By the way, has the blog ever ranked for instabill?
-
RE: Why won't my sub-domain blog rank for my brand name in Google?
Not knowing how much traffic the blog gets or how much of that converts to new business, you might try experimenting with some things.
If you're thinking of moving the site, you might first see if your problem is domain related by just copying and rel=canonicalizing the whole blog to the new domain and seeing if it eventually shows up for "instabill", while keeping the original in place. If it does, you can then put in your 301s and eventually delete it. If it doesn't then it's likely architecture or content related. Personally I think it's content related.
If the new domain doesn't rank for instabill, delete the blog software on the new domain, remove the rel=canonicals to it from the old domain, throw up a WordPress blog, and put a couple of posts on it with a link from your website's homepage and see if that works.
-
RE: Authorship showing in SERPs for non-blog pages
Yes, the tool will show if your business is using publisher markup.
-
RE: Authorship showing in SERPs for non-blog pages
Actually, StuBabble, any page can contain authorship markup and may show up in the search results with an author's head shot. You can use Google's Structured Data Testing Tool to verify that the markup is installed and how the snippet will look for any of your web pages.
-
RE: Duplicate pages with http and https
Diana,
1. There are good reasons to limit your 301 chains, especially regarding preserving link juice, but you are OK chaining up to three 301s, as Matt Cutts describes here: http://www.youtube.com/watch?v=r1lVPrYoBkA
2. Yes, you can canonicalize those pages to the http version to bring them back into the search results instead of the https version. If you can 301 everything but the payment pages, you could use that method too.
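If you want to sanity-check how many hops your redirects go through before committing, here's a rough sketch--the URLs and the pluggable `fetch` function are made up for illustration; in practice you'd back `fetch` with a real HTTP client that doesn't auto-follow redirects:

```python
from urllib.parse import urljoin

def redirect_chain(url, fetch, max_hops=10):
    """Return the list of URLs visited until a non-redirect response.

    fetch(url) must return a (status_code, location_header_or_None) tuple.
    """
    chain = [url]
    for _ in range(max_hops):
        status, location = fetch(chain[-1])
        if status in (301, 302, 303, 307, 308) and location:
            # Location may be relative, so resolve it against the current URL.
            chain.append(urljoin(chain[-1], location))
        else:
            break
    return chain

# Canned responses standing in for real HTTP calls (hypothetical URLs):
responses = {
    "http://example.com/page":      (301, "http://www.example.com/page"),
    "http://www.example.com/page":  (301, "https://www.example.com/page"),
    "https://www.example.com/page": (200, None),
}
chain = redirect_chain("http://example.com/page", responses.__getitem__)
print(len(chain) - 1, "hops:", " -> ".join(chain))  # 2 hops here
```

Two hops is comfortably inside the roughly-three-chained-301s that Matt Cutts describes as safe; if the count creeps higher, collapse the intermediate redirects so they point straight at the final URL.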
-
RE: Brand Search Results- how do you make sure spammy links don't hurt your brand
How long has that site been ranking for your brand and how long have you been at the reputation management? If your brand's site has some authority and it is linking out to 10 of your brand's well established social profiles and you have at least a couple of content pieces focused on your brand name published somewhere online, you should be able to push a weak resource down off of page 1 in a relatively short amount of time.
-
RE: What makes a "perfectly optimized page" in 2013?
For example, social sharing links and authorship markup are good additions to a well-created landing page. While they may not technically be on-page factors, they are things that you implement on-page that can have a big impact on the overall strength of the page.
-
RE: Duplicate Page Content - Shopify
According to these, they will always show as duplicates in the report (but won't count against you in your search engine results).
-
RE: Weird SERPS
Ah huh, I misunderstood. If the content is good and unique, and the only duplication is as you describe then it wouldn't seem to be a duplication issue. However, it's not unusual to see such drastic changes in ranking from time to time and that page could pop back up to its previous position at any time.
When you say you haven't done any link building, does that mean you don't have any natural back links either? If that's the case I'd recommend you do some work to build your authority.
-
RE: Weird SERPS
I might see that as a portent of things to come for your other pages. I mean, from what you describe, none of those pages should be ranking due to either thin and/or duplicate content--Panda type stuff. Sorry to say, there's really no reason for you to believe your rankings for the other pages will hold up unless you get original content on those pages.
-
RE: Duplicate Page Content - Shopify
From the issue you described, the rel=canonical is still the right choice.
According to the Moz documentation on Fixing Crawl Diagnostic Issues: "Keep in mind that canonicals will stop the pages from ranking against each other, but they will still show up as duplicate content from a UI perspective, so we will still count them as duplicate."
Also from Moz documentation on How does Rogerbot calculate duplicate content?: "Two documents are considered duplicates if they have a 95% overlap. Furthermore, we should not count duplicates across pages that specify one or the other as the canonical version. The canonical version should not recognize the other as a duplicate, and the other version should not recognize the canonical as a duplicate."
-
RE: Agency footer link, do we keep it ?
While those footer links back to the agency that created the website used to be commonplace and sought after, it's shaky ground today--even just one or two no-followed links from a client's site could get you in trouble--in the future.
Today, a followed link or two from the footer of a client's home page and another from an interior page (each link having different, non-exact-match anchor text) may still provide your site some advantage--and that's the problem. Using those links to your advantage may help your authority in the present, but you shouldn't be surprised if that authority is stripped at some point in the future. You know what it looks like when authority is stripped? A penalty.
If you are still getting a followed footer link or two from each of your clients, that shouldn't be your only means of building authority--it should be a supplemental means. You really need to work just as hard as any other site in any other industry to build editorial links back to your site in order that you not suffer an authority adjustment down the road.
The best practice is to nofollow those links in order to prevent them from giving you problems in the future.
-
RE: Best way to contact webmasters for link building
Hey Rob,
Michael King just did an interesting Whiteboard Friday a couple of days ago that relates directly to your question. If you haven't seen it already, it's worth a watch...
-
RE: Any thoughts on how to reverse engineer a well-performing page?
Bryan,
First thing to do is to go to Google Analytics and Google Webmaster Tools and check out the keywords that the page is bringing in search traffic for and compare that against the other pages of the site.
Then run a report in Open Site Explorer and look at the back links that page has, compared to the back links pointing to other pages.
Then, run the On-Page Optimization report on your pages and see if there are substantial differences in the optimization of your money page compared to your other pages that are not bringing in traffic.
Then go to the Google Keyword Tool and research the keywords that your other pages are optimized for and how much traffic you should expect if you were in the top position for them, and compare that with how much traffic it says you should be getting for the page that's bringing in all your traffic.
That will start to give you an idea of how that one page is different from the others on your site.
-
RE: Google Local Places and Organic Listing?
AWCthreads,
Be sure to take into account that Google algorithmically places a certain subset of local results on the first page of a local-oriented search, thus some computational logic goes into prioritizing those results, making them much different than a directory.
Check this out: http://www.davidmihm.com/local-search-ranking-factors.shtml
-
RE: Google Local Places and Organic Listing?
If it is even possible for your specific search, work on optimizing a second page for the keyword/geo-modifier search. It should contain solid vocabulary referring to the product/service as well as the location. Make the page a strong resource for those who would be searching for a combination of those terms. Local references and back links to that page will benefit you a lot, as will outbound links to local external sources.
Check out Dr. Pete's recent post on determining the make up of page-1 organic/blended/local search results. I highly recommend that you go through the exercise that he lays out--it's very informative.
-
RE: Google Local Places and Organic Listing?
Yes it is, if it is a blended result and the local/blended results show at the top of the page. It's also possible for two different pages to rank--with the home page showing in the local results further down on the page and a second page showing algorithmically in the #1 organic spot. I've seen this a number of times.
-
RE: What makes a "perfectly optimized page" in 2013?
I'd say, as a testament to how well it was put together, it is still a very good guide for SEOs to follow, even 4 years later (a long time in our field). There may be additional information that's useful today, but you won't go wrong following what's on that page.
-
RE: Does google know every time you change content on your page
Yeah, it seems you may be a bit more focused on a symptom, rather than on a cause. Authority is a function of your link profile and is Moz's interpretation of PageRank. The greater your authority, the more often you get crawled, and the greater your opportunity to rank higher for more search queries.
-
RE: Does google know every time you change content on your page
The frequency Google crawls your site/pages tends to be based on authority, rather than on how often it is updated. So if your authority dictates that your page gets crawled on a weekly basis and you change the content on a daily basis, not all of that content will get indexed.
You can see when a page was last crawled by using this search: cache:example.com/page--you'll get the last crawl date at the top of the page. You can also look through your server log files to see crawler activity.
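If you go the log-file route, a script like this rough sketch can pull out the latest crawler hit per URL--the log lines below are made up, it assumes the common/combined Apache log format, and it takes the "Googlebot" user agent string at face value (real audits should verify the bot via reverse DNS, since the UA can be spoofed):

```python
import re

# Captures the timestamp and request path from a common/combined-format log line.
LOG_LINE = re.compile(r'\[(?P<when>[^\]]+)\] "(?:GET|HEAD) (?P<path>\S+)')

def last_googlebot_crawl(lines):
    """Map each requested path to the timestamp of its latest Googlebot hit."""
    latest = {}
    for line in lines:
        if "Googlebot" not in line:
            continue
        m = LOG_LINE.search(line)
        if m:
            latest[m.group("path")] = m.group("when")  # later lines overwrite earlier
    return latest

sample = [
    '66.249.66.1 - - [12/Aug/2013:06:25:24 +0000] "GET /about HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [13/Aug/2013:07:10:02 +0000] "GET /about HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
    '203.0.113.9 - - [13/Aug/2013:08:00:00 +0000] "GET /about HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]
print(last_googlebot_crawl(sample))  # {'/about': '13/Aug/2013:07:10:02 +0000'}
```

Run against a few weeks of logs, the gaps between each page's hits give you the actual crawl interval, which you can then compare against how often you're changing content.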
-
RE: Should I consolidate multiple domains to a single site with 301 redirects?
smdesign,
It can be a good idea to consolidate from different domains into directories if there is a business case for it and the overlap between the content isn't extensive. There may not be an overriding reason to do so, though--especially if the domains are fairly mature and have strong back link profiles. If the decision has been made to do it, go through the analytics and the back links, verify which pages to be moved are getting search traffic and which ones have external links, and be sure that you're at least 301ing all of them to the new pages on the consolidated site. The more redirects you put in place, the sooner your new pages are likely to get crawled.
Google has published some suggestions on this here: https://support.google.com/webmasters/answer/83105?hl=en and Moz has a good post on dealing with your redirects here: http://moz.com/learn/seo/redirection.
-
RE: Crawl errors: 301 (permanent redirect)
Lauren, you should note that the 301 redirects are "Notices" in the Crawl Diagnostics Summary and not actually warnings. As noted in the report, Notices are interesting facts about your pages that Moz found while crawling your site.
As Bereijk stated, your redirects from URLs without trailing slashes to ones that have them are fine. As a point of information, however, such redirects are not necessary, as either version (but not both at the same time) is acceptable, but you do want to be consistent in how you deal with them on your site.
Your redirects from ugly URLs to "seo friendly" URLs are also fine.
The "Warning" regarding Too Many On-Page Links trips at 100. You might call this a soft threshold, as there is no hard rule held up by Google as to what is actually too many, but it's been suggested that the lower a page's authority, the fewer links are recommended. Dr. Pete wrote a post on this here: http://moz.com/blog/how-many-links-is-too-many
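For a rough sense of what trips that warning, here's a minimal sketch that counts the anchor tags on a page against the same 100-link threshold the report uses (the page HTML is made up, and a fuller version would also distinguish followed links from rel="nofollow" ones):

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Counts anchor tags that actually carry an href attribute."""
    def __init__(self):
        super().__init__()
        self.links = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.links += 1

def too_many_links(html, threshold=100):
    """Return (link_count, over_threshold) for a page's HTML."""
    counter = LinkCounter()
    counter.feed(html)
    return counter.links, counter.links > threshold

# A made-up page with 120 links trips the warning:
page = "<body>" + "".join(f'<a href="/p{i}">link {i}</a>' for i in range(120)) + "</body>"
print(too_many_links(page))  # (120, True)
```

Running something like this across your templates (header, footer, sidebars included) usually shows most of the count comes from boilerplate navigation rather than body copy, which is where the trimming should start.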
-
RE: Duplicate Content for Men's and Women's Version of Site
As Matt said, boilerplate stuff is a tossup but your about page should be rankable for some unique facet of your business and for that reason, I'd be sure to 301 (if the pages are different) or canonicalize (if the page content is the same) to your preferred version.
-
RE: PR 6 Redirect to a brand new domain name
Yiannis,
The client will lose the value those external links (if there were any) brought to the internal pages of the old site. Those deep links helped those internal pages rank better against competitors.
Some may also say that the value that the 301'd links bring to the home page may not be as strong as the value they used to bring to the originally linked-to pages because the relevancy of the linking page and the linked-to page may not be as close as it used to be.
Indexation of the new pages may take longer than if the 301s had been created to those new pages.
-
RE: Ranking Factors
Even with the right name, you may not be able to reproduce what you did with mosquito, due to algorithm changes pertaining to exact-match keywords--that is, unless you happen to stumble on the perfect name/keyword combo (I suppose that's what you're hoping for). Today, however, if you're coming out with a hot new product, focus your SEO and social networking on the brand/name. While you may not be able to take advantage of the keyword helping the product achieve viral status, you can still take advantage of social media to build and capitalize on engagement around content you create for the product. Focus on brand awareness over keywords today.
-
RE: How to rank in Google Places
Danny Dover at Moz did a whiteboard Friday that goes through the basics and is a good resource: http://moz.com/blog/the-basics-of-local-seo-whiteboard-friday
There are also several good blog posts on the topic that will help, including this one: http://moz.com/blog/40-important-local-search-questions-answered
And David Mihm, now at Moz, put together a list of ranking factors that you should be aware of: http://www.davidmihm.com/local-search-ranking-factors.shtml
-
RE: How to rank in Google Places
Jason, there are different algorithms in use for the organic listings and the local listings. The local listings aren't (as) dependent on PR (DA) as the organic listings.
Borehamhouse happens to be the strongest of the local results, and since local results in this search show at the top of the page (they show up differently depending on the search query), it makes it to the top of the page via the local search algorithm.
quendonpark and fennes are being held back from the top of the local results for any number of reasons, but because they have sufficient organic authority and relevance, they make it onto page one via the organic algorithm. Due to poor performance according to the organic algorithm, however, Borehamhouse doesn't even show up in the first few pages of the organic results, even though it's at the top of the local results.
-
RE: GWT Duplicate Content and Canonical Tag - Annoying
As long as you implemented the rel=canonical tags correctly, it should happen the next time the page is crawled, but don't be dismayed that the data isn't yet showing up in your GWT, as delays of 7 days or more are not unheard of.
-
RE: Is this ok for content on our site?
Personally, I wouldn't put 1200 words there; it would look awkward. If you were to create additional text, why not just put it in the body--it would look more appropriate to the user. If you really wanted to put something there, maybe think about putting some of your clickable gallery images there to give visitors an opportunity to engage a bit more with the page.
-
RE: Varying Internal Link Anchor Text with Each New Page Load
What comes to me is this: I don't think you'll get the value out of links with dynamic anchor text that you would get with static anchor text. A page's overall value, and the value it passes on to other pages via links, is iterative--it's not assigned after just a single pass of the bot. The dynamism would devalue the links, if not render them worthless altogether.
And even if you had one thousand variations of anchor text for each link and they did pass some sort of value, what do you think that footprint would look like after a year or two of Google crawls? Upon a manual review, someone there would say, "Huh, look at this, their links change all the time and each one is focused around a specific money term--I think it's obvious that they're trying to manipulate their rankings. Smack--here's a penalty for you."
-
RE: Missing meta descriptions from Google SERPs
No, as long as you're not also spamming your page copy with your specific target keywords. Nonetheless, it doesn't look very professional.
-
RE: Link worth?
I agree with Lynn. In fact, you may have already boosted the authority of your site by ditching those 1000 links--without even adding any more links.
My thoughts on the second list are that they're not all of the same value. Designwebkits doesn't look bad to me, but the others are not a whole lot more valuable than the first, other than that 1000 links from sites like these might be less likely to get you penalized. I say that not necessarily because of the PR of those sites (because even low-PR sites can be very good links and/or grow into very good links) but because they just have an air of the type of generic sites that don't really have a brand, personality, or real reason for existing other than to help some other site rank better in some way. Links from sites like those aren't ones that you can hold up in front of a group of your peers and proudly announce your site has--and if you can't do that, their value to you is dubious.
That's my take on them without running an OSE report on any of them.
-
RE: Missing meta descriptions from Google SERPs
More relevance while leaning towards click through rate/call to action. It's a balance.
-
RE: Occurrences of Keyword - Report card
I wonder if you might be thinking that placing keywords in the meta keywords tag is what you are supposed to be doing. Be sure that the keywords you are optimizing for are in the body of the page, not just in the meta tags.
Also as a matter of semantics:
- Google indexes a page regardless of what the page was optimized for (provided, as you mention, it's not prevented from doing so). You can check if your page is indexed by typing this search: site:[your domain.com] or site:[your domain.com/specific_page]
- Google provides results for pages that are optimized for specific keywords and that it determines are the most appropriate for the search query.
If you haven't done so yet, check out The SEO Guide From Moz for more helpful info.
-
RE: Missing meta descriptions from Google SERPs
suchde, as premiooscar says, it is not a given that Google uses your meta description in the snippet. In Google's own words, "Google will sometimes use the meta description of a page in search results snippets, if we think it gives users a more accurate description than would be possible purely from the on-page content." In fact, for each different search query your page comes up as a result for, you can have a different snippet--it could be fully from the meta description, fully from the page copy, or a combination of page copy and meta description info.
-
RE: On-Site Directory - Delete or Keep?
Without knowing more than what you've written, I'd first ask if those directories have any back links going to them, but before you'd be able to answer, I'd say just delete the darn things--they're not helping you. 1. If they've got back links going to them, those back links are likely to be of low quality, and by deleting the directories, you negate those links. 2. Even if they don't have a pile of low-quality back links, it sounds like they're poorly curated and not doing anyone any good. 3. You've got a Google penalty on both sites, you've got a directory on both sites, you've got to clean house--just get rid of them.
If the sites in the directory are industry specific but you can't vouch for most of them and your sites aren't about being a resource for people trying to find those industry-specific sites, just get rid of the darn things.
-
RE: Simple Wordpress Question regarding Footer Link
It could be a footer widget. If you go to the Appearance | Widgets section, you may find a footer widget section--look in there for your code. It may also be in the settings section of your theme's options--be sure to look through that section too.
-
RE: Impact of rogue keyword in content
If you have a landing page other than your homepage ranking for a term for which most of the competitors' results are their homepages, then I'd say your page is quite well zeroed in. Be sure to analyze and document specifically what you're doing on that page before making changes so you can go back to it if necessary. At this point, it's probably authority that's going to push you higher in the rankings. Even one or two links to that landing page from good resources will pay high dividends for you.