I would like opinions on Brian Dean's training courses and his advice -- is it useful?
-
I would like opinions on Brian Dean's training courses and his advice -- has anyone used them successfully? Is the training worth the cost, and is the advice useful in practice?
-
I have taken his "SEO That Works" class and it is definitely worthwhile if you are serious about content-driven SEO. His methods definitely take an investment of time (and money if you are outsourcing) but the results he shows are pretty amazing. The SEO That Works class is also great if you are concerned about legitimate, white-hat ways to get authority backlinks.
Related Questions
-
How good or bad are exit-intent pop-ups? What is Google's perspective?
Hi all, we have launched exit-intent pop-ups on our website: a pop-up appears when a visitor is about to leave. It triggers when the mouse moves toward the top of the window, i.e. when the visitor appears to be about to close the tab. We have seen a slight ranking drop since launching it. Because the pop-up appears just before someone leaves the site, could Google be interpreting this as the user leaving because of the pop-up, and penalizing us for it? What are your thoughts and suggestions? Thanks
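(For illustration: the trigger described above is typically just a `mouseout` listener that fires once when the cursor crosses the top edge of the viewport. A minimal sketch in browser TypeScript; `showExitPopup` and the `exit-popup` element are hypothetical placeholders, not the actual implementation.)

```typescript
// Minimal exit-intent trigger: fire once when the cursor leaves the
// document through the top edge of the viewport (toward the tab bar
// or close button).
let popupShown = false;

document.addEventListener("mouseout", (event: MouseEvent) => {
  // relatedTarget is null only when the cursor leaves the page entirely,
  // not when it moves between elements inside the page.
  if (popupShown || event.relatedTarget !== null) return;

  // A small clientY means the pointer exited through the top edge.
  if (event.clientY <= 10) {
    popupShown = true;
    showExitPopup();
  }
});

// Hypothetical helper: unhide a modal that already exists in the markup.
function showExitPopup(): void {
  document.getElementById("exit-popup")?.removeAttribute("hidden");
}
```

(The `relatedTarget === null` guard keeps the popup from firing on ordinary in-page mouse movement, so it only appears at the moment described above.)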
-
How authentic is a dynamic footer from a bot's perspective?
I have a fairly meta-level question. I was working on a dynamic footer for the website http://www.askme.com/ (you can check it in the footer). If you refresh the page, you'll see a different combination of links in every section. I'm calling it a dynamic footer because the values are fully dynamic. **Why are we doing this?** For every section in the footer we have X links, but we can show only 25 links per section, and X can be greater than 25 (say X = 50). So I randomize the list of entries for a section and pick 25 elements from it, i.e. a random 25 elements from the list every time the page is refreshed. **Benefits from an SEO perspective?** This exposes all the URLs to bots (across multiple crawls) and adds a page-freshness element as well. **What's the problem, if there is one?** I'm wondering how bots will treat this, since at any given time a bot might see us showing it different content than we show users. Will a bot consider this cloaking (a black-hat technique)? Or will it not, since the data is re-randomized on every request, even if it is a bot hitting me twice in a row to check what I'm doing?
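(For illustration: the selection described above is essentially "shuffle, then take the first 25". A minimal sketch of that server-side step using a Fisher-Yates shuffle; the `FooterLink` shape and the 50-item example list are hypothetical placeholders.)

```typescript
interface FooterLink {
  url: string;
  label: string;
}

// Fisher-Yates shuffle over a copy of the list, then take the first
// `count` entries. The source array is left untouched.
function pickFooterLinks(all: FooterLink[], count = 25): FooterLink[] {
  const pool = [...all];
  for (let i = pool.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [pool[i], pool[j]] = [pool[j], pool[i]];
  }
  return pool.slice(0, count);
}

// Example: X = 50 candidate links, 25 rendered per page load.
const candidates: FooterLink[] = Array.from({ length: 50 }, (_, i) => ({
  url: `/section/page-${i + 1}`,
  label: `Page ${i + 1}`,
}));
console.log(pickFooterLinks(candidates).map((link) => link.url));
```

(One point relevant to the cloaking worry: the 25 links are drawn at random on every request, and nothing in this logic branches on who is asking, which is the key difference from classic cloaking, where bots are deliberately served different content than users.)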
-
Creating pages as exact-match URLs - good, or an over-optimization indicator?
We all know that exact-match domains are not getting the same results in the SERPs since the algorithm changes Google has been pushing through. Does anyone have experience with, or know whether, the same applies to an exact-match URL page (not domain)? Example:

keyword: cars that start with A

Which is the better way to create your pages on a non-exact-match domain: www.sample.com/cars-that-start-with-a/, which has "cars that start with A" as the page title, or www.sample.com/starts-with-a/, which again has "cars that start with A" as the page title?

Keep in mind that you'll add more pages that start exactly the same way, since you want to cover every letter of the alphabet. So:

www.sample.com/cars-that-start-with-a/
www.sample.com/cars-that-start-with-b/
www.sample.com/cars-that-start-with-c/

or

www.sample.com/starts-with-a/
www.sample.com/starts-with-b/
www.sample.com/starts-with-c/

Hope someone here in the Moz community can help out. Thanks so much!
-
Advice on links after Penguin hit
Firstly, we have no warnings or messages in WMT. We have racked up thousands of anchor-text links. Our fault: we didn't nofollow them, and some of the many CMS sites we run replicated the links sitewide, to the tune of 20,000 links. I'm in the process of removing the code that causes this in most of the culprit sites, but roughly how long will it take for a crawl to recalculate the links? WMT still shows the link count increasing, but I think this is retrospective data; after the next crawl we should see a more accurate count.

We also provide some web software that has been used by many sites, and Google may consider our followed anchor text a violation of its spam rules. So I ask: if we were to change the link text to our URL only and add nofollow, would that improve the spam issue? We could have as many as 4,000 links per website, as it is a calendar function that lists all dates into the future, and we would of course like to retain a link to our website for marketing purposes. What we don't want is sitewide link spam again.

Some of our other links are low quality; some are okay. We have lost rankings, probably due to the low-quality links and the overuse of anchor text. Has Google just devalued the links algorithmically, or is there an actual penalty making the rankings drop? Since we have no warnings in WMT, I feel there is no need to remove the lower-quality links, and in most cases we have no control over the link placements anyway. Should we just make sure we have a better linking profile going forward? And if Google really does require removing spam links, doesn't that open the door to negative SEO?
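(For illustration: the change being weighed, URL-only anchor text plus rel="nofollow", is a small edit wherever the calendar widget prints its link back to you. A minimal sketch; `renderCreditLink`, the example.com URL, and the `calendar-footer` element are hypothetical placeholders.)

```typescript
// Hypothetical link builder for the embedded calendar widget.
// Before: keyword-rich anchor text in a followed, sitewide link.
// After: the bare URL as anchor text, marked nofollow.
function renderCreditLink(): string {
  const href = "https://www.example.com/"; // placeholder for your own site
  return `<a href="${href}" rel="nofollow">${href}</a>`;
}

// Render the link once per page rather than once per calendar date,
// so each embedding site stops generating thousands of links.
const slot = document.getElementById("calendar-footer");
if (slot) {
  slot.innerHTML = renderCreditLink();
}
```

(Note that rendering once per page addresses the 4,000-links-per-site volume problem independently of the nofollow attribute.)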
-
Will my association's network of sites get penalized for link farming?
Before beginning, I found these similar topics here: http://www.seomoz.org/q/multiple-domains-on-same-ip-address-same-niche-but-different-locations and http://www.seomoz.org/q/multiple-domains-on-1-ip-address. We manage over two dozen dental sites that are individually owned throughout the US. All of these dentists belong to a dental association, which we also run and on which they are featured (http://www.acedentalresource.com/). Part of the association's core mission is sharing information to make them better dentists and to help their patients, which, in addition to their education, is why they are considered some of the best dentists in the world. As such, we build links between the sites from what we consider to be valuable content. Some sites are on different IPs and C-blocks; some are not. Given that each site promotes only the dentist at that brick-and-mortar location but also has followed links to other dentists' content in the network, we fear we are in a grey area of link-building practice. Our questions:

1. Is there an effective way to utilize the power of the network if quality content is being shared?
2. What risks are we facing, given our network?
3. Should each site be on a different IP?
4. Would having some of our sites on different servers make our backlinks more valuable than having all of them on the same server?
5. If unique IPs turn out to be best practice, would it be obvious that we made the switch?

Keep in mind that ALL sites are involved in the association, so naturally they would link to each other and to the main resource website mentioned above. Thanks for your input!
-
Question #1 - My Cherry's Popped!
I recently acquired the rights to a URL that is one of our keywords. Instead of developing a landing page at that URL and then only linking it back to the company root, I was thinking of adding a link in the company's global nav that points to this new URL (and new page content, of course). Are there any pros or cons to doing it that way? Thank you so much!
-
Anchor text penalty doesn't work?!
How do you think the anchor-text penalty actually works? Keyword domains obviously can't over-optimize for their main keyword (for example, notebook.com for the keyword "notebook"), and a lot of non-keyword domains optimize heavily for their main keyword, especially in the beginning, to get a good ranking in Google (and it always seems to work). Is there a particular threshold (a number of links) I can reach while optimizing for one keyword, after which I'll get a penalty?