I would like opinions on Brian Dean's training courses and his advice -- are they useful?
-
I would like opinions on Brian Dean's training courses and his advice. Has anyone used them successfully? Are they worth the cost, and are they useful?
-
I have taken his "SEO That Works" class, and it is definitely worthwhile if you are serious about content-driven SEO. His methods take a real investment of time (and money, if you are outsourcing), but the results he shows are impressive. The class is also a good choice if you want legitimate, white-hat ways to earn authority backlinks.
Related Questions
-
Using PURL.org/GoodRelations for Schema Markup
Hello awesome Moz community! Our agency uses JSON-LD for our local business schema markup, and we validate it with Google's Structured Data Testing Tool. All good! Recently, I discovered a competing agency using JSON-LD markup similar to ours (that's OK) alongside "http://purl.org/goodrelations" markup. The latter appears to be, potentially, black-hat SEO. Why? According to Moz, "there is no conclusive evidence that this markup improves rankings." But the purl.org markup has provided an opportunity for keyword stuffing: the agency has stuffed 66 repetitions of the same keyword into markup that still validates (see the sketch below). I would love feedback from the Moz community. Can schema markup of any kind be used to keyword stuff? If so, why aren't sites being penalized for it? Is this practice flying under the algorithm's radar? Thanks! Your feedback, insight, and snarky remarks are welcome 🙂 Cheers!
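For illustration, here is a minimal sketch of what a stuffed property can look like. Every name and value below is hypothetical, not taken from either agency's actual markup:

```json
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Agency",
  "description": "web design, web design agency, web design company, web design services, affordable web design, best web design",
  "url": "https://www.example.com/"
}
```

Markup like this still validates, because testing tools check syntax and vocabulary rather than whether the text reads like something written for users, which is part of why stuffing can go unflagged.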
White Hat / Black Hat SEO | SproutDigital
-
Controlling crawl speed/delay through dynamic server code and 503s
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst offender (it's mainly bingbot and ahrefsbot), they cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts other sites' performance. The problem is that I want a solution that 1) is centrally managed for all sites (per-site administration takes too much time), 2) takes total server load into account instead of only one site's traffic, and 3) controls overall bot traffic instead of one bot at a time. In my opinion, user traffic should always be prioritized above bot traffic. I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support it. Although my custom CMS can centrally manage robots.txt for all sites at once, bots read it per site and per bot, so it doesn't solve 2) and 3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (not other bots) and is administered per site. So nothing solves all three problems. Now I've come up with a custom-coded solution that dynamically serves 503 HTTP status codes to a portion of the bot traffic. Which portion, and for which bots, can be calculated at runtime from the total server load at that moment. So if a bot makes too many requests within a certain period (or whatever other rule I invent), some requests get a 503 while others get content and a 200, as sketched below. The remaining question is: will dynamically serving 503s have a negative impact on SEO? Yes, it will delay indexing, but slow server response times also hurt rankings, which is arguably worse than indexing latency. I'm curious about your expert opinions...
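Not knowing the asker's custom CMS, here is a minimal WSGI middleware sketch of the idea, assuming a Unix host (os.getloadavg is Unix-only). The bot token list, load threshold, and probability curve are all illustrative assumptions, not the asker's actual rules:

```python
import os
import random

# Substrings to match in User-Agent headers. Illustrative only; in practice
# you would maintain this list from your own server logs.
BOT_TOKENS = ("bingbot", "ahrefsbot", "googlebot")

class BotThrottle:
    """WSGI middleware: serve 503s to a load-dependent fraction of bot requests."""

    def __init__(self, app, max_load=4.0):
        self.app = app
        # 1-minute load average above which throttling kicks in
        # (an assumed threshold; tune to the server's core count).
        self.max_load = max_load

    def __call__(self, environ, start_response):
        user_agent = environ.get("HTTP_USER_AGENT", "").lower()
        is_bot = any(token in user_agent for token in BOT_TOKENS)
        load1, _, _ = os.getloadavg()  # server-wide, so it covers all sites

        if is_bot and load1 > self.max_load:
            # Reject a fraction of bot requests that grows with the overload;
            # user traffic is never throttled.
            reject_probability = min(1.0, (load1 - self.max_load) / self.max_load)
            if random.random() < reject_probability:
                start_response(
                    "503 Service Unavailable",
                    [("Retry-After", "120"), ("Content-Type", "text/plain")],
                )
                return [b"Server busy, please retry later.\n"]

        return self.app(environ, start_response)
```

A 503 with a Retry-After header is the signal search engines document for temporary unavailability; sustained 503s over days can lead to pages being dropped from the index, so this works best as a spike valve rather than a steady state.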
White Hat / Black Hat SEO | internetwerkNU
-
How to remove trailing slashes in URLs using .htaccess (Apache)?
I want my URLs to look like these:
http://www.domain.com/buy
http://www.domain.com/buy/shoes
http://www.domain.com/buy/shoes/red
Thanks in advance!
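A commonly used mod_rewrite pattern for this, assuming Apache with mod_rewrite enabled (treat it as a starting point and test it, since interactions with other rewrite rules can cause redirect loops):

```apache
RewriteEngine On
# Leave real directories alone; their canonical form keeps the trailing slash
RewriteCond %{REQUEST_FILENAME} !-d
# 301-redirect anything else ending in "/" to the same URL without it
RewriteRule ^(.*)/$ /$1 [R=301,L]
```

The 301 matters for SEO: it consolidates the slash and no-slash variants onto a single canonical URL instead of leaving duplicates in the index.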
White Hat / Black Hat SEO | esiow2013
-
Why do websites use different URLs for mobile and desktop?
Although Google and Bing have recommended serving desktop and mobile websites from the same URL, portals like Airbnb use different URLs for mobile and desktop users. Does anyone know why they do this, even though it is supposedly not good for SEO?
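One likely reason: separate mobile URLs are an officially supported configuration as long as the two versions reference each other, so the choice is more an engineering trade-off than an SEO sacrifice. A sketch of the bidirectional annotations Google documents for this setup (the example.com URLs are placeholders):

```html
<!-- On the desktop page, declare the mobile equivalent: -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.example.com/page">

<!-- On the mobile page, point back at the desktop version: -->
<link rel="canonical" href="https://www.example.com/page">
```

With these annotations in place, search engines treat the pair as one entity rather than duplicate content.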
White Hat / Black Hat SEO | razasaeed
-
So what's up with UpDowner.com?
I've noticed these guys in the link profiles of several sites I manage; they'll usually show up around 1,000-10,000 times in a backlink profile. From what I can tell, they index websites, build up keyword relationships, and then, when you search for something on their site (e.g. poker), they present a list of related sites with stats about them. The stats seem to be yanked straight from Alexa. The backlinks come from the fact that every time your site shows up in one of their search results, they embed a little iframe containing your site. This means that if your site's name/keywords are pretty broad, you could show up thousands or even tens of thousands of times as being linked from their pages that Google indexes. And Google indexes them, boy does it ever: at the height they had over 53 million pages indexed, which has apparently shrunk to around 25 million. I believe their strategy is to generate a crap-load of automated content in the hope of cashing in on obscure long tails. So my questions are: Are you seeing them in your backlinks too? Should I block their spider/referrers (see the sketch below)? What is their deal, man?
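For anyone who does decide to block them, a hedged Apache sketch (assumes mod_rewrite and mod_headers are enabled; the updowner.com referrer pattern comes from the question itself, and any bot user-agent string would need to be confirmed from your own server logs first):

```apache
RewriteEngine On
# Refuse (403) any request arriving with updowner.com in the Referer header
RewriteCond %{HTTP_REFERER} updowner\.com [NC]
RewriteRule .* - [F]

# Stop third-party pages from embedding your site in an iframe at all
Header always set X-Frame-Options "SAMEORIGIN"
```

The X-Frame-Options header addresses the iframe trick directly: even if their crawler slips through, browsers will refuse to render your pages inside their frames.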
White Hat / Black Hat SEO | icecarats
-
Redirected page doesn't rank on Google
We redirect our artists' official websites to copenhagenbeta.dk. Two of our artists (Nik & Jay and Burhan G) rank at the top of Google (first result on page 1), but one (Lukas Graham) doesn't rank at all. We use the same procedure for all artists. http://copenhagenbeta.dk/index.php?option=com_artistdetail&task=biography&type=overview&id=49 doesn't rank, but the artist's old page still does. Is the old page tricking Google into thinking it is the active page for this artist?
White Hat / Black Hat SEO | Morten_Hjort
-
Can't figure out how my competitor has so many links
I suspect something possibly black-hat is going on with the inbound links for www.pacificlifestylehomes.com ( http://www.opensiteexplorer.org/links?site=www.pacificlifestylehomes.com ), mainly because they have such a large volume of links (for my industry) using their exact target keyword. Can anyone help clear this up for me?
White Hat / Black Hat SEO | theChris
-
Are unusual Chinese links likely to be black hat?
Hello, I noticed that this one-page website, http://www.clearpixel.net/, ranks at #11 for "web design London", so I did some research using SEOmoz Pro. It turns out that all their links are from Chinese directory-style sites. Does this demonstrate black-hat SEO? If not, how do I go about getting links on Chinese sites with .gov URLs? Many thanks, Martin Hofschroer
White Hat / Black Hat SEO | MartinHof