How do you optimize a blog post for SEO after RankBrain?
-
Hi Guys
Just curious to hear what you do to rank blog posts at the top of Google, especially on-site, after the RankBrain update. Do you still use SEO tools for this, or are they outdated now? If so, which tools have you had success with?
Cheers
John
-
SEO tools are still invaluable. There are dozens of factors you have to get right, so SEO tools audit your site and keep you informed. The tech is more complicated than people think: an update breaks something, and so on. So SEO tools have their place. That said, marketing and SEO are merging, so that is another factor. The starting point is 10x content, then optimising and creating shareable content.
Hope that helps.
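As a rough illustration of the kind of on-site checks those tools automate, here's a minimal Python sketch (the thresholds are hypothetical; real audit tools check hundreds of factors):

```python
import re

def audit_page(html):
    """Tiny on-page audit in the spirit of SEO tools: flag a missing
    or overlong <title> and a missing meta description."""
    issues = []
    m = re.search(r"<title>(.*?)</title>", html, re.I | re.S)
    if not m:
        issues.append("missing <title>")
    elif len(m.group(1).strip()) > 60:
        issues.append("title longer than ~60 characters")
    if not re.search(r'<meta[^>]+name=["\']description["\']', html, re.I):
        issues.append("missing meta description")
    return issues

page = "<html><head><title>Green Apples</title></head><body></body></html>"
print(audit_page(page))
```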
-
Thanks Don. To give you more details: we want to rank for popular keyword phrases on a specific topic. So my question is whether you really need an SEO tool nowadays to check keyword density, how many times the phrase was used, and so on.
Isn't the key to focus on the user first, and less on optimizing against these SEO tools?
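For what it's worth, the phrase-count/density check itself is trivial to script. A minimal Python sketch (illustrative only; real tools also weight stemming, synonyms and placement):

```python
import re

def keyword_stats(text, phrase):
    """Count occurrences of a phrase and compute its density
    relative to the total word count of the text."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    # Count how many n-word windows match the phrase exactly.
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    density = (hits * n) / len(words) if words else 0.0
    return hits, round(density, 3)

sample = "Green apples are great. Buy green apples today, green apples!"
print(keyword_stats(sample, "green apples"))  # → (3, 0.6)
```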
-
It is too broad a question to answer in detail. RankBrain is about optimising the customer experience, so the best starting point is the quality of your content.
This is still a great WBF on content - it will cover any RankBrain concerns:
https://moz.com/blog/how-to-create-10x-content-whiteboard-friday
So, in short, yes, we still use plenty of tools, from GA to Moz, and many more.
Hope that helps.
Related Questions
-
Dramatic drop in SEO rankings after recovering from hacking
A few months ago my client's website was hacked, which created over 20,000 spammy links on the site. I dealt with removing the malware and got Google to remove the malware warning within a week of the hack. Then I started the long process of 301 redirects and disavowing links under Webmaster Tools over these few months. The hacking only caused a slight drop in rankings at the time. But as of last week the site has had a dramatic drop in rankings. When doing a keyword search I noticed the homepage doesn't even get listed on Google Maps, and in Google Search the inner pages, like the Contact Us page, show up instead of the homepage. Does anyone have insight into why the sudden drop is happening now and why the inner pages are ranking higher than the homepage?
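As an aside on the disavow step: the file Google Search Console accepts is plain text with `domain:` lines and `#` comments. A minimal Python sketch for collapsing a list of spammy backlink URLs into domain-level rules (the example URLs are hypothetical):

```python
def build_disavow_file(spammy_links):
    """Collapse spammy backlink URLs into domain-level disavow
    rules in the plain-text format Search Console accepts."""
    domains = set()
    for url in spammy_links:
        # Keep only the host part of each URL.
        host = url.split("//", 1)[-1].split("/", 1)[0]
        domains.add(host)
    lines = ["# Spam links created by the hack"]
    lines += sorted(f"domain:{d}" for d in domains)
    return "\n".join(lines)

links = [
    "http://spam-pills.example/cheap",
    "http://spam-pills.example/buy-now",
    "https://casino.example/page",
]
print(build_disavow_file(links))
```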
Algorithm Updates | | FPK0 -
SEO Myth-Busters -- Isn't there a "duplicate content" penalty by another name here?
Where is that guy with the mustache in the funny hat, and the geek, when you truly need them? So SEL (Search Engine Land) said recently that there's no such thing as a "duplicate content" penalty. http://searchengineland.com/myth-duplicate-content-penalty-259657 By the way, I'd love to get Rand or Eric or other Mozzers, aka TAGFEE'ers, to weigh in here if possible. The reason for this question is to double-check a possible "duplicate content"-type penalty (possibly by another name?) that might accrue in the following situation.
1 - Assume a domain has a 30 Domain Authority (per OSE).
2 - The site on the current domain has about 100 pages, all hand-coded. It does very well in SEO because we designed it to do so. The site is about 6 years old in its current incarnation, with a very simple e-commerce cart (again, basically hand-coded). I will not name the site for obvious reasons.
3 - Business is good. We're upgrading to a new CMS (hooray!). In doing so we are implementing categories and faceted search, with plans to keep the site to under 100 new "pages" using a combination of rel canonical and noindex. I will also not name the CMS for obvious reasons. In simple terms, as the site is built out and launched in the next 60-90 days, assuming we have 500 products and 100 categories, that yields at least 50,000 pages - and with other aspects of the faceted search, it could easily create 10X that many.
4 - In Screaming Frog tests of the DEV site, it is quite evident that there are many tens of thousands of unique URLs that are basically the textbook illustration of a duplicate content nightmare. Screaming Frog has also been known to crash while spidering, and we've discovered thousands of URLs from live sites using the same CMS. There is no question that spiders are somehow triggering some sort of infinite page generation - we can see that both on our DEV site and out in the wild (in Google's supplemental index).
5 - Since there is no "duplicate content penalty" and there never was: are there other risks here caused by infinite page generation, like burning up a theoretical "crawl budget", having the bots miss pages, or other negative consequences?
6 - Is it also possible that bumping a site that ranks well with 100 pages up to 10,000 pages or more might dilute link juice as a result of all this (honest but inadvertent) duplicate content? In other words, is inbound link juice and ranking power essentially divided by the number of pages on a site? Sure, it may be somewhat mediated by internal linking, but what are the actual big-dog issues here? So has SEL's "duplicate content myth" truly been myth-busted in this particular situation? Thanks a million!
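One way to quantify the facet explosion from a crawl export is to normalize URLs by stripping the facet parameters and counting the distinct canonical pages left. A minimal Python sketch (the parameter names are hypothetical):

```python
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

# Hypothetical facet/sort parameters that multiply URLs without
# changing the underlying page content.
FACET_PARAMS = {"color", "size", "sort", "page"}

def canonical_url(url):
    """Strip faceted-search parameters so crawl reports can group
    the thousands of parameter permutations under one canonical URL."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k not in FACET_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), ""))

urls = [
    "https://shop.example/apples?color=green&sort=price",
    "https://shop.example/apples?sort=name&page=3",
    "https://shop.example/apples",
]
print({canonical_url(u) for u in urls})  # all collapse to one page
```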
Algorithm Updates | | seo_plus0 -
Is it a good idea to 301 redirect one same-niche site to another for SEO benefit?
Hello friends, I have 2 Android niche sites. One runs on a technology dropped domain I caught a year ago; it has almost 400 domains linking to different parts of the site. The other I established from scratch, and both have been running since January 2015. Now I want to redirect the first site, which already has 400 links pointing to it, to the homepage of my second Android site. Is it a good idea to do so, and does it give any boost in terms of SEO?
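A side note on the mechanics: a page-to-page 301 map is generally considered to preserve more relevance than pointing every old URL at the new homepage. A minimal Python sketch of such a map (the domain and paths are hypothetical):

```python
NEW_DOMAIN = "https://second-android-site.example"  # hypothetical target

# Old paths that have close equivalents on the new site; everything
# else falls back to the homepage.
PAGE_MAP = {
    "/best-android-apps": "/best-android-apps",
    "/rooting-guide": "/guides/rooting",
}

def redirect_target(old_path):
    """301 target for a URL on the old domain: map to the matching
    page when one exists, otherwise the new site's homepage."""
    return NEW_DOMAIN + PAGE_MAP.get(old_path, "/")

print(redirect_target("/rooting-guide"))
print(redirect_target("/some-dead-page"))
```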
Algorithm Updates | | RizwanAkbar0 -
Numbers vs #'s For Blog Titles
For your blog post titles, is it "better" to use numerals or write numbers out? For example, "3 Things I Love About People Answering My Constant Questions" or "Three Things I Love About People Answering My Constant Questions"? I could see this being like attorney/lawyer or ecommerce/e-commerce, and therefore not a big deal. But I also thought you should avoid using #'s in your URLs. Any thoughts? Ruben
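On the #'s point: most CMS slug generators simply strip characters that aren't URL-safe, which is one practical reason to avoid them in titles you want mirrored in the URL. A minimal Python sketch of a typical slugifier:

```python
import re

def slugify(title):
    """Build a URL slug: lowercase, drop characters like '#' and
    apostrophes that aren't URL-safe, hyphenate the rest."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug).strip("-")
    return slug

# Numerals survive in the slug; '#' does not.
print(slugify("3 Things I Love About People Answering My Constant Questions"))
print(slugify("Numbers vs #'s"))  # → "numbers-vs-s"
```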
Algorithm Updates | | KempRugeLawGroup0 -
What are the most trusted SEO sites?
Other than SEOmoz, what sites can you trust for SEO? Is there some type of formula I can use to find out whether a site is trustworthy?
Algorithm Updates | | uofmiamiguy0 -
Google Algo Update in Queue: What constitutes over-optimization?
http://www.pcmag.com/article2/0,2817,2401732,00.asp According to this, Google is bringing the hammer down soon on another 10-20% of the search results. While we don't advocate keyword stuffing, exchanging links, or anything too risky, I am still concerned. Do we know if the example "perfectly optimized page" (http://www.seomoz.org/blog/perfecting-keyword-targeting-on-page-optimization) is now going to be penalty bait? Is this over-stuffing? Also, how might this affect ecommerce sites in particular?
Algorithm Updates | | iAnalyst.com2 -
This Guy Is Turning SEO Upside Down
Hi, Everything my competitor does goes against everything I have learned about SEO so far. For starters: he registered a brand-new domain and within the space of 4 months has a top ranking for one of the most competitive search terms on Google. He uses scraped content. The navigation is almost non-existent. His backlinks seem dodgy: 1-page sites with content that does not relate, and bunches of links to other websites too. And yet his site stats are as follows: Domain Authority: 72, MozRank: 4.63, MozTrust: 4.72, Linking Root Domains: 1725. On further investigation I discovered that he owns an SEO company and that they have in fact achieved a #1 rank in various niches such as life insurance, car insurance, mortgages, etc. On his SEO site he actually promises a #1 ranking in less than 4 months. The sample sites he lists there all achieved #1 over a 4-month period... of course he owns most of these domains and then just sells the leads. So, my question is: how on earth does he do it? Do you have any ideas? Zane
Algorithm Updates | | Springboks0 -
Singular vs plural SEO
Hi everyone, OK, I've been looking at the Google AdWords keyword tool and it's thrown some of my on-page SEO into question (everything said here is an example; I haven't used any real-life terms or figures). Let's say my page is about "Green Apples", and the keyword tool shows that the singular version "Green Apple" gets more searches. Should I optimize for the singular or the plural? Also, let's say my title tag for that page is "Green Apples | Apples Galore UK" - would Google/SEOmoz count that as optimisation for the singular "Green Apple", or do the search engines take the title literally and not differentiate between singulars and plurals? Thanks in advance everyone! Regards, Ash
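On the singular/plural point: search engines have long normalized simple word variants via stemming, so an exact-match title on one form generally still matches the other. A very rough Python sketch of the idea (illustrative only; real morphology handling is far more sophisticated):

```python
def simple_stem(word):
    """Crude English stemmer: fold common plural endings onto a
    shared stem. Real search engines use far richer morphology."""
    w = word.lower()
    if w.endswith("ies") and len(w) > 4:
        return w[:-3] + "y"          # berries -> berry
    if w.endswith("s") and not w.endswith("ss") and len(w) > 3:
        return w[:-1]                # apples -> apple
    return w

def same_stem(a, b):
    """True when two words reduce to the same stem."""
    return simple_stem(a) == simple_stem(b)

print(same_stem("apple", "apples"))   # → True
print(same_stem("Berry", "berries"))  # → True
```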
Algorithm Updates | | AshSEO20112