If you start a campaign in Moz, go to page optimization, enter a URL and keyword, and scroll to the bottom where it says "Content Suggestions" - does that basically do a TF-IDF analysis? I want to make sure I understand how that works. Thanks!
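For context, here's a minimal sketch of the kind of TF-IDF pass I'm asking about (scikit-learn, purely illustrative - I have no idea whether Moz actually does anything like this under the hood):

```python
# Minimal TF-IDF sketch. The docs list is a stand-in for body copy
# scraped from top-ranking pages - not necessarily how Moz does it.
from sklearn.feature_extraction.text import TfidfVectorizer

docs = [
    "page one body copy goes here",
    "page two body copy goes here",
]  # replace with real page text

vec = TfidfVectorizer(stop_words="english", ngram_range=(1, 2))
tfidf = vec.fit_transform(docs)

# average TF-IDF weight of each term across the corpus
scores = tfidf.mean(axis=0).A1
terms = vec.get_feature_names_out()
for term, score in sorted(zip(terms, scores), key=lambda t: -t[1])[:20]:
    print(f"{term}\t{score:.3f}")
```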
Posts made by brettmandoes
-
Is the Content Suggestions section under Page Optimization a TF-IDF Analysis?
-
How Can I Batch Upload URLs to get PA for many pages?
Howdy folks, I'm using advanced search operators to generate lists of long tail queries related to my niche, and I'd like to upload the batch of URLs I've gathered so I can see the PA for each one. This would help me determine which long tail queries are receiving the most love and links, and help inform my content strategy moving forward.
But I can't seem to find a way to do this. I went to check out the Moz API, but it's a little confusing. It says there's a free version, but then it looks like it's actually not free. Then I try to use it and it says I've gone over my limit, even though I haven't used it yet.
If anyone can help me with this, I'd really appreciate it. If you're familiar with SEMrush, they have a batch analysis tool that works well, but I ideally want to upload these URLs to Moz because it's better for this kind of research. Thanks!
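For reference, here's roughly the kind of request I've been trying to put together - a rough sketch assuming the Links API's url_metrics endpoint, so the endpoint, auth, and field names may well be where I'm going wrong:

```python
# Rough sketch of a batch PA lookup against Moz's Links API.
# The endpoint, auth style, and "page_authority" field are assumptions -
# verify against the current API documentation.
import requests

ACCESS_ID = "your-access-id"      # from your Moz account
SECRET_KEY = "your-secret-key"

urls = [
    "https://example.com/long-tail-page-1",
    "https://example.com/long-tail-page-2",
]

resp = requests.post(
    "https://lsapi.seomoz.com/v2/url_metrics",
    auth=(ACCESS_ID, SECRET_KEY),
    json={"targets": urls},
    timeout=30,
)
resp.raise_for_status()
for row in resp.json().get("results", []):
    print(row.get("page"), row.get("page_authority"))
```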
-
RE: Optimal URL Structure for a Multi-City Directory
I think you should consider how your users are interacting with your website and how they search for your services/products/locations and follow that. For example, Yelp is focused on local reviews. People will filter first to their city, then the category naturally. You would never filter down to restaurants first, because if you're in Huntington Beach, CA you really don't care what's in Portland, OR. If location is secondary to your product, then it makes sense to start with the category. For example, let's say you sell ATVs and other off-road vehicles and gear, but some showrooms only have ATVs while others also carry dirt bikes. Customers who are looking for a dirt bike care more about reaching a showroom with dirt bikes, so that category structure would be preferable.
Note that I'm assuming in both of the above examples that your navigation follows the structure of your website for usability purposes. In terms of structure, one way is not inherently better than the other from a ranking/algorithm perspective, but if your structure is confusing it can be detrimental to SEO. For example, outreach is a lot harder if you have a garbage navigation that contributes to poor user experience on your website. Any piece of Google's algorithm that measures user satisfaction with your website (RankBrain, pogo-sticking, etc.) will affect you either directly or indirectly depending on how user friendly your website is.
One last thing: in both instances you have the geography in the URL, so if you're hoping for a boost for local phrases from an exact match URL I think you're already tapping that. EMDs are nowhere near as effective as they were in years past, so I wouldn't make that my focus.
-
RE: Differentiating Franchise Location Names to better optimize locations
Hi Jeff, I think I can help you with this, but to clarify, it looks like you have three separate questions:
1. What is best practice for naming different locations to optimize for local SEO?
2. What is the best URL structure to optimize for local SEO?
3. Should geo-specific terms be used in blogs?
Be sure to let me know if I'm missing the mark. I'm also going to go heavy on industry jargon and assume you know what it means, so feel free to ask questions if I go over your head at any point.
1. For local SEO, it's important to start with a good foundation. This means you have citations claimed for each location with consistent NAP information on your GMB profile, your listings, and the landing page on your website for that location. So if your name includes the geo on the website, it should also include the geo on your GMB profile and citations. It's preferable to use the specific city each location is in. For example, if you're in Flower Mound, TX, be sure to use Flower Mound, not Dallas. Some local SEOs get tripped up by targeting the metro area they're in, and that can tank results. If some of your locations are in the same city, dividing them up somehow as North/South, East/West, etc. is fine. Google typically picks one or both in those circumstances to display in search.
2. For URL structure, using subpages the way you have laid out is fine. For enterprise local SEO my agency uses a proprietary, scalable CMS to build unique, local websites that rank very well, so I'm more familiar with that structure, but one of the tricks we use is to include a geo variable in the URL, which helps rank for some terms like "glass repair dallas tx", because we can get picked up on the exact match. Every little bit helps.
3. For blogs, I would recommend you completely ignore the geo unless your blog is very unique and specific to the location. You should really only target the location when it's a page that you're trying to rank for local queries and you typically don't have that in a blog. For example, a blog about "what to expect in a hundred year old house" will typically not rank for keywords that trigger the local algorithm, so there's no reason to add the geo. It just gets in the way of the content, and inferior content doesn't rank well. Now a blog like "what to plant in your [location] fall garden" just may have some localization to it, because what you plant in the fall in Des Moines is different than Atlanta. But I find these cases to be few and far between.
Hope that helps, let me know if you have questions.
-
RE: Hi. One of our competitors is ranking ahead of us on Google. Our site has a much stronger authority and much more quality links than this competitor. Would anyone have any explanations for this? Thanks
Hi barryhq, Google has in the past called the top three components of their algorithm Content, Links, and RankBrain, without naming a particular order. It sounds like you've worked hard on links, so good job! Your problem is therefore potentially related to content (ignore RankBrain, you can't really optimize for it).
So let's talk about content. When you do your keyword research, try to focus on the searcher's intent. What tasks are they trying to accomplish that Google is surfacing but that aren't addressed on your webpage? You can learn this by studying other top results, related searches, people also ask, etc. For example, if one of the keywords you have is "best running shoes" then you know people are doing comparison shopping, and including content that compares the top running shoes on your page will help you rank for that. And it's entirely possible that you'll discover in this process that your website is not a suitable match for the keywords you've targeted. I've seen this happen with clients who pick a phrase without doing the research required to make an informed decision and end up targeting something they can never rank for.
It's also possible that you have technical SEO issues, like canonicalization or poor internal link structure or cannibalization that's making it harder to rank, but assuming that your technical SEO game is on point I would recommend focusing on content.
-
RE: Footer no follow links
Hi seoman, it's definitely outdated and was never accurate to begin with. The "nofollow" attribute was always designed to be applied to external links, and modern advice is to never apply nofollow to your own internal links. If you're concerned about a page like your homepage passing authority into your footer links instead of more important pages, you should know that Google tags the links on your site so that they're weighted differently, i.e. a link in your body content is worth more than a link in your footer, image links don't pass as much authority, etc.
In short, I don't think you're going to move the needle by altering your footer links to nofollow.
-
RE: 301 Redirect and Canonical link tag pointing in opposite directions!
Canonicals are not absolute directives, so Google will eventually sort out which of the two signals is more important. My guess is that the redirect takes precedence, because if they displayed the canonical URL to a user in search, they would be displaying a URL that sends users through a redirect - a poor experience, and one they take pains to avoid.
When there are confusing signals like this out there, Google will do its best to sort out these issues and John Mueller has repeatedly stated "we do a pretty good job" at figuring it out, but he almost always adds a disclaimer that it's "better" to have a less confusing structure.
In plain English, it's not a catastrophic error, but it's something you need to clean up as part of your optimization efforts.
-
RE: Proper URL Structure. Feedback on Vendors Recommendation
Hi there, I've got a few thoughts on this, but I want to make sure I answer your specific question first, then cover the lead-up and follow-up questions that are either on your mind already or that you'll land on eventually.
There are specific instances where you may favor one URL structure over the other. For example, our landing pages are similar to your current structure, and the rest of the website is closer to your vendor's proposed structure. Folders are a great way to categorize your content and help both Google and users navigate and understand it.
However, you do not want to lose the hyphens. Dropping them makes the URL difficult for users to read in search when they're deciding on a page to view, and difficult for Google to parse. Let's say your URL has an acronym in it - maybe you're writing about basketball and NBA is in the URL. So your URL becomes website.com/sports/hownbaistakingcharge or website.com/sports/basketballnbakobe. Is either of those readable? You have two stakeholders, Google and users, and your URL structure should support both. Compare the above to website.com/sports/how-nba-is-taking-charge or /basketball-nba-kobe. That's much better for Google, which can clearly read the separate words and make sense of them, and much better for users who are trying to quickly scan the URL in search results. I would push back on the vendor that the hyphenation is necessary.
I've listed a few other questions below that I would have for my vendor and team if we were proposing a major restructuring of the site's content.
A new URL structure means a few other things will likely change.
1. Have you thought about creating a redirect map for every page that is going to move?
2. How will the new URL structure interact with breadcrumbs on your site?
3. If you move to folders, will you need to create head pages? E.g. website.com/sports/how-nba-is-taking-charge sits under a main "sports" page that maybe doesn't exist yet. You WILL have users who attempt to reach the head page whether it exists or not, and they'll be sent to a 404 if it doesn't.
4. Will changing your URL structure alter your main and sub navigation elements on the site? (In almost every instance, it should.)
And then my final question, knowing how much work it is to take a healthy site and improve it by changing the URL structure alone: what is the expected value? Why are we doing this? Sometimes there's a legitimate reason and sometimes it's pure vanity. The SEO upside to a major restructuring like this isn't normally enormous, but the effort involved can be titanic. So be sure your expectations are realistic going into it and get the details fleshed out as much as possible ahead of time.
Best of luck, let me know if I can answer any more questions.
-
RE: Related Keywords: How many separate pages?
Instead of trying to group pages by keyword, try thinking about searcher intent and task accomplishment. Can you write one comprehensive page that addresses the searcher's needs and includes all the keywords? Or does it make more sense to break into a couple different areas, such as a page that's specific to a plaintiff and a page specific to a defendant?
Try this: create a Venn diagram of the different audiences that may visit the section of the site you're contemplating building out, group the keywords you suspect each audience would use, and see where the overlap is. If there are areas that are completely blank, you don't need a page for that specific audience or task. Doing this will help you determine which pages need to cover which keywords for the right audience. For example, for an optometrist there are probably searches involving "contacts", "glasses", and "lasik". You might be able to address all three on the same page, but long text about the benefits of LASIK is probably a horrible experience for someone who is just looking for a specific eyeglass style. There's very little overlap there because the audiences and intent may be different, so they get different pages, and that shows up in the Venn diagram.
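If it helps to make the overlap check concrete, here's the same idea as a tiny script - the audiences and keyword lists are made up for illustration:

```python
# Venn-diagram-style overlap check: keywords each audience might use.
# The lists are hypothetical - substitute your own keyword research.
from itertools import combinations

audiences = {
    "contacts": {"buy contacts online", "daily contacts", "order contacts"},
    "glasses": {"eyeglass styles", "buy glasses online", "order contacts"},
    "lasik": {"lasik cost", "is lasik safe", "lasik recovery"},
}

for a, b in combinations(audiences, 2):
    overlap = audiences[a] & audiences[b]
    print(f"{a} & {b}: {overlap or 'no overlap -> separate pages'}")
```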
Hope this helps!
-
How to Diagnose "Crawled - Currently Not Indexed" in Google Search Console
The new Google Search Console gives a ton of information about which pages were excluded and why, but one that I'm struggling with is "crawled - currently not indexed". I have some clients that have fallen into this pit, and I've identified one reason why it's occurring on some of them - they have multiple websites covering the same information (local businesses) - but on others I'm completely flummoxed.
Does anyone have any experience figuring this one out?
-
RE: Quick Fix to "Duplicate page without canonical tag"?
The simplest solution would be to mark every page in your test environment "noindex". This is normally standard operating procedure anyway because most people don't want customers stumbling across the wrong URL in search by mistake and seeing a buggy page that isn't supposed to be "live" for customers.
Updating your robots.txt file would tell Google not to crawl the page, but if they've already crawled it and added it to their index it just means that they will retain the last crawled version of the page and will not crawl it in the future. You have to direct Google to "noindex" the pages. It will take some time as Google refreshes the crawl of each page, but eventually you'll see those errors drop off as Google removes those pages from their index. If I were consulting a client I would tell them to make the change and check back in two or three months.
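If you want to verify the change took effect, here's a quick sketch that checks whether each test URL actually serves noindex, via either a meta robots tag or an X-Robots-Tag header - the URLs are placeholders:

```python
# Quick (crude) noindex audit for a handful of staging URLs.
import re
import requests

urls = [
    "https://test.example.com/page-1",
    "https://test.example.com/page-2",
]

for url in urls:
    r = requests.get(url, timeout=10)
    # crude pattern: assumes name="robots" appears before the noindex value
    meta = re.search(r'<meta[^>]+name=["\']robots["\'][^>]*noindex', r.text, re.I)
    header = "noindex" in r.headers.get("X-Robots-Tag", "").lower()
    print(url, "OK (noindex)" if meta or header else "MISSING noindex")
```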
Hope this helps!
-
RE: PDF web traffic hitting our site
Based on this I don't think you have anything to worry about. It doesn't appear to be an attack like the one you described in your original post. An actual attack on your website would have much higher volume. The worst this could possibly be is spam, which is mainly just annoying.
Easy solution: you don't want to filter this traffic out of GA entirely because it may be useful at some point. So create another view in GA and name it "unfiltered". This view will have no filters, and you can see all traffic in its raw glory. Name your main view something like "master" or "the one view to view them all" or whatever you want, and set filters there to remove that traffic from view.
Personally it looks more to me like these are old PDFs that other websites are linking to, which is what your hosting provider has also said. Your best move here is actually to set up redirects to relevant pages to recapture some of those links that are probably ending in 404s and get some link equity to important pages.
-
RE: How do you use Moz to research related topics?
Hey Dave, thanks for the response! I should have updated my question earlier today, but I was able to find a better way to do this type of research using Moz Pro's page optimization tab. The section in there that was previously labeled "related topics" was renamed "content suggestions", but it worked great. I was able to double content length and put in some genuinely useful information (I hope) that should help it rank better (I hope).
It's a darn sight faster than what I was doing before, which was manually copy/pasting all body copy from the top ten sites for high volume keywords into an ngram analyzer and looking for patterns. The results were actually pretty similar, but good gravy, was it boring.
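For anyone who still wants the manual route, the n-gram step itself is easy to script - a rough sketch of what I was doing by hand:

```python
# Count two-word phrases across scraped body copy - a rough stand-in
# for the n-gram analyzer step described above.
import re
from collections import Counter

corpus = [
    "body copy pasted from result one goes here",
    "body copy pasted from result two goes here",
]  # paste in text from the top ten results

def ngrams(text, n=2):
    words = re.findall(r"[a-z']+", text.lower())
    return (" ".join(words[i:i + n]) for i in range(len(words) - n + 1))

counts = Counter(g for page in corpus for g in ngrams(page))
print(counts.most_common(25))
```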
-
RE: Important updates on Google Analytics Data Retention and the General Data Protection Regulation (GDPR)
Hey guys, we're going through this at my agency and I can break down a couple of things.
1. There is a data retention setting in Google Analytics. On May 25th (this Friday) that's going to change from indefinite to 26 months by default. This will affect past data and reports as outlined by Google. You should expect that data to go away if you do not change those settings.
2. Answering questions regarding GDPR or providing advice on the topic to clients is tantamount to providing legal advice, which we cannot do. For us, we are consulting with our lawyers, and recommending that our clients seek legal advice as well. We are opting not to change any settings or accept any addendums or agreements without consulting a lawyer first or without specific direction from the client to make a change on the client's behalf. Accepting new terms or policies without first consulting the client essentially means you're liable if they make a mistake, because you accepted it, not them.
3. Yes, the European legislation is affecting everyone, some more directly than others. Even if your only client is an ice cream shop in Wisconsin, it still affects you, because big players like Google are pushing the legal burden of compliance off of themselves and onto their users. For example, Google gives out some warnings, puts up some banners, and changes their default settings, and now they're compliant - but they leave some compliance decisions as opt-in or opt-out choices for their end users. And Google won't give much advice on this because it's tantamount to providing legal advice.
Hope that helps. It's not a fun topic.
-
How do you use Moz to research related topics?
Like most of the folks here I'm a pretty big fan of the content that comes out through Whiteboard Fridays, and I try to apply the things I learn, but one of the WBF videos that I'm following along with does not do a stellar job of detailing execution using Moz KW Explorer.
https://moz.com/blog/related-topics-in-seo-whiteboard-friday
Now granted, this came out in 2016, but I still feel the core principle and strategy results in a higher quality piece of content and is still relevant to discovering and understanding searcher task completion requirements, and drafting content that fulfills those requirements. Towards the end Rand sort of mentions that you'll be able to do this with KW explorer, but I'm not really seeing the functionality.
The steps I followed were to enter the keyword in KW Explorer, go to keyword suggestions, and select "based on closely related topics" - but I received no suggestions; it came up blank. I then selected "based on broadly related topics" and the same thing happened. I tried this out with the keyword r22, keeping it very broad to start, but that didn't seem to work.
So what do you all do to perform this sort of research within Moz? Or do you even feel it's relevant in today's RankBrain driven world?
-
RE: Javascript and SEO
Thanks for the response Nikki, I'll try to be as thoughtful about this as I can, but I am somewhat skeptical that your problem is JavaScript. It may be a contributing factor, but in general the concern most SEOs have with JS is that Google can't crawl it, so any content rendered by JS is effectively invisible and impossible to rank - and yeah, that's a real risk. The fact that you're on page 1 right now for a competitive term, though, means that isn't likely your issue. And you're on a WordPress site, so most of the JS issues aren't going to be a problem for you, unless you're using an Angular integrated theme or something.
That doesn't mean there aren't any technical issues holding you back. I ran your page through a couple of tools and I'm finding that the page is very heavy, slow to load, and has a very low performance score in terms of page load times - and part of that is how JS heavy your webpage is. I would recommend running your page through any of the free tools out there. The Lighthouse extension for Chrome isn't great, but it was developed by Google, so it gives you an idea of how they might be measuring your page. Your page scored a performance rating of 4 out of 100, which again is a big indication that you have speed problems related to your JS that could be tied to your rankings.
I think you're on the right track to investigate technical performance issues, but the easiest way to track this down is to start by making sure all of your content is being indexed. From there you should be able to see if there's any JS that's blocking content from rendering for Googlebot. If Google is crawling and indexing the content, your JS is okay from a visibility perspective and you can focus on the performance aspect.
If Google is displaying the page completely with fetch and render, you're probably okay, but try going into Chrome Dev Tools and disabling the cache, then reloading the page. Watch for any errors and try running Lighthouse with that open. You'll probably be able to catch errors that way.
Good luck!
-
RE: What Moz reports would you suggest running when pitching SEO services to a client?
I just thought of one report that's quick to run and always good to share.
1. Go into keyword explorer and change the drop down to root domain
2. Enter the client's domain and hit run
3. A new screen should pop up with the option to add more domains. Add one or two of his competitors in the field below his URL and hit the "compare sites" button
4. Take a screenshot of the resulting graph
What's nice about this is that you're providing something visual you can talk to without overwhelming them with data. You can talk about the problem you see in this graph and how you can address it.
Hope that helps!
-
RE: How much does doing google search queries dilute your search console data
Hi Fishe, thanks for sharing this. I had never really thought about filtering IP traffic out of Search Console data. I typically work with websites with high enough volume that the filtering wouldn't likely impact my work, but it's good to know for my newer clients who may not have much brand presence and are spending a lot of time googling themselves out of anxiety. I can definitely see a use case for that scenario. Good work!
-
RE: Javascript and SEO
Hey Nikki, I think your specific question is more centered on "Will having a website that only works fully with JavaScript enabled be harmful to SEO?"
First, there's a lot of mythology about this in SEO land. There are outdated resources out there, and it looks like you've read some of them. Google's ability to crawl and understand JS and the content behind it has come a very long way, and the tools you may use as proxies to understand Google's capabilities aren't so effective.
But before I move on, I want to verify something with you. When you're talking about JavaScript, are you specifically looking for answers regarding a website like Wix, built with AJAX? Because that can change my answer significantly.
-
RE: What Moz reports would you suggest running when pitching SEO services to a client?
Hi Rupert, this is kind of a tricky question. The tools and reports provided by Moz are really meant to give SEOs the knowledge to do their jobs, not to serve as a sales tool. This means the information you get from Keyword Explorer will be more useful during execution, and will be confusing to a prospective client whose familiarity with SEO is limited.
I would encourage you to use the tools that Moz provides to create a preliminary strategy, and only show the back end as a supplement. It's not important for them to walk away with a report that shows them a bunch of metrics; it's important for them to walk away feeling like you're the person who can do the job.
Ultimately, as a consultant you're not just a mechanic turning wrenches. They have human problems that you need to address while presenting SEO services (a mechanical problem) to a prospective client. If they're the business owner, for example, you can ask directly: what do you think SEO will do for your business? What kind of timeframe are you expecting for results? Is there any pressure to improve rankings in the near term? Asking these types of questions can often get to the root of another issue they're having, e.g. revenue is down the past quarter and they think SEO is a quick fix because they don't understand it takes time.
I've sold more SEO services and avoided more headaches by addressing these types of very human problems than I ever have by showing someone a report. In my above example (which was a real client), I addressed the client's need by redirecting them to another form of advertising that could generate quicker results (SEM) and still got the SEO contract. I used Moz to show how I gather intelligence on keywords and showcase my expertise. But I avoided going in depth or handing them anything during the initial consult.
There's an art to this that can't really be fleshed out in a forum, but there are a ton of books and courses out there on consulting. I can recommend Flawless Consulting by Peter Block. Fantastic book that will help you avoid bad deals and close more often just by being authentic and using your expertise appropriately.
I know this answers your question very indirectly, but I hope it's more useful. I didn't want to just tell you "spit out these reports and you'll be fine" because I've gone that route before and that business never stayed with me.
Good luck in your next pitch!
-
RE: Client wants to repackage in-depth content as PowerPoint files and embed on site. SEO implications?
Hi there, I think your specific question is, will embedding power points into the website hurt their site or help it? I'm going to try to break this down for you.
If the slides are indexable, one of two things will happen:
1. Rankings go up for new, related terms that you either weren't ranking for before or were ranking poorly for
2. The PowerPoint cannibalizes rankings for the other pages that were previously built out
I'm going to assume you know how to track for this since it's pretty straightforward.
If the slides are not indexable then there should be no reason it would negatively impact rankings. It's essentially invisible to Google.
SlideShare slides can be crawled and indexed. I would expect that to be the default behavior unless you find documentation that shows otherwise. If you don't want it indexed, embed on a page and mark that page noindex.
Let me know if that helps!
-
RE: New link explorer
Roman is spot on. Links are still a big part of the game, but there are specific instances where you can rank with just content. Local SEO is a prime example. Citations are the only links I ever bother building for local SEO, because I consistently rank just by focusing on content, technical/on-page SEO, and citations.
For my clients with a national presence, outreach is necessary. Think healthcare and finance - smaller guys are competing with massive banks who have a ton of authority and history. Content won't win in that space alone (I wish it did). There are some real easy outreach wins early on, but once you've used up the easy stuff (like getting into directories or fixing broken backlinks) then you have to do a content inventory, find your best stuff, and promote it.
OR
You have to create amazing content, then promote that and earn some backlinks. Like Roman says, the term "great content" is overused and oversold. Most people who show me their great content are often showing off mediocre content.
Best of luck!
-
RE: Bounce Rate Extremely Low
From going through the page's html, it looks like you've implemented your Google Analytics code through Google Tag Manager, which is good. What's not so great is that we can't see the code in your container to help you troubleshoot - all we can see is the container.
Simo Ahava is an analytics wizard, and he put together a fantastic guide to help troubleshoot this sort of thing: https://www.simoahava.com/analytics/troubleshoot-google-analytics-9-step-program/
Follow that step-by-step and see if it helps you solve your problem. Best of luck!
-
RE: Local SEO - 2 Locations
Hey there, we do this **A LOT** at my agency (I'm currently managing three enterprise local SEO clients) so I think I can help you.
1. Your citations are composed of your NAP+W information, so the best situation is to make sure that it's as unique as possible between the two separate locations.
a. Name - this will probably be the same unless your client has a naming convention like "FroYo Blast San Diego" and "FroYo Blast Sacramento".
b. Address - this will be unique
c. Phone - This **can be** unique and should be. I know some clients send everything through a call center, and that's suboptimal.
d. Website - create location specific landing pages and link to those.
If you follow this then the only non-unique item in there is potentially the name. What we've found across something like 350 websites/locations is that the more unique this information is, the better rankings tend to be.
2. For local SEO we've never needed to actively build links outside of citations, and we rank page 1, often position 1, for highly competitive queries. Relevant content is more important, so make link building a lower priority. You may need to work on backlinks if you are in a very competitive space, but small local businesses generally have a hard time getting backlinks, which is probably one reason why it's not as important a signal. If it were, then the only HVAC businesses showing up in search would be the ones paying SEOs for link building services, which I think Google realized.
3. Put the locations into your footer and wrap those in schema (a minimal schema sketch follows this list). You could do the header too, I suppose, but from user testing we've found it's better to keep the header area decluttered. Start putting too many phone numbers up top and people get confused.
4. We build a unique website for each location. When you can't do that, your best bet is to build landing pages that are optimized for the location. On one of our programs we have about 1,500 of those landing pages, and we rank on page 1 for a little over half of our 18,000+ targeted keywords with that strategy. It's harder if you're not physically in that location, but since you have a physical location, that makes it easier. Make sure you're mentioning the target location in your metadata, like title and h1 tags where appropriate. That helps!
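Here's the schema sketch I mentioned in point 3 - a minimal LocalBusiness JSON-LD block, generated here with Python for clarity; every business detail below is a made-up placeholder:

```python
# Minimal LocalBusiness JSON-LD for a footer. All details are hypothetical
# placeholders - swap in the real NAP data for each location.
import json

location = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "FroYo Blast San Diego",
    "telephone": "+1-619-555-0100",
    "url": "https://example.com/locations/san-diego",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Example St",
        "addressLocality": "San Diego",
        "addressRegion": "CA",
        "postalCode": "92101",
    },
}

print(f'<script type="application/ld+json">{json.dumps(location, indent=2)}</script>')
```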
Best of luck!
-
RE: SEO for Videos and Infographics
That will help if your video is about other videos, or you made an infographic about other infographics, or if you're specifically optimizing for a term like "housekeeping video" or "housekeeping infographic".
Optimizing for either of these two things can be pretty tricky, and each has its own guides. I'd recommend breaking them into two topics. For example, video optimization is extremely nuanced. Are you trying to rank higher in the organic listings because you noticed that videos are ranking highly, or are you trying to move up in a carousel? Or are you trying to use the video to earn backlinks and social shares? Are you trying to promote an already created video, or did you create the video with a specific goal and strategy in mind?
Answering these questions really affects a ton of your video SEO strategy, right down to where you host it and how you promote it. I wrote an in depth article on this earlier this year that I think will help you and answers a few questions you may have: https://www.linkedin.com/pulse/video-seo-2018-beyond-brett-elliott/
For infographics, I don't have a great resource handy, so I'll have to defer to someone else who has a bit more experience. Mostly I've seen or been involved with using infographics as link-bait and that's about the extent of it for me.
-
RE: Keywords and content query
Hello, I think your question can be broken down like this:
1. Is it a problem if I can't add text/content?
2. Is there a certain word count I should aim for?
3. Is there a specific number of keywords on page I should aim for?
So I'll try to answer this as best I can, and if you have more questions, just fire back.
1. This could be a problem if the content on the page is something you'll need to rank well. It seems counterintuitive to many because "content is king" has been parroted as SEO wisdom for years, but there are times when content is NOT the primary driver of rankings, and the secret is in the intent of the searcher. Think about it like this: if you're searching "best ac repair service near me", you probably just want a short list of the best HVAC companies near you. A 3,000 word article is less helpful here than a short list of the best, and indeed, when I run this very search, the top 5 results are all lists. The number one result has less than 600 words, but all of them have user generated content in the form of reviews. Another example where content may not matter: "buy golf balls". You're going to get a lot of ecommerce listing style pages that are short on content but allow people to easily buy golf balls. I know this because I just ran this search yesterday to help another Mozzer. But if your page is meant to be informative, you may need the ability to modify, add, or remove content, so this could be a problem. Try to match the searcher's intent with the page, and that will help you determine if this is truly an issue.
2. As we just demonstrated in example one, no specific word count is recommended for all queries. However, there was a study performed in September 2016 by Backlinko that analyzed about a million queries, and one of their findings was this:
"In fact, the average word count of a Google first page result is 1,890 words."
This would indicate that longer content is better, but as I discovered early in my career, if you write content just to have the length, it will flop. We tried it at scale across about 120 websites, writing longer content purely for length. It performed exactly the same as the roughly 500-word content it replaced. So don't do that.
3. This one is short and easy. The answer is no. The metric you're referring to is keyword density, and it was short-lived and debunked back when Matt Cutts was still at Google. The myth lives on, but it's a garbage metric that doesn't correlate to success. Avoid using or even referencing it.
Hope that helps, let me know if you need more info.
-
RE: Where to buy high quality backlinks in 2018?
Easy way to "buy" backlinks, in style, without running afoul of webmaster guidelines.
Step 1: curate a list of all the sites you want a backlink from
Step 2: get the emails of the webmasters there - lots of tools and methods for this, both automated and manual
Step 3: use customer match on Google or Custom Audiences on Facebook to upload your email list of webmasters you picked
Step 4: use targeted ads to get your content in front of this audience
This really only works if you don't have garbage content. But basically you advertise this content only to people who actually have a website and would consider linking to you, and voila. You've basically "bought" backlinks.
-
RE: Whats the best way to build good baclklinks efficiently?
If you have a local website, then posting to directories is foundational to local SEO, and for that I recommend Moz's own service: https://moz.com/products/local
It's effective and cost-effective. Just make sure you have Google My Business set up first or it won't work out real well for you.
You'll find several businesses besides Moz also offer this service as a core part of doing SEO. Now if you're not a local business, it's still not bad practice to post to directories, especially if they are well established authorities within your niche. That's just good outreach.
-
RE: Keyword Stuffing
Hi Edwyn, can you share some more details? If you're not comfortable with a link to the page, it would at least be helpful to know more, like how often the keyword is mentioned on the page.
Sometimes, that keyword count metric is just off. If you have a golfing ecommerce site for example, you probably have a ton of mentions for the terms "golf ball" or "golf bag", especially on category level pages, and that's beneficial to your business and the user experience. In a situation like that, the keyword count might be very high but it's not necessarily bad for SEO.
Now, if you've written a paragraph about golf balls on that same page, and you mention "golf balls" 17 times, then trim it back. If you want to know how often you should mention a particular keyword, here's an easy exercise.
1. Pick your target keyword and google it.
2. Open the top 5 sites
3. Use the finder to see how many times those top 5 sites mention the keyword on their page
Using the golf ball example, I just did this in about 2 minutes and came up with this:
position 1: 14 mentions
position 2: 131 mentions
position 3: 3 mentions
position 4: 74 mentions
position 5: 64 mentions
As you can see, these sites have many, many mentions of golf balls on their top pages, include some big names like Dick's Sporting Goods and Amazon, and rank perfectly fine. A keyword count metric would probably warn them that they mention the target keyword too many times, but that doesn't appear to be a problem. So go ahead and try this with your target keyword. If you're coming in below the top results on Google, then I wouldn't worry about keyword stuffing, as long as your design legitimately uses the target keyword, such as in a product name or description.
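If you'd rather not count by hand, here's a rough script version of the same exercise - the URLs are placeholders for whatever ranks top 5 for your keyword, and the tag stripping is crude:

```python
# Automates the count-the-mentions exercise above.
import re
import requests

keyword = "golf balls"
urls = [
    "https://example.com/top-result-1",
    "https://example.com/top-result-2",
]

for url in urls:
    html = requests.get(url, timeout=10).text
    text = re.sub(r"<[^>]+>", " ", html).lower()  # crude tag strip
    print(url, text.count(keyword.lower()))
```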
Hope that helps!
-
RE: SERP always between 1-3, Should I remarket?
This is more an SEM than an SEO question. I think I can help but I have a couple of questions first.
1. Do you have multiple campaigns running for multiple domains/accounts?
2. Which ad platform are you remarketing on?
3. Are you familiar with Google Tag Manager and would you be interested in setting up cross domain tracking?
4. What's your level of familiarity with remarketing and PPC in general?
-
RE: Whats the best way to build good baclklinks efficiently?
There are some very good articles, courses, and recommended strategies/tactics out there that answer this question more elegantly and thoroughly than I think most people can provide in a forum response.
A course with helpful backlink building tactics: https://ahrefs.com/blogging-course
An awesome article from the wizard: https://moz.com/blog/link-building-tactics-to-acquire-50-links
And my personal recommendation: focus on the content first. Ask yourself honestly: is this content even worth linking to? Would I link to this content if I owned another website? Who would even be willing to link to my website? Answering these questions might help point you in the right direction. For example, if you're a foodie blogger, there are a ton of groups out there on Facebook and other social media sites where you can network with other foodie bloggers, and they often will link to each other's recipes and culinary crusades. On the other hand, a lawn care business is better off focusing on its local chamber of commerce and guest blogging for B2B golf course sites or garden blogs. Some niches are easier than others, but make sure you're fishing at the right hole.
Best of luck!
-
RE: I have number one positions organically, should I run an additional PPC campaign?
This is a tricky question to answer because you're asking about the overall strategy and tactics for running a successful PPC campaign and integrating that with your SEO strategy. I'll share with you some personal results and tactics we've used, and some learnings, and hopefully that will help answer your question.
First, some context. We do a lot of SEM and SEO for local services, so I'll be using HVAC as my go to reference point.
1. Brand keywords are super cheap for us. We've always found this to be a good investment as it doesn't normally cost much and if we don't buy those keywords, some of our competitors do.
2. Buying other brands - just as you want to keep competition out of your space, you can invade theirs. This also hasn't ever cost us very much money. The conversion rate from these is much lower, but every one you snag is a customer won.
3. CPC varies greatly. Especially for local keywords. Different regions have different competition levels it seems. I've seen some guys with a total Cost per Lead of $45. I've also seen it skyrocket to over $100 CPL in a different region.
4. Don't confuse SEO and SEM. There's a big difference: SEM/PPC refers to paid ads, while SEO is the organic listings. They are two separate disciplines that require two separate skill sets, so you shouldn't have any SEOs trying to sell you SEM unless their agency does both.
5. Bid for profitability, not revenue, and you can keep your costs down so you're bringing in leads at an acceptable cost. As long as you've set up a tracking and attribution model that works you shouldn't have any problems making adjustments until your campaign is running smoothly. You can axe the keywords that are costing too much and improve the ones that are bringing solid leads.
Hope that helps. I do recommend doing SEM on top of your SEO. If you're having trouble running a profitable campaign, find an expert you trust and commit a stable budget for a year and see what they can do.
-
RE: Marking up an iframe with reviews schema. Possible? Ethical?
Thank you, I have advised my coworkers and our client that we will not be implementing this solution as it stands on the website. I normally bring a lot of hard data with me when I need to fight back against something like this, but was a bit short this time around. I just may frame your description for assessing risk and hang it up over my desk. Cheers!
-
Marking up an iframe with reviews schema. Possible? Ethical?
Hey there fellow Mozzers! I work with a broad variety of clients, many of them local businesses, and they in turn sometimes find a vendor that stumps me. This is one of those special cases, where the vendor is doing some shady stuff with reviews schema.
First, they're taking reviews from third party sites and filtering them to only show 4 and 5 star reviews (red flag #1), then they're asking us to post them to the website (red flag #2) and finally they are marking them up with schema (red flag #3).
If this were my vendor I would have fired them when they started telling me Google doesn't care, doesn't enforce the guidelines, and all that other nonsense, but hey, I'm not the client and I have to make good for them. I did flat out refuse to place these reviews as they asked, but they came back with a "solution", that I'm not sure I trust.
They're telling me they can't remove the schema (red flag #4), but they can iframe it onto the website. Their logic, which is wrong, is that Google can't/doesn't crawl iframes so therefore the reviews can be displayed without any negative consequence.
I obviously have some ethical concerns with this, but I have to provide the service to my client whether or not they share my values. However, I can object on professional grounds if I think they will take on undue risk. My only problem here is that I have no documentation for how this proposed solution would work. Working through this logically still leaves me with a gap, and that's where you folks come in!
A) We know that Google crawls iframes
B) We know that Google can apply schema within iframes (works with YouTube embeds)
C) We know that content within an iframe is technically on another website, so it doesn't normally apply to your website
D) I don't know how specifically reviews schema would interact with an iframe
E) I don't know if this would result in Google triggering an alarm and blocking the business
I'm hoping you guys can help me figure this out. Ethics aside (making me cringe to type that), is this technically feasible without risk, or would this still be a risky move?
For the record, another client tried filtering their reviews while marking up with schema against my recommendation and got caught, and received a penalty alert. They were removed from results until the problem was fixed.
-
RE: Is it deceptive to attempt to rank for a city you're located just outside of?
Just to piggyback off of Miriam, we do a lot for clients in the home services category who want to show up in markets like this. Our clients are service area businesses, so we build targeted pages that talk about the service they provide for that market. What we have found is that making that page unique to the market has helped us gain some of that sweet page one visibility.
You will almost never rank higher in a market outside of your physical location, but yes, it's possible to get some visibility. Just make sure you're being honest in your representation to customers. An SEO strategy that ends with an angry, non-paying customer is not a strategy at all.
-
RE: Url structure on product pages - Should we apply canonicalized links in breadcrumbs or entry folders
Hey there Shahin, make sure your breadcrumbs reflect the path the users took. It's a navigational aid that's meant to help the user and creating a canonical version of the breadcrumbs would just create a confusing experience. If your user was trying to get back to the mountain tours page but the breadcrumbs only listed glacier hiking, wouldn't that be odd? Since the URLs are already canonicalized it shouldn't hurt your SEO to set things up this way.
-
RE: Indexed Pages Increase and Major Drop June 25th and July 16th?
Hi Kwilgus,
We've seen major fluctuations across several tools and our own proprietary data sets. One of the tools you could use to help keep track of ranking fluctuations is mozcast (http://mozcast.com/). Click on over to metrics, set to 90 days, and compare domain diversity with daily big 10 - the two seem to have an inverse linear relationship. You'll see that Google is monkeying around with their algorithm and giving preference to fewer sites.
Within our own data set we're noticing ranking fluctuations in the local SERP for August, and on other tools we've noticed some ranking fluctuations on the mobile SERP towards the end of July and beginning of August.
So you're not going crazy, things are a little bouncy right now.
-
RE: Content Help for Dealers
The first thing I would do is focus on answering the questions people have. You can do this by typing in a product category (like motorcycles) at http://answerthepublic.com/. Based on a white paper released by STAT, there are several types of questions that populate featured snippets at a faster rate, so I would focus on answering the what, why, and how questions first.
Sounds like you're going to have some serious CMS limitations but hopefully this helps you add content in a meaningful way to your client's website.
-
RE: Is there an API to download keyword data?
I've taken the survey, but you may want to add another field asking for feedback on specific features. 2016 saw a lot of companies bring Business Intelligence in house with data display platforms taking a leading role in creating stories from data. Because of this my clients are a little more aware and want to do a lot more with data - which means I'm more critical of the data I can GET from an API.
It would be excellent if I could create an index of my keywords and the pages that are ranking, so I could identify which pages are ranking highly, and compare to the pages that are ranking low. Right now I don't have a method of compiling all the URLs that are ranking, or displaying which ones are hovering around position 11-13, which is low hanging fruit to bump onto page 1.
Most APIs don't offer the ability to draw out that information, which I could then create visuals and tables with. It would put Moz ahead of the competition to make more data points available via API.
-
RE: Is there an API to download keyword data?
Thanks Lisa, I appreciate the clarity!
-
RE: How to make Form Type Links have "Nofollow" attributes
Are you saying that the links are on another website linking to yours? If that's the case you don't mark those links as "nofollow" (you can't, they're on someone else's website), instead you would add those URLs to your disavow file.
Otherwise, if the link is on your site and you don't have an editor to mark the link up for you, just add rel="nofollow" to the link in html.
-
RE: Duplicate content
If you have verified your NAP via Google My Business and it's accurate, then you have nothing to worry about. Google won't punish you because someone else is creating duplicate listings, and they won't factor in unverified information. Duplicate NAP information would be an issue if you were trying to rank multiple websites for the same location. In that scenario, only one would get the benefit from citations and an optimized GMB profile.
If you haven't created a Google My Business profile yet, go to google.com/business and get started. It will take some time, but in a few weeks you can have a fully verified and optimized profile. Just follow the steps they provide and call support when you get stuck. Additionally, please feel free to ask questions here.
-
Is there an API to download keyword data?
Howdy fellow Mozzers,
We're tracking a client's keyword rankings in one of our campaigns and want to export that data on a regular basis to display in Tableau. Tableau lets us quickly match up different data sources and create visuals with the data. It's useful for marrying data sources that don't normally talk to each other.
We know we can pull keyword tracking data manually, but we're hoping there's an API or scheduler that we can use to download the data automatically. Automating the process would be super helpful.
Thank you!
-
RE: HTTP HTTPS Migration Gone Wrong - Please Help!
Nice, good job! I would double check with your CDN provider to make sure implementation was done correctly according to their process. Unless you're saying you discontinued use of the CDN when you switched?
I wouldn't panic. Just make sure your team knows that you can't control the rate at which Google re-indexes the website, and that it's too early in the process to tell if there is an issue somewhere. Let Google do their thing, and once your traffic and rankings seem more regular, reevaluate. At that point I would add HTTP/2 support if possible and measure the impact, because that provides some additional benefits such as a boost to site speed.
-
RE: HTTP HTTPS Migration Gone Wrong - Please Help!
1. Are you using a CDN?
2. Did you update all your internal links to https?
3. Did you update all of your canonical tags?
4. Did you update all of your hreflang tags?
5. Are you using plugins/modules from a third party? Are they secure? Do they have documentation or a rep you can contact about migrating to https?
6. Some CMSs have specific settings that need to be altered when migrating - make sure those were done correctly.
7. Use screaming frog to check for any external scripts, and ensure they're calling https.
8. Did you update your old redirects?
9. Did you update your robots.txt file to include the new https sitemap?
10. Did you enable HSTS?
11. Do you have a disavow file? Did you update it for HTTPS?
Bonus: did you update all of your other paid campaigns, analytics, etc. to reflect the migration?
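A quick way to spot-check items 2 and 7 yourself - a small sketch that scans pages for leftover http:// references (the page list is a placeholder):

```python
# Scan a few pages for leftover http:// links and assets after the
# migration. The page list is a placeholder - feed it your own URLs.
import re
import requests

pages = ["https://example.com/", "https://example.com/about"]

for page in pages:
    html = requests.get(page, timeout=10).text
    insecure = set(re.findall(r'(?:href|src)=["\'](http://[^"\']+)', html))
    for ref in sorted(insecure):
        print(f"{page} -> {ref}")
```
-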
RE: DMOZ gone!! The king is dead. Long live the king.
Hi Steven,
DMOZ and other sites like it are either gone or are nearly ghosts. This is similar to the debate in machine learning, where one group believes you can program an AI to learn on its own, and another (smaller) group believes you have to manually program all human knowledge into an AI to create true artificial intelligence. Machine learning works, it's efficient, and it's widely adopted by the rest of humanity. The other approach - one that requires endless heavy labor - has died off for obvious efficiency reasons (and it doesn't work).
DMOZ was owned by AOL, which was recently purchased by Verizon. I've no doubt Verizon pulled the plug because it meant spending resources on a dinosaur with no value. It was a precursor to modern search engines, but I don't think anyone was using it for its intended original design.
Truth of the matter is, companies that don't invest in machine learning while a competitor in their industry does are going to the grave. You just can't compete with that level of efficiency with human hands.
-
Google changed my settings to 100 results/search by default
Is anyone else experiencing this?
-
RE: Fred Google Update & Ecommerce Sites
Thin content could be the culprit, but there are other things it could be as well. Without access to historical data, past SEO activities, etc., it's nigh impossible to tell you why you're experiencing a drop in rankings, especially since you're citing the "Fred" update, which we still don't know anything about.
When I visited that URL the site structure seemed odd to me, as I was redirected to http://www.key.co.uk/en/key/b, which is pretty deep in your site architecture. This means links to your website www.key.co.uk are passing PageRank down through three subpages to get to /b. PageRank should pass 100% through 301 redirects now, but I'm not 100% certain how Googlebot views your structure since it seems like you've got a redirect chain going there. If you can simplify your site architecture and check Search Console to make sure that it's technically healthy, then you're off to a good start.
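If you want to see the chain the way a crawler does, a quick check with Python's requests library prints every hop:

```python
# Print each hop in the redirect chain for the homepage URL mentioned above.
import requests

r = requests.get("http://www.key.co.uk", allow_redirects=True, timeout=10)
for hop in r.history:
    print(hop.status_code, hop.url)
print("final:", r.status_code, r.url)
```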
-
RE: Our organic homepage traffic just recently spiked from about a typical under 20 per weekend to about 820 -- what could be causing this?
Hi Rick, sorry for the hiatus, I have a couple other questions for you.
1. Have you set up conversion tracking? Has there been an increase in conversions?
2. Do you have any campaigns running? Print, broadcast, radio, etc.? Many offline campaigns cause a boost in organic searches for my clients.
-
RE: Site is too slow? Seeing a new code.
Hi Beachflower,
Did you ever get a resolution to this? I'm curious to see what the outcome and solution was. If this is a malicious attack then you'll need to consult someone who specializes in net sec, but I've dealt with a few different kinds of attacks before so I can make a couple of recommendations.
1. Change all of your logins. Make them unique and difficult for a bot to guess. Then set it to lock out users after five incorrect guesses. This prevents brute force hacks.
2. Add a honeypot to your login forms. A honeypot is a hidden field that bots will try to fill out on a form. Users can't see it, so they don't fill it out. If it gets filled out, the program knows it's a bot and invalidates the login attempt (see the sketch after this list).
3. Use screaming frog to find all the js that was maliciously inserted on each URL and create a "cleanup" list. A developer should be able to write a simple "find and replace" program that just deletes it.
4. Consider migrating to HTTPS if you haven't already. This can prevent Man-in-the-Middle (MITM) attacks on your site, and also confers several SEO benefits such as improved user experience, a slight boost in ranking, and faster site speed (HTTP/2 integration).
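Here's the honeypot sketch from point 2 - a minimal server-side check, assuming a hidden field named "website" (the field name is arbitrary; anything humans never see will do):

```python
# Minimal honeypot validation. The hidden "website" field is a
# hypothetical name - any field humans never see will work.
def is_bot(form: dict) -> bool:
    """A human leaves the hidden field empty; bots tend to fill it."""
    return bool(form.get("website", "").strip())

# Example: reject the attempt before credentials are even checked.
submission = {"username": "user", "password": "hunter2", "website": "http://spam.example"}
if is_bot(submission):
    print("rejected: honeypot field was filled")
```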
These are just a few first steps to take and a Net Sec professional will have much more to add. Hope that helps!