The crawler looks for content, not design. Technically, you are asking how to style your H1, and that is a CSS topic. You don't have to worry about it as long as your tag and your content are relevant.
Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
Posts made by Roman-Delcarmen
-
RE: SEO friendly H1 tag with 2 text lines
-
RE: What Keyword density would you suggest?
“keyword density, in general, is something I wouldn’t focus on. Search engines have kind of moved on from there.” John Mueller, Google 2014
- https://www.youtube.com/watch?v=Rk4qgQdp2UA ---> Check this video from Google
- https://www.hobo-web.co.uk/keyword-density-seo-myth/
If you want to check the optimization level for a keyword, use **On-Page Grader**.
Despite what many SEO Tools would indicate, the short answer to this is, in my experience, there is no IDEAL %. There is no one-size-fits-all optimal ‘keyword density’ percentage anybody has ever demonstrated had direct positive ranking improvement in a public arena.
I certainly do not believe there is a particular percent of keywords in words of text to get a page to number 1 in Google. While the key to success in many niches is often simple SEO, search engines are not that easy to fool in 2018.
I write natural page copy which is always focused on the key phrases and related key phrases. I never calculate density in order to identify the best %; there are way too many other things to work on. I looked into this a long time ago.
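For what it's worth, the metric itself is trivial to compute, which is part of why it is so easy to abuse. Here is a minimal Python sketch of what a density checker does (my own illustration, not any particular tool's formula):

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Percentage of the words in `text` taken up by occurrences of `keyword`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    kw_words = keyword.lower().split()
    n = len(kw_words)
    if not words or n == 0:
        return 0.0
    # Slide a window over the text and count exact keyword matches
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == kw_words)
    # Every word of every occurrence counts against the total word count
    return 100.0 * hits * n / len(words)

text = "SEO tools love density. Keyword density is a myth; density alone ranks nothing."
print(round(keyword_density(text, "density"), 1))
```

Stuffing the keyword a few more times pushes the number up instantly, which is exactly why no one should treat it as a ranking signal.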
IN SUMMARY
- It's an outdated concept from the paleolithic era of search engines.
- Add your keyword to your title, headline, and meta tags; that's all, and please forget about it.
- Focus on relevant tasks such as schema markup, internal linking, site performance, link building, and AMP, and most importantly, focus on creating good content/copy. Even if you are not a blogger or publisher but a small business owner, hire a good writer who can create homepage copy that really converts (around 1,500 words), and forget about keyword density.
This is how your content should look:
- **Main Keyword**
- **Keyword Related -- Service 1**
- Keyword Related -- Service 2
- **Keyword Related -- Services Areas**
- Keyword Related -- FAQ
Hope this info helps.
Regards -
RE: Duplicated titles and meta descriptions
I would not worry about it, as long as you have made the right setup for:
- The canonical tags
- The language and region tags
- You submitted the sitemaps for every single region on Search Console
- Another good point to keep in mind is adding schema markup to your pages
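For reference, this is roughly what the canonical and language/region tags look like in the `<head>` of a regional page. The domain and paths below are placeholders, not the asker's site:

```html
<!-- Canonical: this URL is the preferred version of this page -->
<link rel="canonical" href="https://example.com/uk/services/" />
<!-- hreflang: point Google at each language/region variant, including this one -->
<link rel="alternate" hreflang="en-gb" href="https://example.com/uk/services/" />
<link rel="alternate" hreflang="en-us" href="https://example.com/us/services/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/services/" />
```

Each regional variant should carry the same set of hreflang tags, each pointing at all the others, or Google will ignore the annotations.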
-
RE: Kind of duplicate categories and custom taxonomy. Necessary, but bad for SEO?
Please read these articles:
- https://zyppy.com/site-architecture-seo/
- https://webmasters.googleblog.com/2008/10/importance-of-link-architecture.html
- https://yoast.com/site-structure-the-ultimate-guide/
- https://doyouevenblog.com/seo-category-pages/
Let's take your own keywords as examples: ADHD counseling, Anxiety therapy, and Career counseling. Let's use a simple search operator:
sacwellness.com "ADHD counseling"
And these are the results according to Google (please check how bad these pages are; 7 of 10 have errors in the title). So I'm not talking about complex errors related to taxonomies or indexability:
- https://sacwellness.com/category/adhd-counseling/
- https://sacwellness.com/adhd-counseling/top-ten-fidget-toys/
- https://sacwellness.com/tag/adhd/
- https://sacwellness.com/listing-category/child-counseling/
- https://sacwellness.com/category/suicide/
**Let's take one of the results. Check this page: the meta tags have no relation to the content inside the page (title, description, and body content). This is how Google actually sees your site:**
_PTSD and Trauma - SacWellness.com_ ---> Title
https://sacwellness.com/category/ptsd-and-trauma/ ----> URL
Aug 2, 2018 - write a guest blog for our site! We will feature it on our front page for a week or two and include it in our social media advertising. footer. ---> Snippet
Let's use another simple search operator:
site:sacwellness.com
And according to Google, your most relevant pages are the home, blog, and contact pages. I mean, we are talking about a directory. Even worse, just take some of your keywords, like "ADHD counseling", and look for the same category on any other directory.
How do you expect to rank a page for the keyword **"ADHD counseling"** if even the most basic aspects, such as titles and descriptions, are wrong? How do you expect Google to recognize them?
-
RE: Kind of duplicate categories and custom taxonomy. Necessary, but bad for SEO?
1) can I have one set of categories and another set that are pretty much the same but with the word "therapy" or "counseling" tacked on and have Google recognize both of them?
Before talking in more depth about taxonomies, I think it is important that you understand some concepts.
Crawl budget: Crawl budget is the number of pages Google will crawl on your site on any given day. This number varies slightly from day to day, but overall it’s relatively stable. The number of pages Google crawls, your “budget”, is generally determined by the size of your site, the “health” of your site (how many errors Google encounters) and the number of links to your site.
https://yoast.com/crawl-budget-optimization/
- Crawled pages: pages Google has visited on your website for crawling purposes. This process is done by Google's crawler (Googlebot).
- Indexed pages: After crawling has been done, the results get put onto Google’s index (i.e. web search).
- Keyword intent: basically, the goal behind any search.
Let me explain with a very basic example. Let's take 3 simple keywords:
- Dentist Tampa Florida - Volume: 20
- Tampa Dentist - Volume: 20
- Cosmetic Dentist Tampa FL - Volume: 20
So the main point here is: you, or any other SEO, can try to rank for 1, 2, or even all 3 keywords, and it does not matter at all. From Google's point of view, all of them have the same intent (and that is a crucial point). That means, for Google: there are 60 users looking for a dentist in Tampa, Florida. This is what really matters in SEO: Google will try to give those users the best possible result for that intent.
Talking about your case: if you have 1 category page using a single keyword and 1 custom taxonomy page using a closely related keyword, both of them will probably be competing for the same intent. So if Google can't decide which one of those pages has value, or which one is better, it will ignore both of them.
2) Would it be better just to no-index my blog categories and keep the custom taxonomy indexed, since they hold the 6 most recent posts in a given category? In my opinion, that's the best option; there are other options, such as adding canonical tags or redirecting. You need to check your Search Console: if those pages have been indexed, you should redirect them. As I see you have both a blog and listing pages, you can use the custom taxonomies for the listings and the categories for the blog.
3) Is this even a problem at all? Yes, this is a problem, and in your case a big problem, mainly because your site is a directory, and the hierarchy of your site is the core for users and crawlers. You have a niche website, so you don't need too many backlinks or much social media presence; a good site structure and good on-page optimization are all you need to rank your site. Your site structure depends on your taxonomies, categories, and tags.
How to structure your website
We'll take you through all the steps of creating a rock-solid structure for your website!
1 Create an overview of your pages
2 Organize your content as a pyramid
3 Define a menu
4 Name your sections
5 Add internal links to strengthen the structure
6 Reflect the structure in breadcrumbs
The structure of your website should be like a pyramid. On top of the pyramid is your homepage, and beneath the homepage are some category pages. Category pages bundle your listings or posts in groups with related content. To create even smaller groups within your categories you can use tags or subcategories.
Once you've developed your new site structure correctly, you should consider how you're going to connect the sections of this pyramid. Think of these sections as small pyramids inside your larger pyramid. Each page at the top of that pyramid should link to all its sub-pages, and the other way around.
Since you're linking from pages that are closely related to each other content-wise, you're increasing your site's possibility to rank. You're 'helping' the search engine out by showing it what's related and what isn't.
So the way you configure your site's URLs (URL structure) and the way you connect your pages (internal links) is how Google determines which pages are important on your site. This is where concepts like crawl budget and **indexed pages** become important.
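The pyramid and its internal links can be sketched in a few lines of Python. This is a toy model with made-up page URLs, just to show the linking rule: each page links down to its sub-pages, and each sub-page links back up.

```python
# Hypothetical site tree for a small directory site: page -> its sub-pages
site = {
    "/": ["/counseling/", "/blog/"],
    "/counseling/": ["/counseling/adhd/", "/counseling/anxiety/"],
    "/blog/": ["/blog/adhd-tips/"],
}

def internal_links(tree):
    """Each parent links down to its children, and each child links back up."""
    links = []
    for parent, children in tree.items():
        for child in children:
            links.append((parent, child))   # top of the pyramid links down
            links.append((child, parent))   # sub-pages link back up
    return links

for src, dst in internal_links(site):
    print(f"{src} -> {dst}")
```

Every page ends up reachable from the homepage in a couple of clicks, which is exactly what the pyramid is for.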
If this answer was useful, don't forget to mark it as a Good Answer. Cheers and good luck!
-
RE: Kind of duplicate categories and custom taxonomy. Necessary, but bad for SEO?
OK, I took the time to analyze your site using:
- Screaming Frog to check some technical aspects
- Ahrefs to check your site structure
- Majestic to check your Trust Flow, and so on
This is what I found:
- Moz: DA:7
- Ahrefs: UR:9
- Ahrefs: DR:2.1
- 264 Pages according to Screamingfrog
This is my honest point of view
First point: if you don't know which is better, custom taxonomies or the default ones, that means you didn't do good research, or even worse, you didn't do any research at all. When you do that research, you create your site structure and match every target keyword to the specific page for that keyword. I usually create an Excel file with the list of all keywords and, in columns, the specific page for each keyword; I also add the URL, title, meta description, canonical, etc. So your main problem is not custom taxonomies vs. categories; your main problem is the lack of planning.
- So the first step is to run a **content audit** (basically, make an inventory of your content, create a structure, meaning a URL structure and an internal-link structure, and select your taxonomies and your keywords). If you do that, you will have a clear idea of which taxonomy is better for your site.
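That keyword-to-page mapping can start as a simple spreadsheet. Here is a minimal sketch using Python's standard library; the keywords, URLs, and titles are made-up examples, not the asker's real ones:

```python
import csv
import io

# Hypothetical keyword map: one target page per keyword, plus its on-page elements
rows = [
    {"keyword": "adhd counseling", "url": "/counseling/adhd/",
     "title": "ADHD Counseling in Sacramento", "canonical": "/counseling/adhd/"},
    {"keyword": "anxiety therapy", "url": "/counseling/anxiety/",
     "title": "Anxiety Therapy in Sacramento", "canonical": "/counseling/anxiety/"},
]

buf = io.StringIO()  # swap for open("keyword-map.csv", "w", newline="") to save a file
writer = csv.DictWriter(buf, fieldnames=["keyword", "url", "title", "canonical"])
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue())
```

The point is the discipline, not the tooling: one row per keyword, one page per row, so no two pages ever chase the same keyword.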
Second point: I don't know why you are focusing on your taxonomies and categories. None of them is even ranking; they are not even among the top 50 pages of your site. Why? Because you don't have a site structure (adding categories or taxonomies to your site does not mean you have a site structure).
These are your top 10 pages
- https://sacwellness.com/listing/joe-borders-mft/?lpc_loc=60
- https://sacwellness.com/listing/lindsay-goodlin-lcsw/?lpc_loc=60
- https://sacwellness.com/listing/eunie-jung-ph-d/?lpc_loc=60
- https://sacwellness.com/gender-issues/world-health-organization-to-stop-labeling-transgender-people-as-mentally-ill/
- https://sacwellness.com/listing/the-place-within-counseling-folsom/?lpc_loc=18
- https://sacwellness.com/listing/mike-everson-lmft/?lpc_loc=11
- https://sacwellness.com/listing/joan-druckman-phd/?lpc_loc=18
- https://sacwellness.com/listing/elizabeth-defazio-rn-ms/?lpc_loc=60
- https://sacwellness.com/listing/mitch-darnell-ms-osm-ctrc/?lpc_loc=60
- https://sacwellness.com/listing/william-schneider-laadc-ca-icadc/?lpc_loc=Sacramento
So, in summary, this is how you fix the problem:
- Create an inventory of all your pages
- Do keyword research for those pages
- Match your keywords with your pages
- Optimize your URL structure (Dynomapper, Sitebulb, or Semrush can be good options)
- Create an internal-link structure which reflects the hierarchy of your site
- Implement schema markup on your site
- Implement AMP on your site
As I see it, you are running a directory site, which means you will need almost perfect technical SEO. You don't need to worry much about inbound links, social media, or stuff like that; you need to focus on all the technical aspects in order to rank your site.
If you are interested, I can share the audit I made; just send me a PM.
Good luck and have a nice day
-
RE: Kind of duplicate categories and custom taxonomy. Necessary, but bad for SEO?
In your case, if you have taxonomies and categories competing for the same keyword, the fix is pretty easy: select one of them and optimize it.
-
RE: Kind of duplicate categories and custom taxonomy. Necessary, but bad for SEO?
To help you understand taxonomy systems, first let me explain the difference between categories, subcategories, and tags. Categories are used to create large groups within your site. They bundle content that has a similar high-level topic. Products or blog posts on your site should fall into a category (a shop category or a blog category). Because categories are hierarchical, they can have subcategories. Subcategories fall into at least one category. They bundle a smaller group of products within a category. Subcategories can have subcategories too, which bundle an even smaller group, and so on. By creating categories and subcategories you'll create a treelike structure.

Tags, on the other hand, just group content on certain topics together. Tags are not hierarchical. You can see them as an index of your site. They'll not necessarily fall into a category. They can apply to products, but to other site content as well.

You can have both a hierarchical and a non-hierarchical taxonomy system for your website. Ideally, these taxonomy systems are much alike. For example, you could have a blog on an eCommerce site. In this case, you'd probably write a lot on topics related to your products. Maybe about events where you use them, or what to use them for, how to use them best, comparisons between different products, etc. Therefore, it makes sense that your tags will partly overlap with the product categories and subcategories of your shop. This is OK, because in the end, you'd like to rank with those posts to draw people to the products you sell. And, if you group products, whether that's in categories or tags, it's easier to make them rank.

Category archives are landing pages
Your category archives are more important than individual pages and posts. Those archives should be the first result in the search engines. That means those archives are your most important landing pages. Thus, they should also provide the best user experience. The more likely your individual pages are to expire, the more this is true. In a shop, your products might change, making your categories more important to optimize. Otherwise, you'd be optimizing pages that are going to be gone a few weeks/months later.

Categories prevent individual pages from competing
If you sell boxers and you optimize every product page, all those pages will compete for the term 'boxers'. You should optimize them for their specific brand and model, and link them all to the 'boxers' category page. That way the category page can rank for 'boxers', while the product pages can rank for more specific terms. This way, the category page prevents the individual pages from competing. -
RE: How to find orphan pages
Yes, as I mentioned, in my case I use Semrush, and it has a dedicated report for that specific task. The easiest way to get your log files is to log into your cPanel and find the option called Raw Log Files. If you still can't find it, you may need to contact your hosting provider and ask them to provide the log files for your site.
Raw Access Logs allow you to see what the visits to your website were without displaying graphs, charts, or other graphics. You can use the Raw Access Logs menu to download a zipped version of the server’s access log for your site. This can be very useful when you want to quickly see who has visited your site.
Raw logs may only contain a few hours' worth of data because they are discarded after the system processes them. However, if archiving is enabled, the system archives the raw log data before discarding it. So go ahead and ensure that you are archiving!
Once you have your log file ready to go, you now need to gather the other data set of pages that can be crawled by Google, using Screaming Frog.
Crawl Your Pages with Screaming Frog SEO Spider
Using the Screaming Frog SEO Spider, you can crawl your website as Googlebot would, and export a list of all the URLs that were found.
Once you have Screaming Frog ready, first ensure that your crawl Mode is set to the default ‘Spider’.
Then make sure that under Configuration > Spider, ‘Check External Links’ is unchecked, to avoid unnecessary external site crawling.
Now you can type in your website URL, and click Start.
Once the crawl is complete, simply
a. Navigate to the Internal tab.
b. Filter by HTML.
c. Click Export.
d. Save in .csv format.
Now you should have two sets of URL data, both in .csv format. All you need to do now is compare the URL data from the two .csv files and find the URLs that were not crawlable.
If you decided to analyze a log file instead, you can use the Screaming Frog SEO Log File Analyser to uncover your orphan pages. (Keep in mind that the Log File Analyser is not the same tool as the SEO Spider.)
The tool is very easy to use (download here), from the dashboard you have the ability to import the two data sets that you need to analyze
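If you prefer to do the comparison yourself, finding the non-crawled URLs from the two .csv exports boils down to a set difference. Here is a minimal Python sketch; the filenames and URLs are placeholders, and the demo data written below stands in for the real exports:

```python
import csv

def urls_from_csv(path, url_column=0, skip_header=True):
    """Read one column of URLs out of a CSV export."""
    with open(path, newline="") as f:
        rows = list(csv.reader(f))
    if skip_header:
        rows = rows[1:]
    return {row[url_column].strip() for row in rows if row}

# Demo data standing in for the two real exports (log file + crawler export)
with open("log-file-urls.csv", "w") as f:
    f.write("url\n/about/\n/old-page/\n/contact/\n")
with open("crawl-urls.csv", "w") as f:
    f.write("Address\n/about/\n/contact/\n")

# URLs Googlebot requested (log file) minus URLs the crawler could reach
orphans = sorted(urls_from_csv("log-file-urls.csv") - urls_from_csv("crawl-urls.csv"))
print(orphans)
```

Anything left in `orphans` is a page that gets traffic or crawl requests but that your internal link structure never reaches.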
If this answer was useful, do not forget to mark it as a Good Answer. Good luck!
-
RE: How to find orphan pages
Even Screaming Frog has problems finding all orphan pages. I use Screaming Frog, Moz, Semrush, Ahrefs, and Raven Tools day to day, and honestly, Semrush is the one that gives me the best results for that specific task. From experience: a few months ago I took over a website that was a complete disaster (no sitemap, no canonical tags, no meta tags, etc.). I ran Screaming Frog and it showed me just 200 pages, but I knew there were many more; in the end I found 5k pages with Semrush. Probably even Screaming Frog's crawler had problems with that website, so I mention it as an experience.
-
RE: How can I discover the Google ranking number for a keyword in Brazil?
Well, that is pretty easy to do. There are several tools for keyword tracking, but the easiest way is to use your Search Console account (as I can see, that page is not ranking for any keyword).
Go to your GSC and follow these steps
- Google Search Console
- Select the right property (http://solaristelecom.com)
- Performance
- Queries
Then you can filter by page _(/ligacao-internacional)_, country, device, etc. If you don't find a specific keyword in your Search Console account, that's because you are not ranking for that keyword. Also, make sure that GSC was set up properly.
Based on your answers in this community, I deduce you have been struggling to rank your site without results.
Actionable Suggestion using Moz
My suggestion for you is quite simple. Start to build a link profile to your site. Assuming that you are starting in the SEO world, I will give a simple plan to perform in the next 30 days.
- Select your keyword (Already have it right)
- Select the first 10 websites ranking for that specific keyword
- Use your Link-Explorer and download all the links pointing to those sites
- Create a list of relevant prospects (no more than 30 links; keep in mind that you are starting out in SEO, so your goal can be 1 link per day)
- Create a list of keywords related to your main keywords (use Keyword Explorer)
- Reach out to those websites (these can be local directories, local news sites, blogs, etc.)
- Build links to your site
- Try to optimize as much as you can all your Technical SEO (start by adding SSL to your site)
- On your Moz Pro dashboard, you will see some of the most common errors that you need to fix. Once you have finished that part, you can continue with more advanced work such as AMP implementation, internal linking, or schema markup, or whatever you want.
-
RE: How can I discover the Google ranking number for a keyword in Brazil?
ligação internacional / Brazil
Keyword difficulty: 0
Search volume: 2.1K
Clicks: 1.4K
Top 3 pages
-
RE: Website cannot be crawled
OK, I made a quick test of your robots.txt file and it looks fine:
https://www.threecounties.co.uk/robots.txt
Then I used https://httpstatus.io/ to check the status code of your robots.txt file, and it showed a 200 status code (so it's fine).
Also, you need to make sure that your robots.txt file is accessible to Rogerbot (the Moz crawler). These days, hosting providers have become very strict with third-party crawlers; this includes Moz, Majestic SEO, Semrush, and Ahrefs.
Here you can find all the possible sources of the problem and the recommended solutions:
https://moz.com/help/guides/moz-pro-overview/site-crawl/unable-to-crawl
Regards
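As a quick local sanity check, you can parse a robots.txt file with Python's standard library and ask whether a given crawler, such as rogerbot, is allowed to fetch a page. The robots.txt content below is a made-up example, not the site's real file:

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt that blocks one third-party crawler but not Rogerbot
robots_txt = """\
User-agent: BadBot
Disallow: /

User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("rogerbot", "https://example.com/"))          # True
print(rp.can_fetch("rogerbot", "https://example.com/private/"))  # False
print(rp.can_fetch("BadBot", "https://example.com/"))            # False
```

In real use you would point `RobotFileParser` at the live file with `set_url(...)` plus `read()`; parsing a string here just keeps the sketch offline.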