As seoman10 mentioned, you and your client don't have to worry as long as you set up the robots.txt file properly
Best posts made by Roman-Delcarmen
-
RE: Should client sub domains appear in Google or not?
-
RE: SEO threats of moving from [.com.au] domain to [.com] domain for a 15yr old SAAS company.
Based on my experience, I can tell you that you can minimize the drops in traffic and visibility, but you cannot avoid them entirely. There are many site migration types; it all depends on the nature of the changes that take place:
- Site location changes
- Platform changes
- Content changes
- Structural changes
- Design changes
Your case is a site location or domain migration. Google’s documentation mostly covers migrations with site location changes, which are categorized as follows:
- Site moves with URL changes
- Site moves without URL changes
First, you need to prepare a migration plan. Start by running an audit of your site; in my opinion, Screaming Frog or Raven Tools are the best options to perform this task. The audit should cover:
- Site structure
- Content
- Internal Links
- Search Console, Google Analytics and Tag Manager
**Split your move into smaller steps.** I recommend initially moving just a piece of the site to test any effects on traffic and search indexing. After that, you can move the rest of your site all at once or in chunks. Also keep in mind that while moving just one section is a great way to test your move, it's not necessarily representative of a whole site move when it comes to search. The more pages you move, the more likely you'll encounter additional problems to solve. Careful planning can minimize problems.
**Time your move to coincide with lower traffic.** If your traffic is seasonal or dips on certain weekdays, it makes sense to move your site during traffic lulls. This lowers the impact of anything that breaks and also dedicates more of your server's power to helping Googlebot update its index.
**Expect temporary fluctuation.** With any significant change to a site, you may experience ranking fluctuations while Google recrawls and re-indexes your site. As a general rule, a medium-sized website can take a few weeks for most pages to move in the index; larger sites can take longer. The speed at which Googlebot discovers and processes URLs largely depends on the number of URLs and your server speed. Submitting a sitemap can help make the discovery process quicker, and it's fine to move your site in sections. If the move involves a URL change, you might consider an A/B test or trial run.
A quick tip that nobody will mention: start building links to your new domain before the migration.
- Search Console Migration https://support.google.com/webmasters/answer/83106?hl=en
- Google Analytics Migration https://torpedogroup.com/blog/post/google-analytics-changing-domains
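To make the redirect step concrete, here is a minimal Python sketch of the kind of 1:1 redirect map you would hand to your server during a domain migration (the `.com.au`/`.com` hostnames are placeholders): every old URL should 301 to the same path on the new domain.

```python
from urllib.parse import urlsplit, urlunsplit

def migrate_url(old_url, old_host="example.com.au", new_host="example.com"):
    """Map an old-domain URL to its new-domain equivalent, keeping path and query."""
    scheme, host, path, query, frag = urlsplit(old_url)
    if host != old_host:
        return old_url  # not part of this migration
    return urlunsplit((scheme, new_host, path, query, frag))

# Build the 1:1 redirect map you would feed to your server config.
old_urls = ["https://example.com.au/pricing", "https://example.com.au/blog?page=2"]
redirect_map = {u: migrate_url(u) for u in old_urls}
```

Keeping the mapping 1:1 (old path to the same new path) is what lets Google consolidate signals quickly; bulk-redirecting everything to the new homepage is the classic mistake that causes lasting traffic drops.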
Regards and Good Luck
-
RE: What is SEO best practice to implement a site logo as an SVG?
As you can see, Yoast SEO just follows the official information and guides.
-
RE: Will google merge structured data from two pages if they have the same canonical?
When you set up canonical tags, you are telling Google: "Page A is the main, relevant page, and the other page (or pages) are secondary," so Google will take that main page as the reference.
-
RE: How can I discover the Google ranking number for a keyword in Brazil?
ligação internacional / Brazil
Keyword difficulty : 0
Search volume: 2.1K
Clicks: 1.4K
Top 3 pages
-
RE: Meta Data Question
Basically, you are talking about just one page (I mean one URL) with 2 different pieces of content: the food menu and the drinks menu. So the answer to your question depends on what you have and what you want, but mostly on what your users do to land on your site.
In an ideal scenario, you would have 2 different pages: one for the food menu and one for the drinks menu. But that is not the relevant point. What you want, or what I can tell you, is not what matters from an SEO perspective.
What is your audience behavior?
No matter if you are a big company or a small restaurant, or whether you are using WordPress, Joomla or Drupal, the first question you need to answer
is what your users do to reach your website or your competitors' websites (probably the same thing). If you did some research and found that there are 10 users a month asking Google for your menu, your discounts or even your offers, well, you have your answer. Let's take this example: you have a restaurant called Tacos Top and your website is tacostop.com.
Let's take the first scenario: you will use a separate page for every menu and sub-menu.
tacostop.com
--tacostop.com/menu
----tacostop.com/menu/food
----tacostop.com/menu/drinks
----tacostop.com/menu/wines
----tacostop.com/menu/coffees
The tacostop.com/menu page will act like a category page; it is going to be one of the flagship pages. You can include your food, drinks, desserts, coffees and so on. Inside this page, you need to include a link to every specific sub-menu, and all these third-level subcategories need to point (link) back to their parent page. In this way you are telling Google: "Hey, this page is important to me; if someone asks for my menu, show this one."
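The parent/child linking rule can be sanity-checked mechanically. A small Python sketch, using a hypothetical hard-coded link map for the tacostop.com example:

```python
# Hypothetical menu tree for tacostop.com; each page lists the pages it links to.
links = {
    "/": ["/menu"],
    "/menu": ["/menu/food", "/menu/drinks", "/menu/wines", "/menu/coffees"],
    "/menu/food": ["/menu"],
    "/menu/drinks": ["/menu"],
    "/menu/wines": ["/menu"],
    "/menu/coffees": ["/menu"],
}

def children_link_to_parent(links, parent):
    """Return the child pages of `parent` that do NOT link back up to it."""
    return [child for child in links[parent] if parent not in links.get(child, [])]

missing = children_link_to_parent(links, "/menu")
# An empty list means every sub-menu page links back up to /menu.
```

If the returned list is non-empty, those are sub-menu pages that fail to link back to the flagship /menu page.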
This page needs to be optimized for a keyword like "tacos top menu", and the secondary pages need to be optimized for long-tail keywords. All the keywords need to be chosen based on a local strategy and your audience. If you don't have enough data in your Google Analytics or your Search Console,
you should research your competition or your local audience. Let's talk about the second scenario: you will use a single page for all the content. In this case, you will need to optimize a single page for multiple keywords, and this will be your structure:
tacostop.com
--tacostop.com/menu
----tacostop.com/menu#food
----tacostop.com/menu#drinks
----tacostop.com/menu#wines
----tacostop.com/menu#coffees
As you can see, all the categories are on the same page as sections; a good example of that is Wikipedia.
So your headers and their structure play a relevant role in this scenario:
H1 ---> Your main keyword "Tacos Top Menu"
H2 ---> Tacos Top Foods
H3 ---> Pastas
H3 ---> Meats
H3 --->Seafood
H2 ---> Tacos Top Drinks
H3 ---> Orange Juice
H3 ---> Lemon Juice
H3 ---> Water
So in this way, every category will be a section on this page, and every section needs to be optimized for its main keywords. If you use Moz, tracking the individual performance of those keywords is very easy.
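The heading outline above can be checked programmatically. A sketch using only Python's standard library (the sample page string is made up); it flags outlines that skip a level, such as an h3 sitting directly under an h1:

```python
from html.parser import HTMLParser

class HeadingOutline(HTMLParser):
    """Collect (level, text) pairs for h1-h3 tags in document order."""
    def __init__(self):
        super().__init__()
        self.headings, self._level = [], None
    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3"):
            self._level = int(tag[1])
    def handle_data(self, data):
        if self._level is not None:
            self.headings.append((self._level, data.strip()))
            self._level = None
    def handle_endtag(self, tag):
        if tag in ("h1", "h2", "h3"):
            self._level = None

def well_nested(headings):
    """No heading should skip a level, e.g. an h3 directly under an h1."""
    prev = 0
    for level, _ in headings:
        if level > prev + 1:
            return False
        prev = level
    return True

page = ("<h1>Tacos Top Menu</h1><h2>Tacos Top Foods</h2><h3>Pastas</h3>"
        "<h2>Tacos Top Drinks</h2><h3>Water</h3>")
parser = HeadingOutline()
parser.feed(page)
```

Running `well_nested(parser.headings)` on your menu page confirms the H1/H2/H3 hierarchy described above is intact.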
I hope this explanation can help you
If my answer was useful, don't forget to mark it as a good answer.
Cheers -
RE: How can I discover the Google ranking number for a keyword in Brazil?
Well, that is pretty easy to do. There are several tools for keyword tracking, but the easiest way is to use your Search Console account (as I see it, that page is not ranking for any keyword).
Go to your GSC and follow these steps
- Google Search Console
- Select the right property (http://solaristelecom.com)
- Performance
- Queries
Then you can filter by page (/ligacao-internacional), country, device, etc. If you don't find a specific keyword in your Search Console account, that's because you are not ranking for that keyword. Also, make sure that GSC was set up properly.
Based on your answers in this community, I deduce you have been struggling to rank your site without results.
Actionable Suggestion using Moz
My suggestion for you is quite simple. Start to build a link profile to your site. Assuming that you are starting in the SEO world, I will give a simple plan to perform in the next 30 days.
- Select your keyword (Already have it right)
- Select the first 10 websites ranking for that specific keyword
- Use your Link-Explorer and download all the links pointing to those sites
- Create a list of relevant prospects (no more than 30 links; keep in mind that you are starting in SEO, so your goal can be 1 link per day)
- Create a list of keywords related to your main keywords (use Keyword Explorer)
- Outreach those websites (these can be local directories, local news, blog websites etc)
- Build links to your site
- Try to optimize as much as you can all your Technical SEO (start by adding SSL to your site)
- Your Moz Pro dashboard will show you some of the most common errors that you will need to fix. Once you finish that part, you can continue with more advanced topics such as AMP implementation, internal linking, schemas or whatever you want.
-
RE: Clean URL vs. Parameter URL and Using Canonical URL...That's a Mouthfull!
Yep, I completely agree.
-
RE: Can't work out robots.txt issue.
OK, I made a quick test of your robots.txt file and it looks fine. I also ran an HTTP status code test and it returned a 200 code, which is OK. Also, you need to make sure that your robots.txt file is accessible to the Moz crawler.
- Remember that your file must be in the top-level directory of your web server.
- Also, check that your hosting provider is not blocking third-party crawlers (at the server level).
Here you can test the status code of your robots.txt file: https://httpstatus.io/. If you're still having trouble, you can email Moz support at help@moz.com.
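If you prefer to test locally, Python's standard library can parse a robots.txt and tell you whether a given crawler is allowed. The rules below are a made-up example; Moz's crawler identifies itself as rogerbot:

```python
from urllib.robotparser import RobotFileParser

# A sample robots.txt, parsed from a string so nothing here touches the network.
robots_txt = """\
User-agent: *
Disallow: /admin/

User-agent: rogerbot
Allow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# rogerbot matches its own group (Allow: /); everyone else falls under '*'.
rogerbot_ok = rp.can_fetch("rogerbot", "https://example.com/blog/post")
admin_blocked = not rp.can_fetch("*", "https://example.com/admin/secret")
```

This only checks the rules themselves; a host-level block (firewall, hosting provider) would still stop the crawler even with a permissive robots.txt.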
Also, you should check
Best of luck!
-
RE: Meta Data Question
Can you send me an example (via inbox) so I can understand it better?
-
RE: Ranking issues with my local business website.
Also, you should consider an opt-in option on your website to create leads (a popup with "Get your free consultation" or "Discount today only"). The main goal will be to optimize your conversion rate; remember, it is better to optimize the traffic you already have than to chase new traffic (it's also better for your SEO). -
RE: Crawler issues on subdomain - Need resolving?
In that case, you will need to check your configuration, because, as I mentioned, the behavior of one should not affect the other.
As I remember, Moz uses Google Analytics to set up your Moz Pro account, so you should check both configurations (Moz and GA) to find the error.
**Good Luck**
-
RE: Ranking issues with my local business website.
I sent you an inbox message; I hope it helps you.
-
RE: I want to find all the keywords that an existing page is currently ranking for...is there a way to do that in MOZ or another tool?
Moz is a great tool, but in your case, Semrush and Ahrefs are better tools for what you are looking for.
- On Semrush > Domain Analytics > Overview > Top Organic Keywords > View Full Report
- On Ahrefs > Site Explorer > Organic Search > Organic Keywords
Hope this info will help you
-
RE: Is the MOZ Community Ranking broken?
I have a similar problem: if I make a comment on the blog, I receive the points automatically, but in the Q&A it seems it does not work.
I have spent two weeks posting and trying to resolve other members' questions, and then I realized it was all a waste of time. In some cases I receive some points 3 or 4 days later, but I'm not sure why (maybe someone gave me a thumbs up, or marked one of my answers as a good answer). But the point is that the system does not work correctly.
-
RE: I want to find all the keywords that an existing page is currently ranking for...is there a way to do that in MOZ or another tool?
Ahrefs > Overview > Top Pages > Page URL > Organic Keywords; once you are there, click on the selected URL and it will show you the keywords that rank for that specific page.
That report will show you
- keyword
- Keyword Difficulty
- Cost Per Click
- Traffic
- Position
Hope this will help you
Regards
-
RE: HTTPS during or after redesign
In summary: I prefer to work on the site structure first, probably because it is one of my main skills and I have invested a lot of time and money in it. Donna Duncan suggests you do the HTTPS migration first, mainly because it is a less complicated process, and she is right. Either way, the main idea behind both is the same: the best way to run the project is one step at a time.
KISS "Keep it simple, stupid"
-
RE: Moz Pro Tools
Just in case you don't know how to do it:
https://www.youtube.com/watch?v=Ao6nuhGkI1k -
RE: My homepage redirects to itself?
First, you need to check your robots.txt and your .htaccess file, then check the plugins on your site to see if one of them is creating the redirects.
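To illustrate what a self-redirect looks like, here is a small Python sketch that walks a hypothetical {from: to} redirect map and flags a loop, which is exactly what a homepage redirecting to itself produces:

```python
def follow_redirects(redirects, start, limit=10):
    """Walk a {from_url: to_url} map; return the chain and whether it loops."""
    chain, seen = [start], {start}
    url = start
    while url in redirects and len(chain) <= limit:
        url = redirects[url]
        if url in seen:
            return chain + [url], True  # loop detected
        chain.append(url)
        seen.add(url)
    return chain, False

# Hypothetical misconfiguration: the homepage 301s to itself via a trailing slash.
redirects = {"https://example.com": "https://example.com/",
             "https://example.com/": "https://example.com"}
chain, loops = follow_redirects(redirects, "https://example.com")
```

A plugin or .htaccess rule that adds (or strips) a trailing slash while another rule does the opposite is a common source of exactly this kind of loop.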
Good Luck
-
RE: Is it good or bad to add noindex for empty pages, which will get content dynamically after some days
Well, if those pages do not have any value, your best choice is to add the noindex tag. I mean, if they don't answer any question and aren't useful, they will just consume your crawl budget. Thin content can be identified as low-quality pages that add little to no value to the reader. Examples of thin content include duplicate pages, automatically generated content and doorway pages.
Google tries to provide the best results that match the search intent of the user. If you want to rank high, you have to convince Google that you're answering the user's question. This isn't possible if you're not willing to write extensively on the topic you'd like to rank for. Thin content rarely qualifies for Google as the best result. At a minimum, Google has to know what your page is about to decide whether it should display your result to the user. So try to write enjoyable, informative copy to make Google, but first and foremost your users, happy.
How to Determine if a Page is "Low Quality"
https://moz.com/blog/low-quality-pages
What is Thin Content and Why is it Bad for SEO?
https://www.custard.co.uk/thin-content/
How to Turn Low-Value Content Into Neatly
https://moz.com/blog/low-value-content-next-level
Now is a good time to familiarize yourself with Google's Quality Guidelines. Think long and hard about whether you may be doing this, intentionally or accidentally.
You’re probably not straight-up spamming people, but you could do better.
The golden rule to identify whether your page needs the noindex tag or not is very simple:
"Does this add value for your visitors?" Well, does it? Also, check what Google says about it under **"Thin content with little or no added value"**:
https://www.youtube.com/watch?v=w3-obcXkyA4
IN SUMMARY: adding the noindex tag to unuseful pages will not hurt your site.
Hope this info helps you with your question.
-
RE: How to set up Goals/Conversions in Google Analytics with Moz
OK, this is the right way to connect your Google Analytics with Moz:
Connecting Google Analytics > Moz Official Documentation
On the other hand, as I understand it, Moz does not have a goal tracking feature available. Here is a very useful list related to Google Analytics goal tracking:
- understanding how to use google analytics event tracking
- 4 Google Analytics Goal Types That Are Critical To Your Business
- Create, edit, and share goals
IF THIS ANSWER WAS USEFUL, MARK IT AS A GOOD ANSWER
-
RE: Different breadcrumbs for each productpage
Hi
Seeing your case, it looks like you have a big issue with your site structure and also with your canonical tags. I mean, if you have:
- Home > Bigbag Sandpit sand type xyz
- Home > Sand > Sandpit sand > Bigbag Sandpit sand type xyz
then you have 2 pages competing for the same keywords, and Google will usually ignore both of them.
So first you need to define the source of the problem. In my experience, this usually occurs when you don't set up the parent and child pages correctly. You will also need to check your site taxonomy (trust me, in e-commerce it is even more crucial than link building); a bad taxonomy can ruin all your efforts, and depending on the size of your site, fixing it can require a lot of time.
The best tools, in my opinion (and experience), to fix the error are Screaming Frog, Semrush and DynoMapper (this last one is almost mandatory).
Good Luck
-
RE: Is it good or bad to add noindex for empty pages, which will get content dynamically after some days
In that case, you can create some rules in your robots.txt file. It all depends on the configuration of your site. Also, you need to check your Search Console and your crawl budget.
As I mentioned, it all depends on your site. If you deal with 10 new users per day, just take it easy and configure your robots.txt file; on the other hand, if you deal with 1,000 or 10,000 users, you will need to think of a better solution.
The first idea that comes to my mind is to create a JavaScript script that evaluates some parameters on those pages: if a page meets the parameters, do not add the tag; if not, add the tag.
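The answer above suggests a client-side script; the same decision can be made server-side when the page is rendered. A minimal Python sketch, where the 150-word threshold is an arbitrary assumption you would tune to your own content:

```python
def robots_meta(page_text, min_words=150):
    """Emit a noindex meta tag for pages that are still empty or thin.

    min_words is an arbitrary threshold; pages below it are treated as thin
    until their dynamic content arrives.
    """
    if len(page_text.split()) < min_words:
        return '<meta name="robots" content="noindex">'
    return '<meta name="robots" content="index, follow">'
```

Once the dynamic content fills in and the page crosses the threshold, the same template automatically drops the noindex tag, so the page becomes indexable with no manual cleanup.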
-
RE: WEbsite cannot be crawled
OK, I made a quick test of your robots.txt file and it looks fine:
https://www.threecounties.co.uk/robots.txt
Then I ran a test at https://httpstatus.io/ to check the status code of your robots.txt file, and it showed me a 200 status code (so it's fine). Also, you need to make sure that your robots.txt file is accessible to Rogerbot (the Moz crawler).
These days hosting providers have become very strict with third-party crawlers; this includes Moz, Majestic SEO, Semrush and Ahrefs.
Here you can find all the possible sources of the problem and recommended solutions:
https://moz.com/help/guides/moz-pro-overview/site-crawl/unable-to-crawl
Regards
-
RE: If I redirect a subdomain, does this affect the parent domain?
Your subdomains will be treated as entirely separate websites in the eyes of Google, as Matt Cutts explains: https://www.youtube.com/watch?v=_MswMYk05tk
In simple words, if you have example.com and sub.example.com, these are 2 different sites or properties according to Google. So in your case, you can move, migrate or update your subdomain and this will not affect the parent domain.
Regards
-
RE: The New and Improved Domain Authority Is Here!
Good to know! I think DA was due for an update.
-
RE: Hreflang in country specific XML Sitemaps?
If you have different content/pages oriented to different regions, then yes, you need to implement hreflang.
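For reference, a sitemap with hreflang annotations lists every language variant under every URL. A sketch using Python's standard library (the en-au/en-us URLs are placeholders):

```python
from xml.etree import ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9",
      "xhtml": "http://www.w3.org/1999/xhtml"}

def sitemap_with_hreflang(variants):
    """variants: {lang_code: url}; every <url> lists every language variant."""
    ET.register_namespace("", NS["sm"])
    ET.register_namespace("xhtml", NS["xhtml"])
    urlset = ET.Element("{%s}urlset" % NS["sm"])
    for lang, url in variants.items():
        u = ET.SubElement(urlset, "{%s}url" % NS["sm"])
        ET.SubElement(u, "{%s}loc" % NS["sm"]).text = url
        # Each URL must reference ALL variants, including itself.
        for alt_lang, alt_url in variants.items():
            link = ET.SubElement(u, "{%s}link" % NS["xhtml"])
            link.set("rel", "alternate")
            link.set("hreflang", alt_lang)
            link.set("href", alt_url)
    return ET.tostring(urlset, encoding="unicode")

xml = sitemap_with_hreflang({"en-au": "https://example.com.au/",
                             "en-us": "https://example.com/"})
```

Note the reciprocity rule: every regional URL must carry the full set of alternates (including a self-reference), or Google may ignore the annotations.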
-
RE: Http to https:Risky?
Google identifies several reasons to switch to HTTPS in their website migration guide, and there are other benefits too, including the Google ranking boost.
Google even has its own guide, "Securing Your Website With HTTPS," which I encourage everyone to read, along with this article: https://support.google.com/webmasters/answer/6073543
In my case, I simply made the 301 redirect, then added both the http and https versions of the property in Search Console and set the https version as the main one.
- Obviously, I submitted the new sitemap
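The redirect step itself is mechanical: every http URL should answer with a 301 pointing at the identical https URL. A small Python sketch of that mapping:

```python
from urllib.parse import urlsplit, urlunsplit

def https_redirect(url):
    """Return (status, location) for a plain-http request, or None if already secure."""
    parts = urlsplit(url)
    if parts.scheme != "http":
        return None  # already HTTPS (or another scheme): no redirect needed
    # Keep host, path and query identical; only the scheme changes.
    return 301, urlunsplit(("https",) + tuple(parts[1:]))
```

In production this rule lives in your server config rather than application code, but the invariant is the same: one hop, status 301, and the path and query string preserved exactly.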
But this guide will show you a detailed step-by-step process to follow:
http://searchengineland.com/http-https-seos-guide-securing-website-246940 -
RE: Hreflang and canonical
Can you explain your answer in more detail, please, so I can give you better help?
-
RE: What is the best way to employ log-in to benefit in SEO?
Beware of the Login pages – add them to Robots Exclusion
A lot of sites today have the ability for users to sign in to see some sort of personalized content, whether it's a forum, a news reader, or some e-commerce application. To simplify their users' lives, they usually want to give them the ability to log in from any page of the site they are currently looking at. Similarly, in an effort to keep navigation simple for users, websites usually generate dynamic links that provide a way to go back to the page the user was on before visiting the login page, something like: Sign in.
If your site has a login page you should definitely consider adding it to the Robots Exclusion list since that is a good example of the things you do not want a search engine crawler to spend their time on. Remember you have a limited amount of time and you really want them to focus on what is important in your site.
Out of curiosity I searched for login.php and login.aspx and found over 14 million login pages… that is a lot of useless content in a search engine.
Another big reason is that having this kind of URL, which varies depending on each page, means there will be hundreds of variations that crawlers will need to follow, like /login?returnUrl=page1.htm, /login?returnUrl=page2.htm, etc., so it basically means you just doubled the work for the crawler. Even worse, in some cases, if you are not careful, you can easily cause an infinite loop for them when you add the same "login link" on the actual login page, since you get /login?returnUrl=login as the link, and when you click that you get /login?returnUrl=login?returnUrl=login... and so on, with an ever-changing URL for each page on your site. Note that this is not hypothetical; this is an actual example from a few famous websites (which I will not disclose). Of course, crawlers will not crawl your website infinitely; they are not that silly and will stop after looking at the same /login resource a few hundred times, but this means you are just reducing the time they spend looking at what really matters to your users.
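Both points from the quote are easy to reproduce in a few lines of Python: a Disallow rule keeps crawlers out of /login, and nesting the self-referencing login link shows how the returnUrl parameter snowballs (the URLs are hypothetical):

```python
from urllib.parse import quote
from urllib.robotparser import RobotFileParser

# Block the login page, as the quoted article recommends.
rp = RobotFileParser()
rp.parse("User-agent: *\nDisallow: /login".splitlines())

# Simulate clicking the self-referencing login link three times:
# each click wraps the previous URL in another returnUrl parameter.
url = "/login"
for _ in range(3):
    url = "/login?returnUrl=" + quote(url, safe="")
```

Every one of those ever-growing URLs still starts with /login, so the single Disallow line removes the whole family from the crawl while leaving the rest of the site crawlable.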
Source
Beware of the Login pages – add them to Robots Exclusion -
RE: I am really surprised to see this page is ranking like crazy even the content is very thin
As I see it, you have some really good links, but more importantly, you have really good anchor text:
- artificial intelligence
- artificial intelligence 101: how to get started
One of those anchor texts is your headline/title, which, according to Adam White, is the most valuable backlink type.
If you want to understand how important the anchor texts in your backlinks are, please read this article: The Single Best Anchor Text for SEO That No One Is Talking About
-
RE: Mobile first - what about content that you don't want to display on mobile?
Mobile-first indexing means Google will predominantly use the mobile version of the content for indexing and ranking. Historically, the index primarily used the desktop version of a page's content when evaluating the relevance of a page to a user's query. Since the majority of users now access Google via a mobile device, the index will primarily use the mobile version of a page's content going forward. We aren't creating a separate mobile-first index. We continue to use only one index.
With mobile-first indexing, Googlebot primarily crawls and indexes pages with the smartphone agent. We will continue to show the URL that is the most appropriate to users (whether it's a desktop or mobile URL) in Search results.
If your site has separate desktop and mobile content, which means you have a dynamic serving or separate URLs (m-dot) site, make sure you follow the best practices below to prepare for mobile-first indexing:
- Your mobile site should contain the same content as your desktop site. If your mobile site has less content than your desktop site, you should consider updating your mobile site so that its primary content is equivalent with your desktop site. This includes text, images (with alt-attributes), and videos in the usual crawlable and indexable formats.
- Structured data should be present on both versions of your site. Make sure URLs in the structured data on the mobile versions are updated to the mobile URLs. If you use Data Highlighter to provide structured data, regularly check the Data Highlighter dashboard for extraction errors.
- Metadata should be present on both versions of the site. Make sure that titles and meta descriptions are equivalent across both versions of your site.
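The metadata-parity point in particular is easy to audit. A sketch using Python's html.parser (the HTML snippets you feed it would be your own desktop and mobile markup):

```python
from html.parser import HTMLParser

class MetaGrabber(HTMLParser):
    """Pull <title> and the meta description out of an HTML document."""
    def __init__(self):
        super().__init__()
        self.title, self.description, self._in_title = None, None, False
    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content")
    def handle_data(self, data):
        if self._in_title:
            self.title = data.strip()
            self._in_title = False
    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

def metadata_parity(desktop_html, mobile_html):
    """True when title and meta description match across both versions."""
    pages = []
    for html in (desktop_html, mobile_html):
        g = MetaGrabber()
        g.feed(html)
        pages.append((g.title, g.description))
    return pages[0] == pages[1]
```

Run it across matched desktop/mobile URL pairs and any False result is a page whose mobile metadata has drifted from the desktop version.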
So in your case, you are trying to keep the desktop-first paradigm by cutting the content for mobile. You are probably trying to fit a desktop site into a mobile one, and that is probably your main error. I had the same issue in the past. The best way to deal with it is very simple: literally start with a blank sheet of paper and design your site starting from the mobile version. And that means images, content, graphics, calls to action and so on.
-
RE: Do SEOs really need to care about trend in increase of voice search?
I completely agree with William. Voice search will be the future.
-
RE: Canonical tag on a large site
Hi Cristiana
Answering your question, I will say that adding canonical tags to a site is not an option; it is a requirement, an almost mandatory one. The canonical tag is directly related to duplicate content issues.
From a technical standpoint, you'll need to understand how duplicate content can unintentionally be added to a site. Many times, it's simply a canonicalization issue. For example, homepage canonicalization causes most duplicate content issues on sites.
For example, search crawlers might be able to reach your homepage in all of the following ways:
- https://yoursite.com
- https://www.yoursite.com
- http://yoursite.com
- http://www.yoursite.com
Just to give you an example, in Google Search Console you need to verify these versions of a single domain for a single property:
- http://www.yoursite.com
- https://www.yoursite.com
- http://yoursite.com
- https://yoursite.com
Google will see each URL as a different page – and it won't know which one you prefer to send users to. The problem can get exponentially worse if it exists on every page on your site.
The easiest way to solve the problem is with a server-side redirect that sets one of those URLs as the “official” version of the page, and only serves that version, regardless of which URL was the destination.
You can also use the rel canonical tag; it's a directive that's inserted in the head of the page. It looks like this: <link rel="canonical" href="https://www.yoursite.com/" />
When you're starting SEO on a new site, you'll want to check out all of the canonicalization that's been declared, so that you have a solid understanding of what's going on with the site content.
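The server-side fix described above is just a normalization rule. A Python sketch (the preferred https://www host is an arbitrary choice here; pick whichever version you want as canonical):

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url, scheme="https", host="www.yoursite.com"):
    """Collapse scheme/host variants of a URL onto one preferred version.

    The preferred scheme and host are illustrative defaults; substitute your own.
    """
    parts = urlsplit(url)
    path = parts.path or "/"  # normalize the bare homepage to "/"
    return urlunsplit((scheme, host, path, parts.query, ""))

variants = ["https://yoursite.com", "https://www.yoursite.com",
            "http://yoursite.com", "http://www.yoursite.com"]
canonical = {canonicalize(u) for u in variants}
# All four homepage variants should collapse to a single canonical URL.
```

The server would 301 any non-canonical form to the output of this function, so Google only ever sees one version of each page.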
-
-
RE: Website ranking on Google dropping for unknown reason while rankings are improving on Bing. Please help!
First, you need to keep in mind that Google and Bing use different algorithms, and that means they use different evaluation methods to try to get the best results.
I have been checking your site. Based on my experience, one of the more accurate ways to measure the performance of a site is by checking the Trust Flow (basically, it measures how many of the sites in your neighborhood point to your site), so no matter how many links you have, or the DA, what matters is how many of them belong to your niche.
- Moz: DA 17
- Ahrefs: UR 25
- Ahrefs: DR 2.1
- Majestic: TF 1
As you will notice, your domain rating and Trust Flow are too low. So basically you have links pointing to your site, but they are not passing any value. I suggest you start working on building a link profile.
Regards
-
RE: How does Google rank a "Site:yourexamplesite.com" Query
Yes, usually Google displays the order based on the hierarchy of the pages, usually using the main menu and the internal links, but that is just a personal observation. In some cases I have seen really deep internal pages on the first page, probably based on the rank of the page.
Also, when I evaluated those results, each tool (Moz, Ahrefs, Majestic, Semrush) showed me different results.
-
RE: My competitor is ranking above me for a branded search in Google. How can I come back on top?
Well, in that case, you need to focus on 2 things: internal linking and your backlink profile. Use this search operator:
- site:yourwebsite.com "keyword"
That way, you can add strong internal links using your main keyword. You should also try schemas, adding your brand info, or check all the do-follow links pointing to your competitor and compare that with your own backlink profile. There are several ways to do it, but without any other information, there is not much more advice that comes to mind.
Regards and Good Luck
-
RE: How to find orphan pages
Even Screaming Frog has problems finding all the orphan pages. I use Screaming Frog, Moz, Semrush, Ahrefs and Raven Tools in my day-to-day work, and honestly, Semrush is the one that gives me the best results for that specific task. From experience, I can say that a few months ago I took over a website and it was a complete disaster: no sitemap, no canonical tags, no meta tags, etc.
I ran Screaming Frog and it showed me just 200 pages, but I knew there were many more; in the end I found 5k pages with Semrush. Probably even the Screaming Frog crawler had problems with that website, so I mention this as an experience.
-
RE: Opencart vs. Wordpress/Woo
I have experience working with both. As you mentioned, OpenCart is a nightmare, and you are right (this is just my personal opinion). WordPress offers a better customization level for non-technical people, but it creates a big hole at the security level. I mean, if you have a small store (less than 1,000 per month) it is not a big deal, but beyond that I strongly suggest you move to Shopify; it is a good mix of the flexibility of WordPress with a strong core.
-
RE: SEO friendly H1 tag with 2 text lines
The crawler looks for content, not design. Technically, you are talking about how to style your H1, and that is a CSS topic. You don't have to worry about it as long as your tag and your content are relevant.
-
RE: Duplicated titles and meta descriptions
I would not worry about it, as long as you correctly set up:
- The canonical tags
- The language and region tags
- You submitted the sitemaps for every single region on Search Console
- Another good point to keep in mind is adding schemas to your pages
-
RE: How does Google rank a "Site:yourexamplesite.com" Query
You are talking about **Advanced Google search operators**: Google search operators are special characters and commands that extend the capabilities of regular text searches. In this case, you are referring to the site: operator.
This searches only within a given domain, which is useful when you want to search only within the confines of a particular site. For instance, if I were looking for members on a social network like Twitter:
- site:twitter.com Nicolas Cage
- site:twitter.com Pamela Anderson
You can learn more about search operators here
https://moz.com/learn/seo/search-operators
https://ahrefs.com/blog/google-advanced-search-operators/
https://www.searchenginejournal.com/google-search-operators-commands/215331/ -
RE: Can anyone recommend an SEO workflow?
Hi Thomas, selecting the right author is crucial. I mean, the difference in performance between a good and a bad author/copywriter is not 10%, 20% or even 50%; in my opinion it is 1,000%. Of course, it is not cheap, and it is not easy. These days the word SEO means nothing without content.