Posts made by Advanced-Air-Ambulance
-
RE: Can anyone please explain the real difference between backlinks, 301 links, and redirect links? Which one is better to rank a website? I am looking for help with one of my websites
- A backlink is a link to your website posted on another website. Search engines catalog these and use them as an indicator of your website's popularity and legitimacy. The more popular and legitimate the site that linked to you, the more legitimate you look in the search engine's eyes.
- A 301 and a redirect link are essentially the same thing. When you navigate to an address (like google.com, for example) the computers talk to each other before the website is displayed. A key part of that communication is HTTP status codes. You may already be familiar with one of them: 404 (Page Not Found!). 301 (permanent redirect: the page has moved permanently) and 302 (temporary redirect: the page has moved temporarily) are two more. They tell the browser that the page has moved and that it will be redirected. When search engines find a page that has been moved with a redirect, they take the legitimacy built up by its backlinks and apply it to the new page. For example, say you have a domain called GoodVacuumCleaners.com and you want to move to BestVacuumCleanersEver.com. You would move the website to the new domain and set a 301 redirect on the old one so your ranking juice carries over to the new domain.
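For reference, here's a minimal sketch of what sending a 301 yourself looks like in PHP (the domains are just the example ones above; most setups would do this at the server or host level instead):
<?php
// On the old domain: permanently redirect every request to the new domain
header( 'Location: https://BestVacuumCleanersEver.com' . $_SERVER['REQUEST_URI'], true, 301 );
exit;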
-
RE: Is it worth keeping a decades-old domain that's merely 301 redirecting to the main domain?
An old domain may have gathered many useful links that can boost your main domain's profile. I vote keep.
-
RE: Google has discovered a URL but won't index it?
Hey Daniel. I agree with Chris. I have also noticed slow indexation recently. It might be a pain in the arse, but maybe you should request indexing for each page individually in Search Console to add them to the high-priority queue.
-
RE: Meta Description for ALL my Posts is Being Overwritten!
Hey Mike. Not to fret, this is actually a common mistake in theme development. The meta description shouldn't be set in the WordPress theme because that is handled through the wp_head function. This function is the hook Yoast and other plugins use to deliver all of their SEO-optimized meta and header tags.
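To illustrate (a rough sketch, not Yoast's actual code), this is the kind of hook a plugin uses to print that tag for you:
<?php
// Hypothetical example: hooking into wp_head to output a meta description
add_action( 'wp_head', function () {
    echo '<meta name="description" content="Your optimized description here">' . "\n";
} );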
-
RE: Moz report not updated
Hey Sacha. If the update was supposed to happen today, I've noticed it sometimes refreshes later in the day rather than earlier. If you have any further problems, I suggest you contact Moz Support directly. https://moz.com/help/contact
-
RE: Website homepage temporarily getting removed from google index
Hey.
I'm going to bounce off of Will here and recommend you check Search Console first. If you are de-indexed, it should display the reason on URL inspection, under index status. If you are really getting de-indexed and re-indexed within days, you are being recrawled frequently, so at least that's one good sign.
-
RE: Meta Description for ALL my Posts is Being Overwritten!
This is wrong, but also right. It is a theme problem. The blog description is a description of the blog as a whole, not a description of the individual page. In WordPress, you should not set a meta description explicitly in the theme file; that's the job of wp_head, especially if you are using an SEO plugin to manage your descriptions. It's a common mistake that is easily remedied, as evidenced by his reply.
-
RE: Competing Pages for Ranking Keywords?
Yeah, I don't disagree with you. Algorithmic filtering and ranking don't always work exactly as intended, and there are ways to work around them, as you point out. Generally, cannibalization shouldn't occur, but it does happen in some circumstances.
-
RE: Meta Description for ALL my Posts is Being Overwritten!
Sounds like a WordPress issue. I suspect there is a meta description defined in your theme file. Check that this isn't the case.
It will look like this or very similar:
get_bloginfo('description')
Get rid of the whole meta tag that contains this php code.
If you are unsure how to find this, it will most likely be in your theme's header file. You can find it by going to the theme's folder and looking for header.php.
You can also do this through the WordPress Dashboard > Appearance > Theme Editor. On the right-hand side, select "Theme Header".
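For reference, the line you're hunting for in header.php will look something like this (your theme's exact markup may vary):
<meta name="description" content="<?php echo get_bloginfo( 'description' ); ?>">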
-
RE: Shopify Site with Multiple Domains?
I checked the page source. You can do this by heading to any web page, right-clicking, and selecting "View page source".
In Shopify, you can output the canonical url using the canonical_url object. I found this here: https://www.shopify.com/partners/blog/canonical-urls
<link rel="canonical" href="{{ canonical_url }}">
-
RE: Shopify Site with Multiple Domains?
I concur. Check with the business owner to see why both domains are operable. If there isn't a technical reason, redirect all traffic to the main domain.
-
RE: Shopify Site with Multiple Domains?
Hello Mike,
This may be hurting your presence on one or the other site. Duplicate content is typically frowned upon. You can get around this by simply assigning a canonical URL in the head of the HTML. A canonical URL tag tells the search engine "hey, the content is also accessible here but the official page is actually at example.com".
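For illustration, the tag itself is just a single line in the head (example.com is a placeholder):
<link rel="canonical" href="https://example.com/">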
I went ahead and checked the HTML of both pages and they already have a canonical URL set to the main website address you posted. So it looks like you're good to go!
-
RE: Competing Pages for Ranking Keywords?
No, thankfully you won't run into cannibalization between these pages. Google and other search engines may serve the more targeted page if it's determined to be more relevant; if it's not, you'll keep getting the page that's already ranked. It won't pull your other page down. However, if you are just duplicating the content from the main page, the search engine may choose to skip indexing the new pages altogether.
Overall, this should help and not hurt.
-
RE: Keyword in alt text or keyword in the body?
Search engines are smart enough to distinguish between body content and alt text. Alt text is typically used for indexing images. I don't believe it counts toward your content keyword count, but it can be a signal of page relevance.
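For illustration, keyword-in-alt-text just means something like this (a made-up example):
<img src="cordless-vacuum.jpg" alt="cordless vacuum cleaner for pet hair">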
-
RE: Single Folder vs Root
I recommend Joseph's approach. There are many benefits to it: manageability, scalability, and SEO. You can cover all the practice areas available in each specific location, as well as rank the firm more strongly in each location through keyword relevance.
-
RE: Backlink quality vs quantity: Should I keep spammy backlinks?
I should also clarify: these may hurt you if they are your only links. If you have very few quality links, spam links may cause Google and other search engines to mistakenly flag you as spam. So be careful and be on the lookout for especially suspicious spam links. The balanced approach is the best approach: don't worry, but stay aware!
Here is a more technical write-up from Moz that I recommend: https://moz.com/help/link-explorer/link-building/spam-score
-
RE: Backlink quality vs quantity: Should I keep spammy backlinks?
No problem Liana.
- That is correct. Google understands that you don't have control over third-party sites, so instead of penalizing you, they minimize or discount the effect those spammy links have.
- Yes, but only kind of. It may or may not increase PA/DA, but according to Google it shouldn't hurt you.
But yeah, that's the gist of it! Instead of taking the time to investigate and disavow links, you could spend that time cultivating relationships with other websites and businesses that could give you nice, quality links.
Hope this answer works for you.
-
RE: If few backlinks got deleted how they affect website?
Hi Julia,
It really depends on the quality of those backlinks. Let's use the Moz scoring system for this example. If you are losing 30 links with very low page authority (PA), low domain authority (DA), and a high spam score, you will not be affected much, if at all. You shouldn't worry in this case. However, if you are losing 30 links with high PA, or especially high DA, you might be affected considerably, and should work to keep those links or get new links with similar PA and DA scores.
-
RE: Backlink quality vs quantity: Should I keep spammy backlinks?
Hi Liana,
As far as spammy links go, Google has gotten good at detecting whether or not they are intentional, a.k.a. black hat. If they aren't, Google does not penalize you for these links, so it's best to leave them alone.
As far as a strategy for generating links to your website, always focus on high quality over quantity. High-quality links give you far more return than a high quantity of bad links.
I recommend this article from Google on when and how to disavow links.
https://support.google.com/webmasters/answer/2648487?hl=en
In short, you rarely ever need to disavow links, even if they have a high spam score. You are only hurt when Google senses you are gaming the system; if they detect or suspect unethical backlinking, you will be penalized with a "manual action". You can check whether you were penalized, as well as disavow flagged backlinks, in Google Search Console.
-
RE: Is submitting a disavow list is helpful in link analysis
Google has an easy-to-read article for you about when and how to disavow links.
https://support.google.com/webmasters/answer/2648487?hl=en
In short, rarely do you ever need to disavow links, even if they're spammy. They have smart analysis built-in to detect whether a spammy link is just spam or if it is the result of unethical SEO practices trying to game the system, otherwise known as "black hat SEO". In the case that they detect or suspect unethical backlinking, you will be hit with a "manual action". You can check for these, as well as disavow links, in the Google Search Console.
-
RE: Why Google not disavow some bad links
Hi Mark,
Don't fret too much about spammy links. According to Google, they try to limit the negative effects of third-party sites.
Spammy backlinks that bring you down usually come with a "manual action", which is a big penalty sent down from the big man Google himself. These actions are issued when the spammy links appear deliberate, paid for, or otherwise "black hat SEO".
Unless you've experienced one of these penalties, or want to pre-empt one, you don't need to disavow them. If you want to check for manual actions or would still like to disavow these links and keep track of them, use the Google Search Console.
-
RE: Finding the reason behind not Ranking-UP
You should also check if there have been any manual penalties set against you. You can do this through the Google Search Console.
-
RE: Finding the reason behind not Ranking-UP
You've pointed out that you're beating them on both on-page and off-page SEO. My guess is that your user experience scores are very low. The latest Google update significantly increased the weight of user experience signals in the algorithm.
I was able to confirm my suspicions using Google's PageSpeed test (which now includes Core Web Vitals metrics).
While your content and off-page SEO are great compared to your competition, that doesn't mean much if your site doesn't load or takes too long to load. You need to cut your FCP, LCP, and TTI down significantly.
-
RE: Can the login page be updated (the tab order)?
That is odd. I've tried out 3 different password managers, not including the built-in Apple and Google ones, and I haven't encountered this issue.
I looked into why this could happen. I inspected the page source of the login page (https://moz.com/login) and it has valid markup with every field clearly marked (for example, the password field is marked as a password by 3 separate HTML attributes), so there's no error on Moz's end.
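For example, the kind of markup I mean looks something like this (an illustration, not Moz's exact code):
<input type="password" name="password" id="password" autocomplete="current-password">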
Perhaps you should file a ticket with your password manager's support team.
-
RE: How can I identify most relevant websites in Mexico that create content about a specific term?
Hi Harol,
You can identify the top players for a keyword in an area by setting a Search Profile. You can do this with the MozBar extension.
- Install the MozBar
- Go to Google or Bing or Yahoo
- On the top left of the MozBar you will see a "Search Profiles" dropdown; click the last option, "Add New Profile"
- You don't need to fill out the whole form; you can just select Mexico, but if you want a more specific location, fill out more of the form
- Refresh the page and search
-
RE: Using GeoDNS across 3 server locations
Hi Keith,
I meant the physical bandwidth - i.e. your time. I probably should've been more clear in a technical forum!
For the architecture, there are a few common setups. What I am in the middle of doing here at my company runs through Google Cloud services: duplicating the website app or script (i.e. WordPress, Ghost, Drupal, another CMS, a Python app, a Rails app, etc.) across the several servers and using a load balancer to direct users to the fastest server. In the app's configuration I am using a single database server, also set up on Google Cloud, so when one server executes a command, it is reflected for all users on all servers. If you're cron-jobbing syncs across all the servers you have set up but have no common database, you're going to have integrity issues, with some servers having certain comments or edits and other servers not.
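To make that concrete, here's a minimal sketch of the shared-database piece for a WordPress setup (the hostname and credentials are placeholders, not my actual config):
<?php
// wp-config.php on every app server points at the same database host
define( 'DB_NAME', 'example_site' );
define( 'DB_USER', 'example_user' );
define( 'DB_PASSWORD', 'change-me' );
define( 'DB_HOST', 'shared-db.internal.example:3306' );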
-
RE: Using GeoDNS across 3 server locations
Personally, I would use the one domain. And from what you've said, you would prefer it as well.
Thankfully, rankings are on a domain basis and not an IP basis, so there would be no issue in the first scenario. If you are duplicating and synchronizing the servers, you are better off using the one domain because you aren't creating two separate websites with differing content (UK English vs US English).
Do you have the bandwidth or ability to produce separate versions (for each domain) for each area you want to target? If not, you are best off generalizing your website to target all English users instead of en-US, en-GB, etc. You're going to have to weigh your geotargeting goals against your budget.
-
RE: How do i beat my spammy competitors? they are ranking on page 1 of google!
Your best strategy in beating a spammy website is following the best practices of SEO and web development.
You've already outlined two key components you will need: high-quality backlinks and rich, relevant content. As you've pointed out, generating a wide network of backlinks takes a lot of time and many relationships. That is a big reason why links carry so much value in SEO. Although link building is not as effective as it once was, due to the increasing number of ranking variables, it still holds a lot of equity amongst the signals Google uses. Your .edu strategy is a noteworthy one, as only established institutions are afforded that domain space.
Because you will be outmatched at first on off-page SEO like backlinks, focus on the on-page factors. These include content relevance, content quality, page speed, and others. You will find a lot of resources available here on Moz that you can refer to.
https://moz.com/learn/seo/on-site-seo
https://moz.com/learn/seo/on-page-factors
-
RE: Using GeoDNS across 3 server locations
The way GeoDNS works is through one of two methods: split DNS or load balancing. The end result is the same: the user is directed to their closest or fastest available server.
Theoretically, this helps achieve a major goal of technical SEO - great site speed.
With this year's Google Core Web Vitals update, site speed and user experience have been further notched up as ranking factors. To get more technical: LCP (Largest Contentful Paint, how quickly the largest asset on a page loads) and FCP (First Contentful Paint, how quickly the first legible content appears on screen) are site speed signals Google uses in its ranking algorithm. By connecting a user to the closest/fastest server available, you can bring down LCP and FCP times and thereby increase your rank. The rank change may not be immediately noticeable depending on the competitiveness of your keywords and industry. You can measure these and other variables here: https://developers.google.com/speed/pagespeed/insights/
In short: No, your SEO won't be negatively impacted, and it will more likely be positively impacted by these optimizations.
-
RE: How does low difficulty keywords help us to rank for high difficulty keywords?
Ranking for low difficulty keywords creates kind of a domino effect. By ranking higher for those keywords, you might be earning links or establishing higher domain authority. In doing so, your website generates better metrics and becomes more competitive for higher difficulty keywords.
Basically, as long as your content is relevant for all keywords, high and low difficulty, the strategies you'll typically use to boost one will help both.