Correct, Bing powers AOL and Yahoo.
I don't think DuckDuckGo has a sitemap submission feature. If you list your sitemap in the robots.txt file, though, they may still crawl it.
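In case it's useful, listing a sitemap in robots.txt is just a one-line Sitemap directive. A minimal sketch, with the domain and sitemap URL below as placeholders:

```
# https://www.example.com/robots.txt  (placeholder domain)
User-agent: *
Disallow:

# Any crawler that reads robots.txt can discover the sitemap from this line
Sitemap: https://www.example.com/sitemap.xml
```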
Also, Googlers (a) have to fight pretty much only with algorithms and (b) have to minimize collateral damage. That gives black hatters an advantage and makes it impossible for Google to win completely. What annoys me is that almost every SEO company claims to be 100% white hat and Google-approved, while many are buying links, running link networks, etc. I can't figure out whether they are really that ignorant of the rules or just lying.
I think you over-estimate how smart Googlers are and how dumb shady SEOs are. I've seen blatant tricks that have been working for years. I don't use such tricks because they are very risky, but I don't think we should dismiss them as useless - I think it's worth understanding which work and which don't, especially if our competitors are using them. We have competitors who are using very shady SEO tactics and winning with them. Understanding what they are doing and what is working helps us determine which white hat tactics to deploy against them. That's why I'm asking for data on CTR manipulation.
Yeah, I think this is a good topic to be aware of, even if we wouldn't use it. That way we can respond correctly if our competitors use it, figure out white-hat ways to combat it, have a better understanding of Google's algorithms, etc. If all else fails, it wouldn't be expensive to test these services out.
I generally agree with you that Google is getting better at detecting tricks, but Google is far from perfect and there are a lot of tricks that still work. I'm looking for specific results/data on the CTR manipulation services...
Have you done any tests on whether CTR manipulation services work?
This is not a tactic white hat SEOs would use, but it's still good for us to know whether it works, so we can respond correctly if our competitors use it, figure out white hat ways to combat it, answer client questions, have a better understanding of Google's algorithms, etc.
I've seen a variety of services on the fringe of the SEO world that send a flow of (fake) traffic to your website via Google, to drive up your SERP CTR and site engagement. Seems gray hat, but I'm curious as to whether it actually works.
The latest data I've seen from trustworthy sources (example and example 2) seems mixed on whether CTR has a direct impact on search rankings. Google claims it doesn't. I think it's possible it directly impacts rankings, or it's possible Google is using some other metric to reward high-engagement pages and CTR correlates with that.
Any insight on whether CTR manipulation services actually work?
Thanks, I'll send you a PM.
OK, that's very good to know. I missed that.
Here is the Google source I found that implied that hreflang tags do not combine/consolidate link metrics:
"Generally speaking, the rel-alternate-hreflang construct does not change the ranking of your pages. However, when a page where you use this markup shows up in the search results, we may use this markup to find alternate, equivalent pages of yours. If one of those alternates is a better match for the user, their query language, and the location, then we may swap out the URL. So in practice, it won't change the ranking of your pages, but it will attempt to make sure that the best-suited URL (out of the list of alternates) is shown there." ~John Mueller
This seems to be describing a "swap out" effect rather than a consolidation of metrics. In my mind, that sounds different. It sounds like what John is saying is: "if your main site ranks for the keyword 'barcelona' in English search results and someone searches in Spanish, we'll give you the same ranking, we'll just display your Spanish URL instead". That seems different from a consolidation to me (the Spanish URL isn't being given the link authority from the main URL to help it rank for other Spanish keywords; it's just being swapped out in SERPs where the English URL already ranks). Of course Google hasn't released the details, so I'm guessing a bit here.
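For reference, this is roughly what the rel-alternate-hreflang markup under discussion looks like when the language versions sit on separate subdomains; the example.com subdomains below are placeholders, and each version would list itself plus all of its alternates:

```html
<!-- On https://www.example.com/ (English), repeated reciprocally on each alternate -->
<link rel="alternate" hreflang="en" href="https://www.example.com/" />
<link rel="alternate" hreflang="es" href="https://es.example.com/" />
<link rel="alternate" hreflang="de" href="https://de.example.com/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```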
"Gianaluca was speaking of multiple "sites" but this is translation."
My issue is where multiple sites and translations are the same thing, i.e. when you have different language versions of your site on different subdomains. Gianaluca seems to be saying that hreflang will not consolidate link authority across your sites that are in different languages. Here's another source saying the same thing: https://www.semrush.com/blog/7-common-hreflang-mistakes-and-how-to-fix-them/
I've got a situation now where it appears that Google is not consolidating/sharing link signals efficiently between the language versions that are hosted on separate subdomains. My concern is that part of the issue may be the fact that the different language versions are on different subdomains. That's why I'm keen to know why Moz excepts language-specific websites from their "no subdomains" advice.
Any thoughts?
"use hreflang and that acts like a canonical"
Google and other sources don't indicate that hreflang will pass/consolidate link authority. So I think hreflang and canonical tags are different in that regard. Based on that and what I've seen, I don't see that hreflang tag would negate the disadvantages of a subdomain. If you have evidence it does, though, I am very interested!
"But you can't change server location with subdirectories. With subdomain you can make de.example.com and place this in German server and es.example.com and place this in Spanish server."
What you're talking about is geographic targeting, but Moz was specifically referring to language-targeting. Those are similar, but they are subtly different things.
Language-specific sites are not necessarily targeted to a specific country. They can target multiple countries (e.g. Spanish speakers in the US, Spain, Mexico, etc.), or you might have two language-specific sites targeting the same country (e.g. an English site and a Spanish site, both for the US).
So if a language-specific site isn't a geographic-targeted site, I still don't understand why Moz would recommend a subdomain in that case.
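To make that language-vs-country distinction concrete, hreflang values can target a language alone or a language-country pair. A minimal sketch, with placeholder URLs and site structure:

```html
<!-- Language-only targeting: one Spanish page for Spanish speakers everywhere -->
<link rel="alternate" hreflang="es" href="https://es.example.com/" />

<!-- Language + country targeting: separate pages per market -->
<link rel="alternate" hreflang="es-us" href="https://es.example.com/us/" />
<link rel="alternate" hreflang="es-mx" href="https://es.example.com/mx/" />

<!-- Two languages targeting the same country -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/" />
```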
In Moz's domain recommendations, they recommend subdirectories instead of subdomains (which agrees with my experience), but make an exception for language-specific websites:
Since search engines keep different metrics for domains than they do subdomains, it is recommended that webmasters place link-worthy content like blogs in subfolders rather than subdomains. (i.e. www.example.com/blog/ rather than blog.example.com) The notable exceptions to this are language-specific websites. (i.e., en.example.com for the English version of the website).
Why are language-specific websites excepted from this advice? Why are subdomains preferable for language-specific websites? Google's advice says subdirectories are fine for language-specific websites, and GSC allows geographic settings at the subdirectory level (which may or may not even be needed, since language-specific sites may not be geographic-specific), so I'm unsure why Moz would suggest using subdomains in this case.
Google is getting much better at recognizing location, but I would still work to include it on the page in a few places. That's where I've seen the best results. My recommendation:
I don't see anything that I would think would trigger that. Let me PM you the URL.
Thanks for the suggestions!
The homepage, category, and product pages have all lost traffic.
So far, I haven't found any noteworthy changes in content.
I've been wondering if this might be part of the issue.
I've reviewed Majestic link data, and only see a few deleted backlinks, so I'm thinking it's not a backlink issue.
Thanks for the suggestion. So far the only significant difference in optimization I've found has been that they added Schema.org markup.
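In case it helps with the comparison, Schema.org markup on product pages is typically JSON-LD along these lines. A minimal sketch with placeholder values (not the competitor's actual markup):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Product",
  "offers": {
    "@type": "Offer",
    "price": "49.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "87"
  }
}
</script>
```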