After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
Correct, Bing powers AOL and Yahoo.
I don't think DuckDuckGo has a Sitemap submission feature. If you list your sitemap in the robots.txt file, though, maybe they will still crawl it.
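For reference, pointing crawlers at a sitemap from robots.txt takes a single directive (the URL here is a placeholder; use your own absolute sitemap URL):

```text
Sitemap: https://www.example.com/sitemap.xml
```

Most crawlers that read robots.txt will pick the sitemap up from there, no submission form required.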
Also, Googlers (a) have to fight pretty much only with algorithms and (b) have to minimize collateral damage. That gives black hatters an advantage and makes it impossible for Google to completely win. What annoys me is that almost every single SEO company claims to be 100% white hat and Google-approved, while many are buying links, running link networks, etc. I can't figure out if they are really that ignorant of the rules or if they are just lying.
I think you overestimate how smart Googlers are and how dumb shady SEOs are. I've seen blatant tricks that have been working for years. I don't use such tricks because they are very risky, but I don't think we should dismiss them as useless - I think it's worth understanding which work and which don't, especially if our competitors are using them. We have competitors who are using very shady SEO tactics and winning with them. Understanding what they are doing and what is working helps us determine which white hat tactics to deploy against them. That's why I'm asking for data on CTR manipulation.
Yeah, I think this is a good topic to be aware of, even if we wouldn't use it. That way we can respond correctly if our competitors use it, figure out white-hat ways to combat it, have a better understanding of Google's algorithms, etc. All else failing, it wouldn't be expensive to test them out.
I generally agree with you that Google is getting better at detecting tricks, but Google is far from perfect and there are a lot of tricks that still work. I'm looking for specific results/data on the CTR manipulation services...
Have you done any tests on whether CTR manipulation services work?
This is not a tactic white hat SEOs would use, but it's still good for us to know whether it works, so we can respond correctly if our competitors use it, figure out white hat ways to combat it, answer client questions, better understand Google's algorithms, etc.
I've seen a variety of services on the fringe of the SEO world that send a flow of (fake) traffic to your website via Google, to drive up your SERP CTR and site engagement. Seems gray hat, but I'm curious as to whether it actually works.
The latest data I've seen from trustworthy sources (example and example 2) seems mixed on whether CTR has a direct impact on search rankings. Google claims it doesn't. I think it's possible it directly impacts rankings, or it's possible Google is using some other metric to reward high-engagement pages and CTR correlates with that.
Any insight on whether CTR manipulation services actually work?
Thanks, I'll send you a PM.
OK, that's very good to know. I missed that.
Here is the Google source I found that implied that hreflang tags do not combine/consolidate link metrics:
"Generally speaking, the rel-alternate-hreflang construct does not change the ranking of your pages. However, when a page where you use this markup shows up in the search results, we may use this markup to find alternate, equivalent pages of yours. If one of those alternates is a better match for the user, their query language, and the location, then we may swap out the URL. So in practice, it won't change the ranking of your pages, but it will attempt to make sure that the best-suited URL (out of the list of alternates) is shown there." ~John Mueller
This seems to be describing a "swap out" effect rather than a consolidation of metrics. In my mind, that sounds different. It sounds like what John is saying is that "if your main site ranks for the keyword "barcelona" in English search results, if someone searches in Spanish we'll give you the same ranking, we'll just display your Spanish URL instead". That seems different from a consolidation to me (the Spanish URL isn't being given the link authority from the main URL to help it rank for other Spanish keywords, it's just being swapped out in SERPs where the English URL already ranks). Of course Google hasn't released the details so I'm guessing a bit here.
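For context, the markup being discussed would look something like this (hypothetical domains; each language version must list itself and every alternate, and the annotations must be reciprocal):

```html
<link rel="alternate" hreflang="en" href="https://www.example.com/" />
<link rel="alternate" hreflang="es" href="https://es.example.com/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```

Note that, per the John Mueller quote above, these tags signal equivalence for URL swapping in SERPs - they are not documented to pass or consolidate link authority the way a canonical does.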
"Gianaluca was speaking of multiple "sites" but this is translation."
My issue is where multiple sites and translations are the same thing, i.e. when you have different language versions of your site on different subdomains. Gianaluca seems to be saying that hreflang will not consolidate link authority across your sites that are in different languages. Here's another source saying the same thing: https://www.semrush.com/blog/7-common-hreflang-mistakes-and-how-to-fix-them/
I've got a situation now where it appears that Google is not consolidating/sharing link signals efficiently between the language versions that are hosted on separate subdomains. My concern is that part of the issue may be the fact that the different language versions are on different subdomains. That's why I'm keen to know why Moz exempts language-specific websites from their "no subdomains" advice.
Any thoughts?
"use hreflang and that acts like a canonical"
Google and other sources don't indicate that hreflang will pass/consolidate link authority. So I think hreflang and canonical tags are different in that regard. Based on that and what I've seen, I don't see that the hreflang tag would negate the disadvantages of a subdomain. If you have evidence it does, though, I am very interested!
So, I was reading Percy Jackson & the Olympians: The Battle of the Labyrinth tonight, and found a description of a graffiti tag in the Labyrinth that says "MOZ RULZ". What do you think? Is one of the Mozzers a demigod who has explored and survived the Labyrinth?
(See attachment for book quote.)
A few suggestions:
I can think of a few options for the homepage:
Hope this helps!
Sure, it's possible. Site size is generally not a ranking factor. Not directly, at least. As Jim points out, the larger site may be more useful to the user, which would attract more backlinks, etc.
Bottom line: Focus on making your site the best. If that means adding more great content and making it bigger, great. If you can make it great without adding more pages, that's fine, too.
Billy gave you some good advice, but I disagree with him on one point. Domain authority doesn't necessarily have anything to do with whether a link is dangerous or not. You can get a spammy link from a domain with high authority or a great link from a lower authority domain.
You want to get rid of links that Google will view as intended primarily to manipulate search rankings instead of providing value to the user.
Some factors to consider:
Hope that helps!
Alan's answer is spot on.
Avoid JavaScript, or carefully follow best practices so Google can crawl all content without executing JavaScript. Alan is exactly right when he says: never count on Google crawling through your JavaScript to find content, but also never count on being able to hide anything from Google via JavaScript.
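As a sketch of the safe pattern (hypothetical markup): serve the content in the initial HTML and use JavaScript only to enhance it, so a crawler that doesn't execute scripts still sees everything:

```html
<!-- The content exists in the server-rendered HTML, so crawlers can
     read it even without running any JavaScript. -->
<div id="reviews">
  <p>Great product, five stars.</p>
</div>
<script>
  // JavaScript only enhances what is already on the page
  // (sorting, paging, styling) - it does not inject the content itself.
  document.getElementById('reviews').classList.add('enhanced');
</script>
```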
Whether you use PHP or not is not directly relevant. PHP is executed server-side, so Google never sees your PHP code and doesn't care. What matters is whether your programmer writes PHP code that will generate search-friendly HTML, URL structure, etc.
Depending on what the pages are, I would let them stay indexed so they can rank for any long-tail keywords, then perhaps 301 redirect them to the main sale page. Or leave them up with a notice that the product is no longer available and display related products.
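If you go the 301 route, the rule itself is simple. For example, on an Apache server the .htaccess entry would look like this (paths are hypothetical placeholders):

```apache
# Permanently redirect a discontinued product page to the main sale page
Redirect 301 /products/old-widget /sale
```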
Well, you're probably getting more traffic, and/or you may need to optimize your file sizes.
How much bandwidth have you used?
Usually images or videos are what gobbles up bandwidth. Are you hosting any large images, videos, or files?
Load time can be a server issue, a software/coding issue, and/or a filesize issue.
4GB of bandwidth in 4 days is a lot - that's how much my site that gets 30K+ uniques per month has used. Based on Alexa rank, it looks like your site doesn't get that much traffic? Have you had an uptick in traffic?
You can check page and file sizes at http://tools.pingdom.com/fpt/ That tool shows your homepage and associated files have a total size of 1MB. To use 4GB of bandwidth at that rate, you'd need ~4,000 pageviews.
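To double-check that estimate, here's the arithmetic as a quick sketch (the 1 MB and 4 GB figures come from this thread; swap in your own numbers):

```python
# How many pageviews would it take to consume a bandwidth quota,
# given an average page weight? Figures are from this thread.
page_weight_mb = 1.0   # homepage + all assets, per the Pingdom report
bandwidth_gb = 4.0     # bandwidth consumed over 4 days

pageviews = (bandwidth_gb * 1024) / page_weight_mb
print(f"~{pageviews:,.0f} pageviews")  # prints "~4,096 pageviews"
```

If your actual traffic is well below that number, something other than normal pageviews (hotlinked images, bots, large downloads) is eating the bandwidth.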
You could run your top pages through that tool and see if any of the embedded images or other files are too large.
You could also check your log files to see if someone has embedded an image from your site, you're getting crawled by robots, etc.
Hope that helps!
Aye.
Oh, and hey! I just noticed I know you from NOLA.
7/15/2016 Infographics might be overdone in content marketing, but your brand can still use them creatively.
3/5/2015 Page load speed is an important factor in optimizing website performance. These tips will have your ecommerce pages loading faster in no time.
4/11/2013 Measuring traffic and social metrics for your own site is easy – just take a peek in Google Analytics for a wealth of traffic, conversion, and social data. Analyzing your competitors' blogs isn't so easy, but it is important. Proper competitive analysis can lead to new content ideas, better outreach, better guest posting opportunities, insights on marketing, and other helpful data that can improve your marketing ROI. In short, if you’re not analyzing your competitors’ websites, you're missing out.
11/12/2012 Since the infamous Penguin Update, many webmasters have been scrambling to remove backlinks Google may have penalized them for (a.k.a. high risk links, toxic links, bad links, etc.). But how do you determine which links are toxic and should be removed?
12/15/2011 Google is indisputably seated at the throne in the free internet app kingdom. It has 45+ tools in its arsenal, yet most marketers use and are familiar with only a few of them. One slightly lesser-known tool your marketing campaign should not live without is Google Insights for Search. Here are six of the ways Google Insights can help fine-tune your keyword strategy:
Entered the world of online marketing in 2003. My fortes are SEO and conversion rate optimization.