Hi Ron10,
if you consider that this question has been answered, please mark it as such.
If not, please ask for more specific answers.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
Links in PDFs do pass link equity. Be sure, though, that the PDFs are correctly saved as text, not as images.
I warmly suggest you read this great guide on SEO for PDFs by Lunametrics.
Hi,
I suggest you both give a read to this post by DejanSEO, which is quite clear and, IMHO, points to the right interpretation of a somewhat confusing best practice.
One fundamental question: is the redirection based on IP detection or on user agent?
Because if it is based on IP detection, that's something Google deprecates.
If instead it is based on user agent, then 302 and 301 are substantially the same for Google in this case, though with a 301 you are sure to pass link equity.
On the other hand, why haven't you considered these two options:
First of all, remember that the hreflang annotation is not necessarily needed on every page.
That said, which method to use, in-code or via the sitemaps, really depends on your devs' capabilities.
Both work fine; what you should not do is use both at the same time, because that increases the possibility of creating contradictory hreflang annotations.
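To make the two methods concrete, here are minimal sketches (the URLs and language codes are invented for illustration). With the in-code method, every page variant lists the full set of alternates, itself included, in its <head>:

```html
<!-- Hypothetical URLs: each variant carries the same set of annotations -->
<link rel="alternate" hreflang="en-us" href="https://example.com/en-us/page" />
<link rel="alternate" hreflang="es-es" href="https://example.com/es-es/page" />
```

The sitemap method moves the same annotations into a <url> entry instead:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://example.com/en-us/page</loc>
    <xhtml:link rel="alternate" hreflang="en-us" href="https://example.com/en-us/page"/>
    <xhtml:link rel="alternate" hreflang="es-es" href="https://example.com/es-es/page"/>
  </url>
</urlset>
```

Pick one of the two and keep it as the single source of truth for the annotations.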
I'd add one condition:
do we really want our images to rank in Google Images? Maybe we don't, especially since Google Images is no longer the great source of traffic it was before its layout changed (we can see photos directly inside Google without clicking the "see the page" link). If that is the case, I would not prioritize the image sitemaps.
But if you want your images indexed, picked up (and hotlinked) by other sites, or even used without credit but with you able to find out who did it, then the image sitemap is needed:
here is an example schema directly from Google: http://www.google.com/schemas/sitemap-image/1.1/sitemap-image.xsd.
I don't know what CMS you are using, but if it is WordPress, the WordPress SEO by Yoast plugin automatically generates the image sitemaps.
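For reference, a minimal image sitemap entry looks like this (the URLs are hypothetical; the namespace is the one from the Google schema linked above):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://example.com/sample-page.html</loc>
    <image:image>
      <image:loc>https://example.com/images/photo.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```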
Here are the answers:
the hreflang tag doesn't slow down website performance. It can have an impact only if you have hundreds of hreflang markups in the HTML code; in that case it is better to use the sitemap.xml implementation.
your site is only in English, hence you can't use hreflang: it is an "alternate", therefore it always needs a pair (e.g. an "en-US" page pointing to a version for another language or region). Using it only in a self-referential way (as we sometimes do with rel="canonical") is wrong, and Google will report it as a mistake and not consider it.
if you really want to target only the US public, then you must geotarget the domain in Search Console, going to its "International Targeting" section and selecting the United States as the country targeted by the site.
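To illustrate the "pair" point: hreflang only makes sense once there are at least two variants. If, hypothetically, the site later added a UK version, both pages would carry something like:

```html
<!-- Hypothetical example: both variants list the full set of alternates -->
<link rel="alternate" hreflang="en-us" href="https://example.com/us/" />
<link rel="alternate" hreflang="en-gb" href="https://example.com/uk/" />
```

With a single English site, only the Search Console geotargeting applies.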
Keyword density is a myth, as has been demonstrated for several years! Please don't spread myths in the Moz Q&A.
Some sources:
Dirk's answer points to some potential answers.
That said, when I click on your SERP's link, I see other sitelinks (just two):
As Dirk pointed out, your site has detected my IP (quite surely, though maybe it is the user agent), and when I click on the second sitelink I see this URL: http://www.revolveclothing.es/r/Brands.jsp?aliasURL=sale/all-sale-items/br/54cc7b&&n=s&s=d&c=All+Sale+Items.
The biggest problem with IP-based redirections is that they hurt both SEO and usability:
There's a solution:
performing the IP redirection only the first time someone clicks a link to your site, and only if that link does not correspond to the country version from which users and bots are arriving;
presenting links to the other country versions of your site, so that:
bots will follow those links and discover those versions (without being redirected again);
users are free to go to the version of your site they really need (without being redirected again when coming from those country-selector links).
That said, it would be better to use a system like Amazon's, which does not force a redirection based on IP, but detects it and shows an on-screen alert, something like: "We see that you are visiting us from [Country X]. You may prefer visiting [URL to user's country site]".
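A bare-bones sketch of such a notice (wording, markup and URL are all invented for illustration, not Amazon's actual implementation):

```html
<!-- Hypothetical banner shown after server-side IP detection, instead of a forced redirect -->
<div class="country-notice">
  <p>We see that you are visiting us from Spain.
     You may prefer <a href="https://www.example.es/">our Spanish site</a>.</p>
  <button type="button">Stay on this site</button>
</div>
```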
Then I checked the hreflang implementation, and it seems it was implemented correctly (at least after a very quick review with Flang).
I searched for "Revolve clothing" from Spain, in incognito mode with non-personalized search, and it correctly showed me the Spanish website and Spanish sitelinks;
I tried the same search from Spain but letting Google consider my user agent (set up for English in search), and I saw the .com version and English sitelinks (which is fine).
Remember, sitelinks are decided by Google and we can only demote them.
To conclude, I think the real cause should be looked for not in a genuine international SEO issue (but do check the IP redirection), but in a possible and more general indexation problem.
Dimitri's answer is wrong! (Sorry, Dimitri.)
The hreflang's href doesn't pass any link equity (and please use this term, not "link juice" :-)).
It is a rel="alternate" annotation and has no connection with things like 301s.
EGOL was right to ask for more information, also for one precise reason: on some websites a "thin page" may be the best thing the site can offer a visitor, because that page answers exactly what the user needs from it.
That is why Googlers so often say that thin content per se is not a problem.
It's a problem if it is due to some technical issue or to bad on-page SEO (e.g. a page with a photo but no caption or written description of that photo).
So, to better answer your question, we need to know more about the nature of those thin pages you are talking about.
P.S.: using "noindex, follow" is no longer suggested by Googlers. In fact, a few months ago John Mueller declared that if Google sees a page with noindex,follow for a long time, it will start treating the "follow" as a "nofollow", so the original reason for using it won't be satisfied.
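For clarity, this is the tag being discussed:

```html
<!-- Per John Mueller's statement, the "follow" here may eventually be treated as "nofollow" -->
<meta name="robots" content="noindex, follow" />
```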