Page Authority for localized version of website
-
Hello everyone,
I have a case here where I need to decide which steps to take to improve page authority (and thus SEO value) for the German pages on our site. We localized the English version into German at the beginning of 2015.
www.memoq.com - English
de.memoq.com - German
By October 2015 we had implemented hreflang tags so that Google would index the pages according to their language. That implementation has been successful. There is one issue, though: at that time, all our localized pages had only "1" point for Page Authority ("PA" in the MozBar). At first we thought this could be due to the fact that localization was done using a subdomain (de.memoq.com) rather than a subfolder (www.memoq.com/de). However, we decided not to implement changes and to let Google assess the work we had done with the hreflang tags.
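For reference, a cross-domain hreflang setup between the two homepages would look something like the sketch below (URLs taken from the post above; the https scheme and the optional x-default line are assumptions, not confirmed from the site). Each page must list itself and every language variant, and the annotations must be reciprocal on both domains:

```html
<!-- Placed in the <head> of www.memoq.com and, identically, of de.memoq.com -->
<link rel="alternate" hreflang="en" href="https://www.memoq.com/" />
<link rel="alternate" hreflang="de" href="https://de.memoq.com/" />
<!-- Optional: fallback for users whose language matches neither variant -->
<link rel="alternate" hreflang="x-default" href="https://www.memoq.com/" />
```

If either side omits the return link, Google treats the annotations as invalid and may ignore them.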
It's been a while now, and all our German pages still have only "1" point for Page Authority. Plus, we have keywords for which we rank in the top 10 in English (US Google search), but this is not the case for the translated versions of those keywords in German (Germany Google search).
So my question basically is:
Is this lack of page authority and SEO value rooted in the fact that we used a subdomain instead of a subfolder for the URL structure? If so, is it likely that Page Authority and SEO value for the German pages will increase if I change the structure from subdomains to subfolders?
Or is the problem with PA rooted somewhere else that I am missing?
I appreciate your feedback.
-
Correct. I just confirmed that this is how Mozscape handles the hreflang tag. We do not yet transpose the actual value of the canonical version onto the hreflang variant. In theory, you should assume the Page Authority of your homepage is identical to the Page Authority of the de.* variant. At least, that appears to be the way hreflang is supposed to be handled.
-
Hi,
Thanks for your quick reply; I look forward to your next answer. So the subdomain/subfolder issue is not the actual reason for the "1" point on every German page?
-
Page Authority is based on the links pointing to the page. There are only a handful of links on the web pointing to the de.* version of your site, so it wouldn't have any independent Page Authority. Now, my guess is that Mozscape simply does not currently project PA through to all of the hreflang variants. I am double-checking on this now and should have an answer for you soon.
Related Questions
-
Ranking for combined version of keyword but not separated version
Hi All, My site is currently ranking on page 1 for the term "golfholidays" but is ranking at the bottom of page 3 for the term I am targeting and have optimised for, which is "golf holidays". Does anyone have any experience with a combined keyword ranking above the separated version? Nowhere on my page does it mention the term "golfholidays", and backlinks to my site mostly use the anchor "golf holidays". Thanks!
Technical SEO | | Andy94120 -
Getting a high priority issue for our xxx.com and xxx.com/home as duplicate pages and duplicate page titles; can't seem to find anything that needs to be corrected. What might I be missing?
I am getting a high priority issue reporting our xxx.com and xxx.com/home as both duplicate pages and duplicate page titles in the crawl results. I can't seem to find anything that needs to be corrected; what might I be missing? Has anyone else had a similar issue, and how was it corrected?
Technical SEO | | tgwebmaster0 -
3,511 Pages Indexed and 3,331 Pages Blocked by Robots
Morning, So I checked our site's index status in WMT, and I'm being told that Google is indexing 3,511 pages and the robots are blocking 3,331. This seems slightly odd, as we're only disallowing 24 pages in the robots.txt file. In light of this, I have the following queries: 1. Do these figures mean that Google is indexing 3,511 pages and blocking 3,331 other pages? Or does it mean that it's blocking 3,331 of the 3,511 indexed? 2. As there are only 24 URLs being disallowed in robots.txt, why are 3,331 pages being blocked? Will these be variations of the URLs we've submitted? 3. Currently, we don't have a sitemap. I know, I know, it's pretty unforgivable, but the old one didn't really work and the developers are working on the new one. Once submitted, will this help? 4. I think I know the answer to this, but is there any way to ascertain which pages are being blocked? Thanks in advance! Lewis
Technical SEO | | PeaSoupDigital0 -
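For what it's worth, the gap between 24 Disallow lines and 3,331 blocked URLs usually comes from pattern rules: a single robots.txt line can match an entire section of a site, not just one page. A hypothetical illustration (the paths are invented, not taken from the site in question):

```
User-agent: *
# One line, but it blocks every URL under /search/ (could be thousands):
Disallow: /search/
# One line, but it blocks every URL carrying this query parameter:
Disallow: /*?sessionid=
```

So "24 pages disallowed" is really "24 rules", and each rule can match many crawlable URLs.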
Can you noindex a page, but still index an image on that page?
If a blog is centered around visual images, and we have specific pages with high quality content that we plan to index and drive our traffic with, but we have many pages with only our images... what is the best way to go about getting these images indexed? We want to noindex all the pages with just images because they are thin content. Can you noindex,follow a page, but still index the images on that page? Please explain how to go about this.
Technical SEO | | WebServiceConsulting.com0 -
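One common approach (a sketch, not a guaranteed recipe): a robots meta noindex applies to the HTML page itself, not to the image file, so the image can still be indexed as long as Google can crawl the image URL and discovers it, for example via an image sitemap. The file path below is hypothetical:

```html
<!-- On the thin image page: keep the page out of the index, let its links be followed -->
<meta name="robots" content="noindex, follow" />

<!-- The image file itself (e.g. /images/photo.jpg) stays crawlable:
     do NOT Disallow its URL in robots.txt, and list it in an image sitemap -->
<img src="/images/photo.jpg" alt="descriptive alt text" />
```

The key point is that the page and the image file are separate URLs with separate indexing controls.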
Results Pages Duplication - What to do?
Hi all, I run a large, well-established hotel site which fills a specific niche. Last February we went through a redesign which implemented pagination and lots of PHP/SQL wizardry. This has left us, however, with a bit of a duplication problem which I'll try my best to explain! Imagine Hotel 1 has a pool as well as a hot tub. This means that Hotel 1 will be in the search results of both 'Hotels with Pools' and 'Hotels with Hot Tubs', with exactly the same copy, affiliate link and thumbnail picture in the search results. Now imagine this issue occurring hundreds of times across the site and you have our problem, especially since this is a Panda-hit site. We've tried to keep any duplicate content away from our landing pages with some success, but it's just all those pesky PHP paginated pages which are doing us in (e.g. Hotels/Page-2/?classifications[]263=73491&classifcations[]742=24742 and so on). I'm thinking that we should either a) completely noindex all of the PHP search results or b) move over to a JavaScript platform. Which would you guys recommend? Or is there another solution which I'm overlooking? Any help most appreciated!
Technical SEO | | dooberry0 -
What has happened to my page rank
Hi, my PageRank for the site www.in2town.co.uk was 4, then last week it went down to 2, and now it is 0. I really do not understand what has happened. Can anyone please give me advice on what is going on?
Technical SEO | | ClaireH-1848860 -
2 links on home page to each category page ..... is page rank being watered down?
I am working on a site whose home page contains 2 links to each category page: one text link and one image link. I think I'm right in thinking that Google will only pay attention to the anchor text/alt text of the first link that it spiders, with the anchor text/alt text of the second being ignored. This is not my question, however. My question is about the page rank that is passed to each category page... Because of the double links on the home page, my reckoning is that PR is being divided up twice as many times as necessary. Am I also right in thinking that if Google ignores the 2nd identical link on a page, only one lot of this divided-up PR will be passed to each category page rather than 2 lots... hence horribly watering down the 'link juice' that is being passed to each category page? Please help me win this argument with a developer and improve the ranking potential of the category pages on the site 🙂
Technical SEO | | QubaSEO0 -
How to attach alt text to images that other websites use from my website
I often have other websites link to my website. They will do this with an image that they pull off of my website (actually, my website continues to serve the image). These inbound links are great, but they don't have alt text. Is there a way for me to attach alt text to the images, or is this something the other websites need to code themselves?
Technical SEO | | EugeneF0
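For context, alt text lives in the embedding page's markup, not in the image file itself, so it is something the linking site controls. A hypothetical example of what the other site's code looks like (the domain and path are placeholders):

```html
<!-- On the other website's page; the src still points at your server -->
<img src="https://www.example.com/images/photo.jpg"
     alt="alt text chosen by the embedding site, not by the image's host" />
```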