Multisite domain
-
Good morning. I have a WordPress site with multisite activated. The site currently has a domain authority of 8. When I publish a post it is indexed quite quickly, but if I publish a post in another language under /es it takes 24 hours. Why? If the domain is the same, why do those posts take longer to be indexed on Google? Thank you.
-
Hello,
Although a WordPress multisite benefits from a single domain authority, you can still notice slower indexing for posts in languages other than English.
Google prioritizes indexing content in languages it understands and already ranks well for. Your site's focus on English content gives Google a deeper understanding of it, potentially leading to faster indexing for English posts. Content in less familiar languages might take longer, especially if Google has less data about your site's authority in those languages.

User engagement signals also influence indexing speed. If your English content receives more user interaction, Google might prioritize future English posts based on that positive response. Content in other languages might initially receive less engagement, which leads Google to observe user behavior before fully indexing it.
Also, make sure your translations maintain the same level of optimization as your English content. This includes translation accuracy, relevant keyword usage in the respective languages, and proper meta descriptions for each post. Search engines typically index high-quality content faster.
Search engines crawl websites at a frequency based on factors such as website size, update frequency, and overall site health. If your English content subdirectory (e.g., /) receives more updates or has more pages, it might be crawled more often, leading to faster indexing. Double-check for any technical issues specific to your non-English subdirectories: broken links, slow loading times, or missing sitemaps can delay indexing for those sections of your website.
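A quick way to spot those two issue types (broken links and slow responses) is to spot-check the pages yourself. Here's a minimal sketch using only Python's standard library; the example.com URLs and the two-second threshold are assumptions for illustration, so swap in your own /es pages and a budget that fits your host:

```python
import time
import urllib.request
import urllib.error

# Hypothetical URLs for illustration; replace with pages from your /es section.
PAGES = [
    "https://example.com/es/",
    "https://example.com/es/blog/",
]

SLOW_THRESHOLD = 2.0  # seconds; an assumed budget, tune it for your host


def check_page(url, timeout=10):
    """Fetch a URL; return (HTTP status, or None on network failure, elapsed seconds)."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            status = resp.status
    except urllib.error.HTTPError as err:
        status = err.code  # server answered, but with an error status
    except (urllib.error.URLError, OSError):
        status = None  # DNS failure, timeout, refused connection, etc.
    return status, time.monotonic() - start


def classify(status, elapsed, slow_threshold=SLOW_THRESHOLD):
    """Label a check result as 'broken', 'slow', or 'ok'."""
    if status is None or status >= 400:
        return "broken"
    if elapsed > slow_threshold:
        return "slow"
    return "ok"


def report(pages=PAGES):
    """Print one classification line per page."""
    for url in pages:
        status, elapsed = check_page(url)
        print(f"{url}: {classify(status, elapsed)} (status={status}, {elapsed:.2f}s)")
```

Calling `report()` prints one line per page; anything flagged broken or slow is a candidate for the crawl-delay problems described above.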
Though the root domain stays the same, these factors can account for the indexing speed differences you're seeing.
My suggestions:
- Focus on relevant keywords in the respective languages and write accurate meta descriptions for each post.
- Use tools like Google Search Console (GSC) to track indexing status and identify any technical issues specific to your non-English subdirectories.
- Keep the sitemap for each language subdirectory up to date so Google is aware of all content and can crawl it efficiently.
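On that last point, Google supports language annotations directly in the sitemap, so each URL can declare its alternates in other languages. A minimal sketch with hypothetical example.com URLs (each URL should list all of its alternates, including itself):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://example.com/post-title/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="https://example.com/post-title/"/>
    <xhtml:link rel="alternate" hreflang="es" href="https://example.com/es/titulo-del-post/"/>
  </url>
  <url>
    <loc>https://example.com/es/titulo-del-post/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="https://example.com/post-title/"/>
    <xhtml:link rel="alternate" hreflang="es" href="https://example.com/es/titulo-del-post/"/>
  </url>
</urlset>
```

This helps Google connect the language versions of each post, which can speed up discovery of the translated pages.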