
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO

Discuss site health, structure, and other technical SEO strategies.


  • Hi there, I have an issue with the number of internal links on my web pages. The Moz campaign manager reports a lot of 'too many on-page links' issues: over 7,000 of them.
    I know the importance of a good internal linking structure. 1. Keeping the number of internal links down (to roughly 100 or fewer) helps pass authority through from authoritative pages.
    2. Too many internal links can use up the crawl budget, so crawlers may no longer crawl the complete website (right?), which can cause problems with indexing new web pages (right?). This is the situation: the website is a webshop. The header contains 6 links, the footer contains 32 links, the homepage contains 42 links, and the body content of some category pages contains anywhere from 30 to a maximum of 100 links. Product pages contain a maximum of 25 links. There is no problem here. Now here's the problem: the website navigation is a dropdown menu that contains 167 links to tier-2 pages. These links are very important for our visitors, who can immediately find the right category or product through them. Removing or shrinking this dropdown is not an option, but the dropdown navigation is causing all of the 'too many on-page links' issues. Question: is there an SEO (indexing, PA) problem in this situation which I should solve? If so, what should I solve and how should I solve it? Note: the pages have good organic positions and authority. Thanks a lot. Marcel

    | MarcelMoz
    0

  • Hi, I'm working on a brand new website. I haven't even started my link building yet, just added it to local directories. I slowly started getting rankings on the 3rd page of Google, then a few weeks ago my rankings fell for all the keywords, so now the website doesn't even rank on the 10th page. It's been like this for a few weeks now. Here's a screenshot of the website: http://screencast.com/t/wDWk8sxLw Thanks for your help

    | mezozcorp
    0

  • Hi, something very unusual is happening with my site. A few months ago, before we upgraded our site, we would be indexed every day; now it seems we are getting indexed every four days. And today I noticed something very strange: this morning when I checked my site www.in2town.co.uk it was cached on 31st July, but this afternoon it is showing 25th July. Can anyone please let me know why this has happened? Many thanks

    | ClaireH-184886
    0

  • Hey guys, I'm in the process of redesigning my company website. The website runs on the Kentico CMS and I'm now moving to the Drupal CMS. The main reasons for the move are that I need a more flexible, responsive design and better social integration. Currently I'm having to deal with a developer who can't be bothered to update the site, and I also find Kentico extremely difficult to work with. The site has good Google rankings, and I was thinking about setting up 301 redirects when the move is completed. The site has been live for over 2 years and I have tons of blog posts sitting in it. How can I get these moved? When this is done, will they lose their 'age'? What if I recreate the blog posts as they are and then set up 301 redirects to them (see the example rule sketched below)? I have G+ authorship; will that be affected? Last question: will having a responsive site affect my SEO rankings once moved? I've heard that a 301 is not considered entirely 'safe' by Google, so what are my options? If anyone can share links to tutorials or best-practice articles on site upgrades and redesigns from an SEO perspective, I'll be grateful. Thank you very much.

    | Suganthan
    0
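
    A minimal sketch of the per-URL 301 redirects the question above is asking about, assuming the new Drupal site runs on Apache with mod_alias available; the paths are hypothetical placeholders, not the site's real URLs:

        # .htaccess sketch (hypothetical paths): map each old Kentico blog URL
        # to its new Drupal path with a permanent redirect
        Redirect 301 /blog/old-post-name.aspx /blog/old-post-name
        Redirect 301 /articles/another-old-post.aspx /blog/another-old-post

    Mapping the old URLs one-to-one like this is generally what preserves most of the accumulated link equity and "age" after a CMS move; a 301 is the standard, Google-supported way to signal a permanent move.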

  • Hi, I just received a "nice" message in my WMT: "Unnatural links to your site—impacts links. Google has detected a pattern of unnatural, artificial, deceptive, or manipulative links pointing to pages on this site. Some links may be outside of the webmaster's control, so for this incident we are taking targeted action on the unnatural links instead of on the site's ranking as a whole. Learn more." Has anyone here come across a message like this before? If so, any suggestions on what to do next? Would love some help! Thanks

    | Tit
    0

  • For years now, whenever we recreate a site, we always set up both an XML sitemap and an HTML sitemap page. Stupid question maybe, but what is the value of having an HTML sitemap on the site?

    | Pete4
    0

  • I have a forum at http://www.onedirection.net/forums/ which contains a gallery with thousands of very thin-content pages. We currently have these photo pages disallowed for the main Googlebot via robots.txt, but we do allow the Google Images crawler access. I've been reading that we shouldn't really use disallow here, and should instead add a noindex tag on the page itself. It's a little awkward to edit the source of the gallery pages (and to keep any amends the next time the forum software gets updated). What's the best way of handling this (see the header-based sketch below)? Chris.

    | PixelKicks
    0
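
    One hedged option for the gallery question above is to send the noindex signal as an HTTP header instead of editing the gallery templates, assuming the forum runs on Apache 2.4+ with mod_headers and that the gallery URLs share a common path; /forums/gallery/ is an assumed placeholder pattern:

        # .htaccess sketch: noindex thin gallery pages without touching templates
        <If "%{REQUEST_URI} =~ m#^/forums/gallery/#">
            Header set X-Robots-Tag "noindex, follow"
        </If>

    Note that for crawlers to see the header, the robots.txt disallow on those URLs would need to be lifted; a page that is blocked from crawling can't have its noindex read.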

  • We have been getting multiple 404 errors in GWT that look like this: http://www.example.com/UpdateCart. The problem is that this is not a URL that is part of our structure; it is only a piece. The actual URL has a query string on the end, so if you take the query string off, the page does not work. I can't figure out how Google is finding these pages. Could it be stripping the query string? Thanks.

    | Colbys
    0

  • Hello, I have been brought in at the last minute to consult for an e-commerce client who is about to relaunch their website. The site currently receives 8,000 visits a month, 3,100 of which are from organic search, and they have a few thousand product pages. The web development firm they are using is changing all of the old product page URLs to 'search engine friendly' URLs for the new site, which is expected to launch in a few weeks. However, they are not planning to include 301 redirects from the old URLs. Other than simply stating 'this will be bad for your SEO', what would be a good way of explaining to the client how much of a problem it will be if their new site launches without 301s? For example, is this a big enough issue to delay the launch of the site or get into a contract dispute with the web developer?

    | stageagent
    0

  • Hey, I have an immigration website in South Africa,
    MigrationLawyers.co.za, and the website used to be divided into two categories:
    1st part - South African Immigration
    2nd part - United Kingdom Immigration. Because of that, we made all the pages include the words "South Africa" in the titles, e.g.:
    ...ers.co.za/work-permit-south-africa
    ...ers.co.za/spousal-visa-south-africa
    ...ers.co.za/retirement-permit-south-africa
    ...ers.co.za/permanent-residence-south-africa I'm sure you get the idea.
    We have since removed the UK part of the website and are now left only with the SA part. My question is: is this bad? Will Google see it as spammy, since I'm targeting "South Africa" in almost every URL of the website? Should I stick to this structure for new pages, or try to avoid any more use of "South Africa"? Perhaps I can change something as it currently stands? Kind regards,
    Nikita

    | NikitaG
    0

  • I am suddenly seeing a ton of crawl errors in Webmaster Tools. Almost all of them are URLs coming from scraper sites that I do not own. Do you see these in your Webmaster Tools account? Do you mark them as "fixed" if they are on a scraper site? There are waaaay too many of these to set up redirects for. Thanks!

    | EGOL
    0

  • I am facing a steady and worsening drop in Google traffic and I am struggling to find the real reasons. The site is in the video entertainment niche. Here is some data regarding the site in general:
    The site redesign happened around June
    We have not done much off-site work or link building in 2013, besides social activities and content distribution to partner sites (the top links in OSE were gained a long time ago)
    GWT crawl errors are clean (besides 404s)
    No manual actions from Google or penalty notifications
    Serious fluctuations in indexed content in June, but no real effect on traffic
    Removal of old content during August
    Top keywords are dropping (may be caused by the removal of category pages and a flattening of the site structure)
    The robots file disallows only search-generated pages
    No sitemap errors or warnings
    2K duplicate meta titles and descriptions
    Thank you, I appreciate your help.

    | dimicos
    1

  • Is there an easy way or any way to get a list of all deindexed pages? Thanks for reading!

    | DA2013
    0

  • Say you are listed with both Google Places and Google Local. Places still allows custom categories, while Local limits you to preset categories. Which is the better strategy: to build service pages following the custom services available in Places, or to build out service pages following the (allowed) preset categories in Local?

    | waynekolenchuk
    0

  • Wondering if I could pick the brains of fellow Mozzers. I've been working with a client for about 3 months now to get their site up in the engines. In the three months the DA has gone from about 11 to 34 and the PA is 40 (up from about 15), so that's all good. However, we don't seem to be moving up the rankings much. The average DA of competitors in the top ten for the niche is 25, and we have 9.2 times the average number of backlinks too. During a call with the client today they told me that they noticed a major drop in their rankings a few months back. They didn't say this when we started the project.
    I just searched for the first paragraph of their homepage and it returns 16,000 hits in Google; the second returns 9,600 and the third 1,400. Searching for the first paragraph of their 'about us' page gives me 13,000 results! Clearly something is not right here. Looking into this, it seems that someone has used their content, word for word, as the descriptions on thousands of blogs and social sites. I am thinking that this, tied in with the slow movement in the listings, has caused a duplicate content penalty in the search engines. The client hasn't copied anyone's content, as it is very specific to their site, yet it appears all over the web. I have advised them to change their site content ASAP and hope a Panda refresh comes in to view the new unique content. Once the penalty is off I expect the site to shoot up the rankings.
    From an SEO company point of view, should I have seen this before? Maybe. If they had said they suffered a major drop in rankings a few months back, when they dropped their SEO agency, I would have looked into it, but one doesn't naturally assume that a client's copy will be posted all over the web; it is not something I would have searched for without a reason to. Any thoughts on this, either for or against my theory, would be most welcome. Thanks, Carl

    | GrumpyCarl
    0

  • We made an update to the robots.txt file this morning, after the initial download of the robots.txt file. I then submitted the page through Fetch as Googlebot to get the changes in ASAP. The cache timestamp on the page now shows Sep 27, 2013 15:35:28 GMT, which would put the cache timestamp at about 6 hours ago. However, the Blocked URLs tab in Google WMT shows the robots.txt as last downloaded 14 hours ago, and it is therefore showing the old file. This leads me to believe that for robots.txt the cache date and the download time are independent. Is there any way to get Google to recognize the new file other than waiting this out?

    | Rich_A
    0

  • Do href links that leave a site and use target="_blank" to open a new tab impact SEO?

    | ChristopherGlaeser
    0

  • Hi, We are planning to launch sites specific to each target market (geographic location), but the products and services are similar in all of those markets as we sell software. Here's the scenario: our target markets are all English-speaking countries, i.e. Britain, the USA and India, and we don't have the option of using ccTLDs like .co.uk, .co.in etc. How should we handle the content? The product, its features, the industries it caters to and our services are common irrespective of market. Whether we go with sub-directories or sub-domains, the content will be in English. So how should we craft the content? Is writing unique content for the same product three times the only option (see the hreflang sketch below)? Regards

    | IM_Learner
    0
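
    A hedged sketch of how hreflang with sub-directories could handle the market-targeting question above; the domain and folder names are placeholders, not the poster's real site:

        <!-- in the <head> of each page, assuming /uk/, /us/ and /in/ sub-directories -->
        <link rel="alternate" hreflang="en-gb" href="http://www.example.com/uk/product-page/" />
        <link rel="alternate" hreflang="en-us" href="http://www.example.com/us/product-page/" />
        <link rel="alternate" hreflang="en-in" href="http://www.example.com/in/product-page/" />
        <link rel="alternate" hreflang="x-default" href="http://www.example.com/product-page/" />

    With hreflang in place the pages can share largely similar English copy while still signalling which version is meant for which market, so rewriting the content three times is not the only option; each sub-directory can also be geo-targeted separately in Webmaster Tools.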

  • I am working on an osCommerce site and struggling to get it to rank even for the domain name. Moz is showing a huge number of 302 redirects and duplicate content issues, but the web developer claims they cannot fix those because 'that is how the software in which your website is created works'. Do you have any experience with osCommerce? Are the 302 redirects and duplicate content errors negatively affecting the ranking?

    | Web-Incite
    0

  • Hi, Regarding schema.org microdata, which page(s) should have the microdata? 1) http://schema.org/Physician appears to be about the office. Since we have all of the contact/address info in the footer on each page, should we do the same with the microdata? I can't seem to find a suggested implementation on schema.org. 2) Assuming an office has multiple MDs, how should the doctors be listed, since the Physician schema appears to be for the office, not for the individual doctors (see the sketch below)? Thanks for any insight!

    | Titan552
    0
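
    A hedged sketch for the Physician question above: mark up the practice once per page (for example around the footer block) and list individual doctors as Person items via the employee property. The names and details are invented placeholders:

        <div itemscope itemtype="http://schema.org/Physician">
          <span itemprop="name">Example Medical Office</span>
          <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
            <span itemprop="streetAddress">123 Placeholder St</span>,
            <span itemprop="addressLocality">Springfield</span>
          </div>
          <span itemprop="telephone">(555) 555-0100</span>
          <!-- each doctor as a Person employed by the practice -->
          <div itemprop="employee" itemscope itemtype="http://schema.org/Person">
            <span itemprop="name">Dr. Jane Placeholder</span>, <span itemprop="jobTitle">Cardiologist</span>
          </div>
        </div>

    This treats the Physician type as the office (as the question suspects) and the doctors as people attached to it; whether each doctor also deserves a dedicated page with their own Person markup is a separate judgement call.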

  • I use BigCommerce, and they have a system where all the URLs are dynamically generated from the name of each product. So if I named a product "widget x y z" the url would be /widget-x-y-z/, and if I changed that to "blue widget x y z", it would change to /blue-widget-x-y-z/ and automatically redirect the old one to the new one. As a result, in 6 months, because of a lot of tweaking and experimenting, I've ended up with a hefty list of 400 redirects. Some of them are very old, and some are recent. So my question is in two parts: a) does having all of these redirects hurt my rankings? b) if so, would deleting them help?

    | shabbirun
    0

  • My traffic just fell off a cliff and I cannot figure out what has happened! Help!

    | PatriciaBeth
    0

  • We have received messages in Google Webmaster Tools that there is an increase in soft 404 errors. When we check the URLs, they redirect to the 404 not found page:
    For example, http://www.geographics.com/images/01904_S.jpg
    redirects to http://www.geographics.com/404.shtml.
    When we used Fetch as Google, here is what we got:
    #1 Server Response: http://www.geographics.com/404.shtml
    HTTP/1.1 200 OK Date: Thu, 26 Sep 2013 14:26:59 GMT
    What is wrong and what should we do (see the sketch below)? The soft 404 errors are mainly for images that no longer exist on the server. Thanks!

    | Madlena
    0
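
    For the soft-404 question above, the core issue visible in the fetch is that /404.shtml answers with an HTTP 200 status, so Google treats the missing images as live pages. A minimal sketch, assuming Apache and that /404.shtml stays as the error page:

        # .htaccess sketch: serve the custom error page with a real 404 status
        # instead of redirecting missing files to it
        ErrorDocument 404 /404.shtml

    With the redirect removed and ErrorDocument in place, a request for a deleted image returns the friendly page but with a genuine 404 (or 410) status, and the soft 404 reports should clear over time.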

  • Hi there, Our website www.snowbusiness.com has a non-www version, and that version has 398 backlinks. What is the best way of transferring this link value if I establish the www address as the canonical URL (see the sketch below)? Thanks, Ben

    | SnowFX
    0
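
    A hedged sketch of the usual approach for the question above: a site-wide 301 from the non-www host to the www host, assuming Apache with mod_rewrite; the domain comes from the question, the rest is a generic pattern:

        # .htaccess sketch: 301 non-www to www
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^snowbusiness\.com$ [NC]
        RewriteRule ^(.*)$ http://www.snowbusiness.com/$1 [R=301,L]

    The 301 is what passes most of the existing link value to the www version; setting the preferred domain in Webmaster Tools and updating internal links round it out.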

  • Hi, I have just migrated from a custom-written PHP/MySQL site to a site using WordPress and WooCommerce. I couldn't believe the drop in speed. I am using a few plugins for WordPress (contact forms, social sharing) and a few WooCommerce plugins for taking payment etc. I am serving images, CSS and JS through W3 Total Cache and MaxCDN hoping to speed the site up, but tools at http://tools.pingdom.com/fpt sometimes show that the time between browser request and reply can be between 1 and 15 seconds. I have searched all day looking for a post I read about two months ago about a tool that looks at server response and redirect processing etc., hoping it would help, but I can't find it. If anyone knows what I am talking about I would appreciate a link. The site is http://www.synergy-health.co.uk and an example of an inner page is http://www.synergy-health.co.uk/home/shop/alacer-emergen-c-1000-mg-vitamin-c-acai-berry-30-packets-8-4-g-each/ Any suggestions please? Perhaps I have W3 Total Cache set up wrong? Also, as the site has tanked and been in freefall in the Google rankings since January, would this be a good time to change the URL structure from home/shop/product to domain-name/brand/product? Thanks in advance!

    | StephenCallaghan
    0

  • I am doing SEO on a site which runs on WP, and all pages and categories are duplicated on domain.com/site/. However, when it got crawled I saw that all domain.com/ pages have a rel=canonical tag pointing to the main page (does that mean anything?). The thing is, I will fix the permalink structure, and I think WP automatically redirects if it is changed from /?page_id= to /%category%/%postname%/ or /%postname%/. Isn't there something I'm missing? The second problem is a forum. After a crawl it found over 5k errors and over 5k warnings. These are: duplicate page content; duplicate page titles; overly-dynamic URLs; missing meta descriptions; title elements too long. All of these come from domain.com/forum/ (fortunately, there are no domain.com/site/forum duplicates). What could be an easy solution to this?

    | OVJ
    0

  • Hi! How are you? I've been working on some of my sites and noticed that I'm getting lots of crawls from search engines that I'm not interested in ranking well in. My question is the following: do you have a list of 'badly behaved' search engines that take up lots of bandwidth and don't send much (or much good) traffic? If so, do you know how to block them using robots.txt (see the sketch below)? Thanks for the help! Best wishes, Ariel

    | arielbortz
    0
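
    A hedged sketch of the robots.txt pattern the question above asks about; the bot names are placeholders for whatever crawlers show up in the server logs, and only bots that actually honour robots.txt will obey it:

        # robots.txt sketch: block specific crawlers entirely (names are examples)
        User-agent: ExampleBadBot
        Disallow: /

        User-agent: AnotherScraperBot
        Disallow: /

        # everyone else remains unrestricted
        User-agent: *
        Disallow:

    Crawlers that ignore robots.txt have to be blocked at the server level instead, for example by user-agent or IP address.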

  • Good morning... I have an ecommerce site running on Magento, and the sitemap is automatically generated by Magento based on the categories, sub-categories and products. I have recently created new categories that I want to replace the old categories, but both are in the auto-generated sitemap. The old categories are "active" (they still exist if you know the URL to type) but not visible (you can't find them just by navigating through the site). The new category pages are active and visible. If I want Google to rank one page (the new category page) and not the old page (the old category page), should I remove the old page from the sitemap? Would removing the old page that used to target the same keywords improve my rankings for the newer category page (see the sketch below)? The sitemap currently contains:
    www.example.com/oldcategorypage
    www.example.com/newcategorypage
    Did I confuse you yet? Any help or guidance is appreciated. Thanks,

    | Prime85
    0
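
    For the old/new category question above, removing the old URL from the sitemap on its own usually isn't enough; a 301 from the old category to the new one is the more common approach. A minimal sketch, assuming Apache and reusing the placeholder paths from the question:

        # .htaccess sketch: consolidate the old category into the new one
        Redirect 301 /oldcategorypage /newcategorypage

    That points any rankings and links the old page earned at the page meant to rank, and the old URL can then simply be dropped from the sitemap.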

  • Hi guys, A client of ours has a website with a very bad link profile. We addressed this issue and migrated the website to another domain. We redirected the bad website (cornelisbedding.be) to the new domain (cornelisbedding.com) with a 302 redirect, because we didn't want to pass on the bad link juice. The problem we are having now is that we can't afford to lose the redirect on cornelisbedding.be; we would lose too much traffic, because the old domain still has a lot of links that generate good-quality traffic. I have read that Google will treat 302 redirects as 301s in the long run, and we really want to avoid this.
    We were thinking of using a meta refresh with a delay on it, but in Google's eyes that would be considered spammy. Are there any other suggestions on how to handle this? Thank you!

    | Jacobe
    0

  • I use this schema for my page:
    <div itemscope itemtype="http://schema.org/VideoObject">
      Video: <span itemprop="name">Interview with the Foo Fighters</span>
      <meta itemprop="duration" content="t1m33s" />
      <meta itemprop="thumbnail" content="foo-fighters-interview-thumb.jpg" />
      <object>
        <param ... />
        <embed type="application/x-shockwave-flash" ... />
      </object>
      <span itemprop="description">Catch this exclusive interview with Dave Grohl and the Foo Fighters about their new album, Rope.</span>
    </div>
    But I cannot make it work on my website: the thumbnail does not appear. For the thumbnail I use, for example:
    <meta itemprop="thumbnail" content="http://www.mywebsite.com/diretion-of-the-foto.jpg" />
    Not only does the photo not appear on the page, there is also no button to start the video; just a blank page is shown as the result of the code. Can someone help me with a working example from their own website (see the sketch below)? Thanks

    | maestrosonrisas
    0
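
    A hedged sketch of VideoObject microdata for the question above, using an HTML5 video element instead of the Flash embed; the file names and URLs are placeholders, and thumbnailUrl, uploadDate and contentUrl are included because video snippets generally rely on them:

        <div itemscope itemtype="http://schema.org/VideoObject">
          <h2>Video: <span itemprop="name">Interview with the Foo Fighters</span></h2>
          <meta itemprop="duration" content="PT1M33S" />
          <meta itemprop="thumbnailUrl" content="http://www.example.com/thumbs/foo-fighters-interview.jpg" />
          <meta itemprop="uploadDate" content="2013-09-01" />
          <video itemprop="contentUrl" controls
                 src="http://www.example.com/videos/foo-fighters-interview.mp4"
                 poster="http://www.example.com/thumbs/foo-fighters-interview.jpg"></video>
          <span itemprop="description">Catch this exclusive interview with Dave Grohl and the Foo Fighters about their new album.</span>
        </div>

    The blank page in the original attempt points to broken HTML around the object/embed rather than to the microdata itself; once the page renders, the Structured Data Testing Tool will show whether the properties are being picked up.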

  • I am confused about what is going on. I recently converted my website to WordPress. I kept the same URL structure to the best of my ability (minus the .html) and 301'd all .html URLs to their trailing-slash equivalents, so whatever was www.domain.com/page.html is now www.domain.com/page/ (see the sketch below). The old CMS was a proprietary CMS built by the company who originally designed the website. It was poorly put together and really buggy, and I decided to move to WordPress because of its many limitations, particularly for SEO. Now traffic is down about 10% and it looks like the downward trend will continue; impressions are also down by about 75%. Any thoughts on what could be happening?

    | mike_sif
    1
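
    For the migration above it is worth double-checking that the .html redirects really are single-hop 301s. A minimal sketch of the kind of rule involved, assuming Apache with mod_rewrite; this is a generic pattern, not the poster's actual configuration:

        # .htaccess sketch: 301 old .html URLs to their trailing-slash equivalents
        RewriteEngine On
        RewriteRule ^(.+)\.html$ /$1/ [R=301,L]

    Chained redirects (e.g. /page.html to /page to /page/) or 302s showing up where 301s were intended are common, easy-to-miss causes of the kind of traffic and impression drop described.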

  • My website is fairly new, it was launched about 3.5 months ago. I've been publishing new content daily (except on weekends) since then. Is there any reason why new posts don't get indexed faster? All of my posts gets +1's and they're shared on G+, FB and Twitter. My website's at www.webhostinghero.com

    | sbrault74
    0

  • Hello everyone, About 1-2 weeks ago I implemented rich snippets (microdata) for the product pages of my e-commerce site. However, in Webmaster Tools, Google is saying that the crawlers did not detect any structured data on my site. I have also checked my pages using the Structured Data Testing Tool; you can see an example test result at the following address. http://www.google.com/webmasters/tools/richsnippets?q=http%3A%2F%2Fwww.tarzimon.com%2Fproduct%2Fnaif-tasarim-torr-aydinlatma-1031 What might be causing this problem? Thank you for your help

    | hknkynr
    0

  • Hi, I was analyzing my site's backlinks and see that a lot of the linking domains are not indexed by Google.
    I assume Google has deindexed them; maybe these domains are penalized. Should I request removal of my site's links from these pages? What is the recommended approach in such a case? Thank you for your input.

    | VaiSam
    0

  • I run various sites that use geolocation to place related links in navigation menus on a page. For example, if you land on the home page, we will see that you are in Florida and then, in one of the content boxes on the page, show job listings that this site has in Florida. We also give the option to search for other jobs or use other navigation options. The idea is to try to help the user along the best we can, but... what opinions do people have on whether these links should be nofollowed, given that Googlebot will always see links to places in California etc., i.e. wherever Googlebot happens to be crawling from? Would this be confusing, as we are a site focused on the entire US and not just California? Thanks!

    | CleverPhD
    0

  • Hi, I am building links to a page, www.companyname.com/category.index.php. There is also another, similar URL, www.companyname.com/category.index.php#, and this page is linked to from the non-# page. This is a new client and I'm not entirely sure why that link is there. Am I correct in thinking that these two URLs are different in the eyes of the search engines? If so, would some of the link juice to www.companyname.com/category.index.php be transferred to www.companyname.com/category.index.php# and affect the ranking of the non-# page? I hope this makes sense! Thanks

    | sicseo
    0

  • Trying to set a preferred domain in GWT. The site is verified via Google Analytics and a meta tag in the code, but GWT still asks: "Part of the process of setting a preferred domain is to verify that you own http://site.org/. Please verify http://site.org/." I've tried looking for an answer to no avail; am I missing anything?

    | vmialik
    0

  • Quick question for you all - Is there an issue with me having an H1 tag physically below an H2 tag on a web page??

    | Pete4
    0

  • Hi Mozzers, I need some advice on how to tackle one of my client's websites. We have just started doing SEO for them, and after Moz crawled the e-commerce site it detected 36,329 errors, 37,496 warnings and 2,589 notices, all going up! Most of the errors are due to duplicate titles and page content, but I cannot identify where the duplicate pages come from. These are the duplicate-page URLs Moz detected (unfortunately I cannot share the website for confidentiality reasons):
    • www.thewebsite.com/index.php?dispatch=categories.view&category_id=233&products_per_00&products_per_2&products_per_2&products_per_2&page=2
    • www.thewebsite.com/index.php?dispatch=categories.view&category_id=233&products_per_00=&products_per_00&products_per_2&products_per_2&products_per_2&page=2
    • www.thewebsite.com/index.php?dispatch=categories.view&category_id=233&products_per_00=&products_per_00&products_per_2&page=2
    • www.thewebsite.com/index.php?dispatch=categories.view&category_id=233&products_per_2=&products_per_00&page=2
    • www.thewebsite.com/index.php?dispatch=categories.view&category_id=233&products_per_00&products_per_00&products_per_00&products_per_00&page=2
    With these URLs it is quite hard to identify which pages need to be canonicalized, and this is just one example out of thousands on this website. If anyone has any advice on how to fix this and how to tackle 37,496 errors on a website like this, that would be great (see the canonical sketch below). Thank you for your time, Lyam

    | AlphaDigital
    0
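
    A hedged sketch for the parameter-duplication question above: the usual fix is a rel=canonical tag on every variation of a category listing, pointing at the clean category URL. The URL below reuses the category_id from the examples but is otherwise a placeholder:

        <!-- in the <head> of every parameter variant of the category page -->
        <link rel="canonical" href="http://www.thewebsite.com/index.php?dispatch=categories.view&category_id=233" />

    Telling Google how to handle the products_per and page parameters in Webmaster Tools' URL Parameters tool is a common complement to the canonical tag, since it reduces how many of these variants get crawled in the first place.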

  • I want to update my permalinks; actually, I want to change the URLs to fit the content and keywords better. I can choose to "edit" the URL, but don't I need a redirect? I don't see any .htaccess plugin installed... is that what I need to be able to change my URLs in WordPress (see the sketch below)?

    | cschwartzel
    0
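
    For the permalink question above: yes, each changed URL needs a 301 so the old address keeps working. That can come from a redirection plugin or directly from .htaccess; a minimal sketch with hypothetical slugs standing in for the real posts:

        # .htaccess sketch: 301 an old WordPress slug to its new, keyword-friendly slug
        Redirect 301 /old-post-slug/ /new-keyword-rich-slug/

    WordPress sometimes redirects changed post slugs on its own, but it isn't reliable for every content type, so adding the redirect explicitly (plugin or .htaccess) is the safer route.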

  • Apologies if my question sounds like a school maths lesson 😉 If you have 2 sites: Site 1) is linked to by sites A, B & C. Site 2) is linked to by sites X, Y & Z. You then 301 redirect site 2 to site 1. Most of the juice from site 2 (obtained from links X, Y, Z) should be passed over to site 1. But what if site 2 is linked to by the same sites A, B, C as site 1, instead of X, Y, Z? Since both sites have exactly the same links, will the same, less, or any weight be passed over by the 301 redirect? Many thanks.

    | martyc
    1

  • We have a main website and we created a satellite site to support the original one with backlinks. Can I add both sites to the same Google Analytics profile? My programmer said that there is no reason to use a new GA profile for the satellite site, since Google will see the connection between the 2 websites via scripts, Google Plus buttons and other programmed solutions. So, is there a reason to use a new GA account for the satellite site (and later the new satellite sites) as well?

    | Romaine
    0

  • I have an ecommerce site built with Magento and urgently need advice on best practice for handling multiple product categories (where products appear in more than one category on the site, creating multiple URLs to the same page). In April this year, based on advice from my SEO, who felt that duplicate content issues were holding my rankings back, I changed about 25% of the product categories to 'noindex, follow'. This has made organic traffic fall (obviously), as these pages fell out of Google's index. But, contrary to what I was hoping for, it didn't then improve rankings; not one iota, nothing, which was the ONLY reason I did this. It has had a real negative impact on sales, so I'm starting to think this was actually a terrible idea. Should I change them back? And to ask a wider question, what is best practice for this particular scenario (see the sketch below)?

    | Coraltoes77
    0
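
    For the multi-category question above, the more common Magento pattern is to keep category pages indexable and point duplicate product URLs at a single canonical version, rather than noindexing part of the catalogue. A hedged sketch with a placeholder product:

        <!-- in the <head> of every categorised URL for the same product -->
        <link rel="canonical" href="http://www.example.com/example-product.html" />

    Magento's "Use Canonical Link Meta Tag For Products" setting (under the Catalog SEO configuration) does essentially this, after which the noindexed categories can be switched back to index,follow.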

  • Hello, We have TLDs as below:
    bannerbuzz.com
    bannerbuzz.co.uk
    bannerbuzz.com.au
    bannerbuzz.ca
    Within a few days, we are moving 2 of these domains, bannerbuzz.com.au and bannerbuzz.co.uk, to the Magento platform. We have purchased a new hosting plan, so the IP addresses of those 2 domains will change. My question is: will this impact my current rankings or not?

    | CommercePundit
    0

  • I'm working on creating content for top category pages for an ecommerce site. I can put them under the left hand navigation bar, and that content would be near the top in the code. I can also put the content at the bottom center, where it would look nicer but be at the bottom of the code. What's the better approach? Thanks for reading!

    | DA2013
    0

  • Under my Moz account I'm getting a bunch of temporary redirect warnings. Most of them are blog posts with /feed or /trackback appended. I know the trackback URLs are coming from blogs where people have commented, because they bring up a 'Trackback URL | Comments RSS Feed' section. I'm not sure how to make this /trackback work. The only code in my editor that mentions trackback is this CSS:
    h3#postinfo,
    h3#comments,
    h3#respond,
    h3#trackbacks,
    #respond h3 {
        margin: 0;
    }

    | jampaper
    0

  • I'm a newbie, and just got my first crawl diagnostics report. I was looking at the temporary redirects, and virtually all of them are trackback URLs from our blog (we use ourdomain.com/blog and Wordpress). The diagnostics report says they're 302 redirects. Is this problematic and, if so, can you suggest the best way to fix the problem? Thanks in advance for your help!

    | sally58
    0

  • I've been looking at some bigger enterprise sites and noticed some of them use HTML like this:
    <a data-href="http://www.otherdomain.com/" class="nofollow" rel="nofollow" target="_blank"></a>
    instead of a regular href="". Does using data-href and some JavaScript help with shaping internal links, rather than just using a strict nofollow?

    | JDatSB
    0

  • I'm thinking about implementing Disqus on my blog. I'd like to know if the Disqus comments are indexed by search engines? It looks like they are displayed using Ajax or jQuery.

    | sbrault74
    0

  • I am working on a revamped multi-language site that has moved to Magento. Each language runs off the core code, so there are no sub-directories per language. The developer has created sitemaps which have been uploaded to their respective GWT accounts. They have placed the sitemaps in new directories such as:
    /sitemap/uk/sitemap.xml
    /sitemap/de/sitemap.xml
    I want to add the sitemaps to robots.txt but can't figure out how to do it. Also, should they have placed the sitemaps in a single location with the file name identifying each language:
    /sitemap/uk-sitemap.xml
    /sitemap/de-sitemap.xml
    What is the cleanest way of handling these sitemaps, and can/should I reference them in robots.txt (see the sketch below)?

    | MickEdwards
    0
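
    A hedged sketch of the robots.txt piece of the question above: sitemaps are referenced as absolute URLs, one Sitemap line each, and the directive is location-independent, so either directory layout works. The host name is a placeholder:

        # robots.txt (a single file at the domain root)
        User-agent: *
        Disallow:

        Sitemap: http://www.example.com/sitemap/uk/sitemap.xml
        Sitemap: http://www.example.com/sitemap/de/sitemap.xml

    A sitemap index file pointing at each language sitemap is an equally clean option and keeps robots.txt down to a single Sitemap line.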
