
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO

Discuss site health, structure, and other technical SEO strategies.


  • Hi All, traffic to a few pages of my site has been dipping for the last couple of months. When I analyzed one of the pages (http://ow.ly/IEkt307dfNr) in Moz tools, it warned that the keyword "dance classes" is used excessively on the page (30+ times). But it is used in a genuine way: the page is a listing of dance class teachers who provide the service, and we added "dance classes" under each provider to help users connect with a teacher easily. Is this okay, or will it count as keyword stuffing? Should I change something?

    | Avinash_1234
    0

  • Hey Moz Crew! So I'm not necessarily looking for the answer here, but more for where to begin learning. If you guys could point me in the right direction, or even just help me ask the question in a better way, I would be so thankful. OK, so there is a page on my website that lives on the second page of Google. The page could be modified and I could add content to it if I wanted to, but let's just assume that this page is perfectly optimized, with absolutely wonderful content and a great user experience. Now of course I would like to get a bunch of links to that page, but if I can't write any more content for that page or update it, it will be harder to convince people to link to it (does that even make sense?). But if I can write blogs on really good subjects around that page, and those blogs do very well, how can I make sure that the actual page is getting all the juice that it can? And will it even get juice? Is this just a simple internal linking question? Am I tapping on the door of microsites or landing pages? Oy vey, where do I start!? ❤ Much love guys 🙂

    | Meier
    0

  • Hi there! I just changed the preferred domain setting from http://example.com to http://www.example.com and received a recommended action from Google: "Ensure that you specify the new host as canonical in all page links or sitemaps." Could you please let me know whether "the new host" is the same as "canonical", and whether I have to include this tag on every page of my website? Thank you!

    | kirupa
    0

  • Some time ago I started doing SEO for a one-page website and didn't get any positive results: no traffic and no filled-in online booking forms (while another, multi-page website offering the same service yielded multiple filled-in "schedule an appointment" forms). I found out my one-page website was considered to be "keyword spamming" and converted it to a multi-page one. Its Domain Authority went up, but it still doesn't bring any traffic. I am thinking maybe I have to let the search engines know that it has been updated so they stop penalizing it? Do you think that might help, and if so, what exactly should I do? I'd be very thankful for any suggestions!

    | kirupa
    0

  • Hi, I have belatedly come to the conclusion that I have been using tags and categories incorrectly when blogging in WordPress. The result is that Google seems to prefer to show my archives and tags in search results rather than the posts themselves. Not good UX. As the site is only a few months old, am I best to learn my lesson and tag and categorize correctly moving forward, or should I go back into these posts, clean them up, and categorize and tag them correctly? If I do this, will it cause 404s and hurt my SEO? Thanks!

    | johnyfiveisalive
    2

  • Hello, I am a bit puzzled, since the company I'm working with has recently opened a new location. The company's first location is listed in all the local directories. Could you please advise me on how to make both locations visible without messing up NAP consistency across the web? Most directories simply do not have the option to add another location. I had to open a brand-new Google Business page for the second location (while using the same brand name). I suppose I am getting into trouble with this?

    | kirupa
    0

  • Hi All! I recently acquired a client and noticed over 1,300 404s in Search Console, all starting around late October this year. What's strange is that I can access the pages that are 404ing by cutting and pasting the URLs, and via inbound links from other sites. I suspect the issue might have something to do with sitemaps. The site has 5 sitemaps, generated by the Yoast plugin. Two sitemaps seem to be working (pages are being indexed); three seem not to be working (pages have warnings and errors, and nothing shows up as indexed). The pages listed in the three broken sitemaps seem to be the same pages giving 404 errors. I'm wondering if automatic URL structure might be the culprit here. For example, one sitemap that works is called newsletter-sitemap.xml, and all the URLs listed follow the structure http://example.com/newsletter/post-title. Whereas one sitemap that doesn't work is called culture-event-sitemap.xml, and here the URLs follow the structure http://example.com/post-title. Could it be that these URLs are not being crawled / found because they don't follow the structure http://example.com/culture-event/post-title? If not, any other ideas? Thank you for reading this long post and helping out a relatively new SEO!

    | DanielFeldman
    0

  • My company is joining an industrial association. Part of the membership is a link to our site from theirs. I've found that going to their site triggers a "threat alert" through our company's malware detection system, showing a link that may be infected with malware. With all of that said, I have two questions: Since this is a paid membership, will Google penalize us for having a link to our company from this association's website? And since a link on their site has potential malware issues, should we add our link to their site, or could it be harmful to us? Any helpful advice is appreciated.

    | SteveZero12
    1

  • Hi fellow Mozzers, I have been tasked with providing some SEO recommendations for a website that is to be built using Express.js and Angular. I wondered whether anyone has had any experience with such a framework? On checking a website built this way and viewing it as Googlebot etc. using the following tools, it appears as though most of the content is invisible: http://www.webconfs.com/search-engine-spider-simulator.php http://www.browseo.net/ Obviously this is a huge issue, and I wonder if there are any workarounds or recommendations to assist (even if it means moving away from this stack - I would love to hear about it).

    | musthavemarketing
    2

  • My website is http://www.mercimamanboutique.com/. The cached version of the French site (cache:www.mercimamanboutique.com/fr-fr/) is showing incorrectly, while the German version (cache:www.mercimamanboutique.com/de-de/) is showing correctly. I have resubmitted site links and asked Google to re-index the website many times. The German version always gets cached properly, but the French version never does. This is frustrating me - any idea why? Thanks.

    | ss2016
    0

  • I post a JavaScript accordion drop-down tab [a collapsible content area] at the end of all my posts. I labeled the accordion "Show Article Sources", and when a user clicks it, the accordion expands and shows all the sources I listed for my article. This is where I post all of the links that I reference in each article. But I read somewhere that Google's crawlers cannot read text in a JavaScript drop-down tab. So I am wondering if this is true, because that would mean I get no linking SEO benefit since Google can't read those links. If it is true, should I remove the accordion from all my articles and somehow include the links I reference in the actual body text, so I can get the SEO benefit of linking out to similar content? If so, what is an aesthetic way to do this - any example links? Tips? Thoughts?

    | ianizaguirre
    0
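
    A hedged illustration of the distinction being asked about: links that are present in the page's initial HTML and merely collapsed can be seen by crawlers without executing JavaScript, whereas content injected into the DOM only when the tab is clicked may not be. A minimal sketch of a crawlable accordion (hypothetical markup, not the poster's actual widget):

        <details class="article-sources">
          <summary>Show Article Sources</summary>
          <ul>
            <!-- The links live in the HTML from the start; they are only visually collapsed -->
            <li><a href="https://example.com/referenced-study">Referenced study</a></li>
            <li><a href="https://example.org/background-article">Background article</a></li>
          </ul>
        </details>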

  • We work with quite a few sites that promote retail traders and feature a traders' directory with pages for each of the shops (around 500 listings in most cases). On retail strips, shops come and go all the time, so I get a lot of pages that are removed because the business is no longer present. Currently I've been doing 301 redirects to the home page of the directory if you try to access a deleted trader page, but this means an ever-growing htaccess file with thousands of 301 redirects. Are we handling this the best way, or is there a better way to tackle this situation?

    | Assemblo
    0
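
    A hedged sketch of one way to avoid an ever-growing list of per-URL rules, assuming access to the main server (virtual host) configuration, since RewriteMap is not allowed in .htaccess: keep the deleted trader slugs in a flat lookup file and let a single pattern rule handle all of them. File paths and slugs below are placeholders.

        RewriteEngine On
        # deleted-traders.txt holds one "slug gone" pair per line, e.g. "old-shop-name gone"
        RewriteMap deadtraders txt:/etc/apache2/deleted-traders.txt
        # If the requested trader slug is in the map, 301 to the directory home page
        RewriteCond ${deadtraders:$1} =gone
        RewriteRule ^/traders/([^/]+)/?$ /traders/ [R=301,L]

    Whether deleted listings should 301 to the directory home or return a 410 Gone is a separate judgement call; the pattern above just keeps the rule count constant.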

  • So bear with me here, as this is probably a technical issue and I am not that technical. We have a microsite for one of our partner organisations, and recently we have detected that content from our main site is appearing under the microsite's URLs - both in search results and when you click through from the SERP. However, this content does not exist on the actual microsite at all. Does anyone have a possible explanation for this? I have tried searching the web but found nothing. I assume there is something in the setup of the microsite that is associating it with the content on the main site.

    | Discovery_SA
    0

  • Hi Moz Community, I am wondering if anyone can shed some light on this current predicament I am facing... For my website, which is the site for a magazine I work for, the current URL structure is www.website.com/article-title. At first glance, I thought we would have to restructure the URLs to include the category structure, for example www.website.com/category/sub-category/article-title. However, upon deeper investigation, I've seen that we do actually have breadcrumbs enabled, so Google is already indexing and following the category structure that we would be reintroducing into the URLs, i.e. www.website.com/category/sub-category/article-title. With this in mind, is it actually worth restructuring the URLs to include these categories, as it will take a long time to organise and implement?! Obviously, thinking in terms of UX, it is a must-do, but I'm just trying to weigh up the pros and cons here. Appreciate your help, Leigh

    | leighcounsell
    0

  • Is there a percentage (approximate or exact) of duplicate content you should have before you use a canonical tag? Similarly, how does Google handle canonical tags if the pages aren't 100% duplicates? I've added some background and an example below. Nike Trainer model 1 has an overview page that also links to a sub-page about cushioning, one about Gore-Tex and one about breathability. Nike Trainer models 2, 3, 4 and 5 each have an overview page that also links to sub-pages about cushioning, Gore-Tex and breathability. In each case the sub-page URL is a child of its parent, so each is a distinct page, e.g. /nike-trainer/model-1/gore-tex and /nike-trainer/model-2/gore-tex. There are some differences in material composition, some different images, and of course the product name is referred to multiple times. This makes the page in the region of 80% unique.

    | punchseo
    0
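
    Google has never published a percentage threshold for this; the canonical is treated as a strong hint that two URLs are substantially the same document, and it can be ignored if the pages differ too much. If, for instance, the model-2 Gore-Tex page were judged close enough to the model-1 version to consolidate, the tag on the model-2 page would look like the line below (hypothetical domain); the alternative is to leave each sub-page self-canonicalised and differentiate the copy further.

        <link rel="canonical" href="https://www.example.com/nike-trainer/model-1/gore-tex" />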

  • I am using the Medium blogging platform to blog, but it is pointed to my site and appears at blog.mysite.com. Since the content is hosted on Medium and pointed to my subdomain via an A Record / CNAME / etc... 1. Will my domain get credit for backlinks to the blog content? 2. If Medium changes in the future and no longer points to my subdomain, will I lose all of the backlinks I've built up?

    | davidevans_seo
    0

  • Hello, we run a large sports website. Since 2009 we have been doing game previews for most games every day in all the major sports, i.e. NFL, CFB, NBA, MLB, etc. Most of these previews generate traffic for 1-2 days leading up to or on the day of the event. After that there is minimal traffic, if any, and over the years almost nothing to the old previews. If you do a search for any of these, each time the same matchup happens Google will update its rankings and filter out the old matchups/previews in favour of new ones. So our question is: what would you do with all this old content? Is it worth just keeping? Google indexes a majority of it. Should we prune some of the old articles? The other option we thought of, though it's not really practical, is to create event pages where we reuse a post each time the teams meet; if there were some sort of benefit, we could do it.

    | dueces
    0

  • Our product descriptions appear in two places, and on one page they appear twice. The best way to illustrate this would be to link you to a search results page that features one product. My duplicate content concern refers to the following: when the customer clicks the product, a pop-up is displayed that features the product description (first showing of the content). When the customer clicks the 'VIEW PRODUCT' button, the product description is shown below the buy button (second showing of the content); this is down to the template of the page and is why it is also shown in the pop-up. The product description is then also repeated further down in the tabs (third showing of the content). My thoughts are that point 1 doesn't matter, as the content isn't being shown on a dedicated URL and it relies on JavaScript. With regards to point 2, is the fact the same paragraph appears on the page twice a massive issue and a duplicate content problem? Thanks

    | joe-ainswoth
    0

  • Hi, first time posting! Our website was on page 1 for just about all local searches (legal related). Due to a rebrand we moved the site over to a new domain name and it has totally dropped - on average to pages 8-9, even for local searches. The transfer was done back in May. I did expect a hit, but I assumed it would recover fairly quickly; that doesn't seem to be the case. Any ideas/advice/tips would be greatly appreciated 🙂 Jamie

    | jamiericey
    0

  • Hello, I've been looking online for some help with this. An estate agent has a page of properties for sale. Is it possible to mark these individual properties up, and if so, would they appear as rich snippets in the SERPs? I've never seen anything like this for properties for sale, so I just wondered.

    | AL123al
    1
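
    It is possible to mark listings up; whether Google chooses to show a rich result for them is up to Google. One hedged way to model a single listing, using the generic Product and Offer types (every name, price and URL below is made up; schema.org also has more specific accommodation types):

        <script type="application/ld+json">
        {
          "@context": "https://schema.org",
          "@type": "Product",
          "name": "Three-bedroom semi-detached house, 12 Example Road",
          "image": "https://www.example-agent.co.uk/photos/12-example-road.jpg",
          "description": "Three-bedroom semi-detached house with garden and garage.",
          "offers": {
            "@type": "Offer",
            "price": "250000",
            "priceCurrency": "GBP",
            "availability": "https://schema.org/InStock"
          }
        }
        </script>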

  • Hi, I know alt attributes should be on images; however, at the moment I have 23,741 missing on the site. How important are these? It's a big project for someone to update and I need some justification. Thanks Mozzers 🙂

    | BeckyKey
    0
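
    For the justification write-up, the markup itself is trivial; the work is in producing 23,741 descriptions. A descriptive alt attribute is simply this (hypothetical product image):

        <img src="/images/red-wool-jumper.jpg" alt="Red wool jumper with a crew neck">

    On catalogues this size, alt text is often generated from existing product data (name, colour, type) rather than written by hand, which may shrink the project considerably.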

  • Hi, I thought that I could only link one sitemap from my site's robots.txt, but... I may be wrong. So I need to confirm whether this kind of implementation is right or wrong: robots.txt for Magento Community and Enterprise ...
    Sitemap: http://www.mysite.es/media/sitemap/es.xml
    Sitemap: http://www.mysite.pt/media/sitemap/pt.xml
    Thanks in advance,

    | Webicultors
    0
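
    For reference, the sitemaps protocol does allow more than one Sitemap line in a single robots.txt, so a file along these lines is syntactically valid (shown here as a hypothetical robots.txt served at http://www.mysite.es/robots.txt):

        User-agent: *
        Disallow:

        Sitemap: http://www.mysite.es/media/sitemap/es.xml
        Sitemap: http://www.mysite.pt/media/sitemap/pt.xml

    Whether the .es file should reference a sitemap hosted on the .pt domain is a separate cross-host question; listing each domain's sitemap in its own robots.txt is the more conservative setup.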

  • Hello, I recently solved the www / no-www duplicate issue for my website, but now I am in trouble with duplicate content again. This time something I cannot understand is happening: in the Crawl Issues report, I received Duplicate Page Content for http://yourappliancerepairla.com (DA 19) and http://yourappliancerepairla.com/index.html (DA 1). Could you please help me figure out what is happening here? By default, index.html is being loaded, and this is the only index.html I have in the folder. Yet it looks like the crawler sees two different pages with different DA... What should I do to handle this issue?

    | kirupa
    0
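
    A hedged sketch of one common way to collapse the two versions, assuming Apache with mod_rewrite (adjust for the actual server): 301 any direct request for /index.html back to the root, so only one URL stays reachable from outside.

        RewriteEngine On
        # Match external requests for index.html, not Apache's internal DirectoryIndex lookup
        RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.html
        RewriteRule ^index\.html$ / [R=301,L]

    A rel="canonical" on the home page pointing at the bare root URL is a common belt-and-braces addition alongside the redirect.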

  • Hi, I am trying to fix the six high-priority duplicate content URLs from my recent Moz crawl. Would someone from the community be able to offer some web development advice? I had reached out on the Moz Community's main welcome page, and Samantha stated that someone in web development on Moz's Q&A forum would be better suited to assist me. I took a WordPress class on Lynda.com, but other than that, I am a novice. I manage my site www.rejuvalon.com on GoDaddy's managed WordPress hosting. Thanks so much for your help! Best, Jill

    | justjilly
    0

  • I usually do social bookmarking on StumbleUpon, Scoop.it, Pinterest, Bundlr, Folkd, Diigo, Reddit and Delicious. Does anyone have any recommendations for other good social bookmarking sites?

    | Armen-SEO
    0

  • The French and German versions of my website are showing mixed-language sitelinks. I have no idea how to fix this now that Google has removed the demote sitelinks feature. How can I make sure only German sitelinks appear for the German version of the site and only French sitelinks appear for the French version? www.mercimamanboutique.com/de-de/ www.mercimamanboutique.com/fr-fr/ Thanks again

    | ss2016
    0
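
    One thing worth checking alongside this (a hedged suggestion, not a guaranteed sitelinks fix): hreflang annotations that spell out which URL serves which language and region, for example in the <head> of both language home pages. The x-default line is an assumption about which URL should catch everyone else.

        <link rel="alternate" hreflang="fr-fr" href="http://www.mercimamanboutique.com/fr-fr/" />
        <link rel="alternate" hreflang="de-de" href="http://www.mercimamanboutique.com/de-de/" />
        <link rel="alternate" hreflang="x-default" href="http://www.mercimamanboutique.com/" />

    The same annotations can be supplied via the XML sitemaps instead, if editing templates is awkward.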

  • Hi. I am looking after a website in Ireland called CarsIreland.ie. When you search "carsireland", it comes up number 1 and the organic sitelinks appear; however, when you search "cars Ireland", it also comes up number 1 but the organic sitelinks do not appear. Why?

    | Rossconlon
    0

  • We are currently building a new site which has the ability to pre-populate product options based on parameters in the URL. We have done this so that we can send individual product URLs to Google Shopping. I don't want to create lots of duplicate pages, so I was wondering what you thought was the best way to handle this? My current thoughts are:
    1. Sessions and parameters: on-site product page filters populate using sessions, so no parameters are required on-site, but options can still be pre-populated via parameters (product?colour=blue&size=100cm) if the user reaches the site via Google Shopping. We could also add "noindex, follow" to the pages with parameters and a canonical tag pointing to the page without parameters.
    2. Text-based parameters: make the parameters into text-based URLs (product/blue/100cm/) and still use the "noindex, follow" meta tag and a canonical tag pointing to the page without parameters. I believe this is possibly the best solution, as it still allows users to link to and share pre-populated pages, but they won't get indexed and the link juice would still pass to the main product page.
    3. Standard parameters: after thinking more today, I am considering that the best way may be the simplest. Simply use standard parameters (product?colour=blue&size=100cm) so that I can tell Google what they do in Webmaster Tools, and also add "noindex, follow" to the pages with parameters along with the canonical tag pointing to the page without parameters.
    What do you think the best way to handle this would be?

    | moturner
    0
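
    As a concrete illustration of options 2 and 3 above (URLs are placeholders), the pre-populated variant would carry something like the following in its <head>, while the clean product URL would carry neither line:

        <!-- On /product?colour=blue&size=100cm or /product/blue/100cm/ -->
        <meta name="robots" content="noindex, follow">
        <link rel="canonical" href="https://www.example.com/product" />

    One caveat when weighing the options: combining noindex with a canonical that points at a different, indexable URL is often described as sending mixed signals, so many sites pick one mechanism (usually the canonical alone) rather than both.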

  • Does anyone have any insight as to why a site wouldn't show any results when using the Google search operator related:site.com? No results appear. We recently moved from .com to .org, with 301 redirects in place and the Change of Address tool submitted. There are no penalties or warnings in Search Console, but we have seen a significant decrease in search traffic. Thanks in advance.

    | SoulSurfer8
    0

  • I have a new client who wants to promote their subdomain uk.imagemcs.com and have their main domain imagemcs.com fall off the SERPs. Objective? Get uk.imagemcs.com to rank first for UK 'brand' searches. Do a search for 'imagem creative services' and you should see the issue (it looks like rules have been applied to the robots.txt on the main domain to exclude any bots from crawling, but since the pages have been indexed previously I need to take action, as it doesn't look great!). I think I can do this by applying a permanent redirect from the main domain to the subdomain at domain level, then no-indexing the main site and resubmitting the sitemap. My slight concern is that no-indexing the main domain may impact the visibility of the subdomains (I'm dealing with uk.imagemcs.com, but there are also us.imagemcs.com and de.imagemcs.com), and I was looking for some assurance that this would not be the case. My understanding is that subdomains are completely distinct from domains and as such this action should have no impact on the subdomains. I asked the question on the Webmasters Forum but haven't really got anywhere:
    https://productforums.google.com/forum/#!msg/webmasters/1Avupy3Uw_o/hu6oLQntCAAJ
    Can anyone suggest a course of action? Many thanks, Nathan

    | nathangdavidson2
    0
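
    A hedged sketch of the host-level redirect, assuming Apache and that the bare/www domain is served from its own docroot; the HTTP_HOST condition means uk., us. and de. are never touched by this rule (swap in https if the subdomain runs on it):

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?imagemcs\.com$ [NC]
        RewriteRule ^(.*)$ http://uk.imagemcs.com/$1 [R=301,L]

    With a site-wide 301 in place, a separate noindex isn't normally needed (a redirected URL can't serve one anyway), and the robots.txt block on the main domain would need lifting so the redirects can actually be crawled.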

  • Hi everyone, one of my client's websites has a Spam Score of 8/17 (Domain Authority 18). The thing is, it has 0 inbound links according to Open Site Explorer, but 5 referring domains according to Majestic. There are no error messages whatsoever in its Search Console. My questions are: what could explain that Spam Score of 8, and what can I do to lower it (other than disavowing links in GWT)? Thanks!

    | julienraby
    0

  • Hi Everyone, today I woke up to a dramatic ranking decline (nearly 20 positions) for a client's website (eacoe.org). When I looked in Webmaster Tools, I noticed that the site was indexed just yesterday by Google (a request that the webmaster had submitted back in April of this year). Would this re-indexing event have caused the sharp decline? In Webmaster Tools I don't see many errors (one 404 error that we are planning to fix), and I see no manual actions/penalties brought up by Google about our site. My first thought is that the re-indexing led to the rank decline, but I'm not entirely sure whether I should be focusing on something else. And if it is the re-indexing, are there any recommended steps to take? Thanks for your help! -Bruce

    | dynedge
    0

  • Hi Moz friends, I've noticed that since Tuesday, November 9, half of my posts' dates have changed in terms of what appears next to the post in the search results. Although published this year, some are showing a random date in 2010! (The domain was born in 2013, which makes this even odder.) This is harming the CTR of my posts and traffic is decreasing. Some posts have gone from 200 hits a day to merely 30. On our end of the website, we have not made any changes to schema markup, rich snippets, etc. We have not edited any post dates. We have actually not added new content for about a week, and these incorrect dates only started to appear on Tuesday. The only changes have been updating certain plugins for maintenance. This is occurring on four of our websites now, so it is not specific to one. All websites use WordPress and the Genesis theme. It looks like only half of the posts are showing weird dates we've never seen before (far off from the original published date as well as the last-updated date; again, dates like 2010, 2011, and 2012, when none of our websites were even created until 2013). We cannot think of any correlation as to why certain posts are showing weird dates and others the correct ones. The only related change we can think of is that back in June we changed our posts to show the last-updated date, to give our readers an insight into when we last changed them (since it's evergreen content). Google started to use that date in the SERPs, which was great; it actually increased traffic. I'm hoping it's a glitch and a recrawl may soon sort it out. Has anybody had experience with this? I've noticed Google fluctuates between showing our last-updated date and sometimes not showing a date at all, seemingly at random. We're super confused here. Thank you in advance!

    | smmour
    2
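
    One thing that can be checked while waiting on a recrawl (hedged, since nothing on the sites was knowingly changed): whether the structured data the theme and plugins now output still carries the expected dates. Themes like Genesis and many SEO plugins generate date markup automatically, but the fields Google reads look like this (placeholder values, shown as JSON-LD for brevity):

        <script type="application/ld+json">
        {
          "@context": "https://schema.org",
          "@type": "Article",
          "headline": "Example evergreen post",
          "datePublished": "2016-03-14",
          "dateModified": "2016-11-08"
        }
        </script>

    Google's Structured Data Testing Tool will show which dates it is actually extracting from any given URL.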

  • I just had a developer friend call me in a panic, because they had gone live with a new site and found out (the hard way) that they had missed some pages on their 301 redirects. So the pages are appearing in Google but serving 404s. Ouch! So their question was: other than running a report for 404 errors in something like Screaming Frog, is there a way to hunt down ONLY pages serving 404s, then export to CSV so they can be redirected? Anyone got any tricks up their sleeve?

    | muzzmoz
    0
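
    One quick, hedged way to do this outside a crawler, assuming there is a list of old URLs to check (an old sitemap export, analytics top pages, or the Search Console crawl errors download): request each URL and write the ones answering 404 to a CSV. A minimal sketch using the third-party requests library (pip install requests); old_urls.txt is a hypothetical one-URL-per-line file.

        import csv
        import requests

        with open("old_urls.txt") as f:
            urls = [line.strip() for line in f if line.strip()]

        not_found = []
        for url in urls:
            try:
                # HEAD keeps it light; switch to requests.get if the server dislikes HEAD
                status = requests.head(url, allow_redirects=True, timeout=10).status_code
            except requests.RequestException:
                status = None  # unreachable; worth a manual look
            if status == 404:
                not_found.append(url)

        with open("404s.csv", "w", newline="") as out:
            writer = csv.writer(out)
            writer.writerow(["url"])
            writer.writerows([u] for u in not_found)

        print(f"{len(not_found)} URLs returned 404; list written to 404s.csv")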

  • Hey guys! Hope you're all doing well. I need your smart brains for a problem the company I work for is currently facing in the realm of SEO. They wanted to hire someone to find the problem, but I thought I would reach out to you guys, first, to at least try and narrow down the long list of things we could look at/investigate. I put the info (briefly) on 3 slides, and would really appreciate any and all insights you can offer me: https://docs.google.com/presentation/d/12d0YH7BNdCIpJ86d95uBOtOWphWQ3NB1kwDYeOzIt-U/edit#slide=id.g19145c9fa5_0_0 Thank you in advance!

    | michelledemaree
    0

  • Hi guys, we relaunched our website www.troteclaser.com on Sept. 6th. Traffic on the new website has been stable or has slightly increased, except in one area: in Central and South America, organic traffic dropped by 50%. We properly set up all 301 redirects and solved all 404s within a week. We changed approx. 30% of the website structure, but I don't think internal link juice could be the problem. Any idea what might cause a regional drop in traffic like this? Has anyone had similar cases in the past? Thomas

    | Troteclaser
    0

  • Hello, I have a client that plans to use different URLs for signed-in and signed-out customers. My concern is that signed-in and signed-out customers will provide backlinks to different URLs for the same page and thus split PageRank. I'm assuming the URL for signed-in customers won't be fetched by Google, which would rule out canonicalizing the signed-in URL to the signed-out version. The solution for me would be to ensure that there is only one URL for each content page, and to instead use cookies to prompt visitors who aren't already customers to sign up to the service. However, please correct me if I'm wrong in my assumptions. Thanks

    | SEONOW123
    0

  • Hi Moz community, I am the web agency for an e-commerce website. Its current domain is https://www.liquorland.co.nz, but now the e-commerce part will be moved to a subdomain, https://shop.liquorland.co.nz. There are thousands of e-commerce pages currently indexed in Google (around 15,500), plus they also have mobile versions of the pages like /mobile/default.aspx. Is it necessary to 301 redirect all the pages? We are afraid it may slow down the website because every request would have to be checked against thousands of redirect rules. Is it OK to just redirect the main categories? Many thanks in advance.

    | russellbrown
    0
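
    On the performance worry: server-side redirects don't have to be one rule per URL. If the path structure carries over to the shop subdomain unchanged, a handful of pattern rules covers every page; a hedged sketch assuming Apache, with the path prefixes below as placeholders:

        RewriteEngine On
        # Every product path goes to the equivalent URL on the shop subdomain
        RewriteRule ^(products/.*)$ https://shop.liquorland.co.nz/$1 [R=301,L]
        # Mobile pages, assuming their content now lives at the equivalent shop URL
        RewriteRule ^mobile/(.*)$ https://shop.liquorland.co.nz/$1 [R=301,L]

    Redirecting only the main categories would leave thousands of indexed product URLs returning 404s, which is usually the costlier trade-off.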

  • A couple of months ago there was a result in Google for our branded search term which wasn't the 'official' URL; the result shown in the SERP was www.mycompany-ip.nl. We've applied a 301 redirect from this URL to the 'official' URL, which is a subdomain: department.mycompany.nl. The redirect is obviously working, but up until now I haven't seen Google replace the incorrect URL with the correct one. I am wondering what to do to get the correct result shown. André

    | ConclusionDigital
    0

  • Hello, currently on our site we have the rel=prev/next markup for pagination, along with a self-pointing canonical, via the Yoast plugin. However, on page 2 of our paginated series (there are only 2 pages currently), the canonical points to page one rather than page 2. My understanding is that if you use a canonical on paginated pages it should point to a view-all page, not to page one. I also believe that you don't need to use both a canonical and the rel=prev/next markup; one or the other will do. As we use the markup, I wanted to get rid of the canonical - would this be correct? For those who use the Yoast plugin, have you managed to get this to work? Thanks!

    | jessicarcf
    0
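
    For reference, the combination usually described for page 2 of a two-page series when there is no view-all page is a self-referencing canonical plus a prev link (hypothetical URLs):

        <link rel="canonical" href="https://www.example.com/blog/page/2/" />
        <link rel="prev" href="https://www.example.com/blog/" />

    A canonical from page 2 back to page 1 effectively asks Google to drop page 2; the documented alternative is canonicalising paginated pages to a view-all URL where one exists, otherwise letting each page self-canonicalise alongside rel=prev/next.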

  • Hi there, we switched from the domain X.com to Y.com in late 2013 and, for the most part, the transition was successful. We were able to 301 most of our content over without too much trouble. But when I do a site:X.com search in Google, I still see about 6,240 URLs of X listed, and if you click on a link, you get 301ed to Y. Maybe Google has not re-crawled those X pages to know of the 301 to Y, right? But the home page of X.com is shown in the site:X.com results, and if I look at the cached version, the cached description says: "This is Google's cache of Y.com. It is a snapshot of the page as it appeared on July 31, 2014." So Google has freshly crawled the page; it does know of the 301 to Y and is showing that page's content. Yet the X.com home page still shows up in site:X.com. How is the domain for X showing rather than Y when even Google's cache is showing the page content and URL for Y? There are some other similar examples. For instance, you would see a deep URL for X, but just looking at the title in the SERP, you can see it has crawled the Y equivalent. Clicking on the link gives you a 301 to the Y equivalent. The cached version of the deep URL for X also shows the content of Y. Any suggestions on how to fix this, or whether it's even a problem? I'm concerned that some SEO equity is still being sequestered in the old domain. Thanks, Stephen

    | fernandoRiveraZ
    1

  • Hi, our website has two versions of its URLs: desktop at www.myexample.com and mobile at www.myexample.com/m. If you visit our site from a mobile device you will land on a mobile URL; if you visit from a desktop computer you will land on a regular URL. Both URLs have the same content. Is that considered duplicate content? If yes, what can I do to fix it? Also, both sets of URLs are indexed by Google, and we have two separate XML sitemaps: one for desktop and one for mobile. Is that good SEO practice?

    | Armen-SEO
    0
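
    Google's documented pattern for separate mobile URLs treats them as two presentations of one document rather than duplicates: the desktop page declares the mobile alternate, and the mobile page canonicalises back to the desktop page. The URLs below are placeholders following the /m structure described:

        <!-- On the desktop page, e.g. http://www.myexample.com/some-page -->
        <link rel="alternate" media="only screen and (max-width: 640px)"
              href="http://www.myexample.com/m/some-page" />

        <!-- On the mobile page, e.g. http://www.myexample.com/m/some-page -->
        <link rel="canonical" href="http://www.myexample.com/some-page" />

    With these annotations in place, a single sitemap listing the desktop URLs is the usual setup rather than two separate ones.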

  • Hello - We launched a schema plugin for our WordPress site to mark up our blog posts as articles and our main page as an organization. The day after, we saw a dramatic decrease in keyword rankings, even though our website health improved with Google. Any thoughts on what could be causing this?

    | Erin_IAN
    0

  • Hey Moz Community, I've been seeing a steady decrease in Search Console in the number of pages being indexed by Google for our eCommerce site. This corresponds with lower impressions and traffic in general this year. We started with around a million pages indexed in Nov 2015, down to 18,000 pages this Nov. Since we only have around 3,000 or so products year-round, I realized this is most likely a good thing. I've checked that our main landing pages are being indexed, which they are, and our sitemap was updated several times this year, although we're in the process of updating it again to resubmit. I also checked our robots.txt and there's nothing out of the ordinary. In the last month we've gotten rid of some duplicate content issues caused by pagination by using canonical tags, but that's all we've done to reduce the number of pages crawled. We have seen some soft 404s and some server errors coming up in our crawl error report that we've either fixed or are trying to fix. I'm not really sure where to start looking for a solution, or whether it's even a huge issue, but the drop in traffic is also not great. The drop in traffic corresponded with a loss in rankings as well, so there could be a correlation, or none. Any ideas here?

    | znotes
    0

  • In the Moz Site Crawl report, I was seeing an issue saying that we were temporarily redirecting our homepage to https URLs. So I changed the code in htaccess to make it a 301 redirect, but I'm still getting the same error. I implemented it last week and we just had a new crawl yesterday. Here is the new code:
    RewriteEngine on
    RewriteCond %{HTTP_HOST} ^heritagelawmarketing.com [NC]
    RewriteRule ^(.*)$ http://www.heritagelawmarketing.com/$1 [L,R=301,NC]
    Does anyone know why I'm still getting 302 redirects? Thanks

    | Heydarian
    0
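
    The rule quoted above only rewrites the host to www over http, so if the crawler reports a temporary redirect to https, that hop is probably coming from somewhere else (another rule, a plugin, or the hosting layer). A hedged sketch of a single consolidated 301 to the https www host, assuming that is the intended canonical version and an SSL certificate is in place:

        RewriteEngine On
        RewriteCond %{HTTPS} off [OR]
        RewriteCond %{HTTP_HOST} !^www\. [NC]
        RewriteRule ^(.*)$ https://www.heritagelawmarketing.com/$1 [R=301,L]

    Checking each variant of the homepage with curl -I shows exactly which hop is still answering 302.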

  • Hi, I am trying to redirect any IP from outside India that comes to the Store site (https://store.nirogam.com/) to the Global Store site (https://global.nirogam.com/) using this method. This is causing various indexing issues for the Store site, as Googlebot crawling from the US also gets redirected! Very few pages under store.nirogam.com/products/ are being indexed: even after submission of the sitemap it indexed ~50 pages, then went back to 1 page, etc. Only ~20 pages are indexed for now. After this I tried manually indexing via "Crawl -> Fetch as Google", but it showed me a redirect to global.nirogam.com; everything has "status -> Redirected". This is why bots are not able to index the site. What are possible solutions for this? How can we let bots index these pages without getting redirected? Would a popup method, where we ask users if they are outside India, help solve this issue? All approaches/suggestions will be highly appreciated.

    | pks333
    0

  • I do SEO for an online boutique marketplace. I've been here for about 4 weeks and no one has done their SEO before (they've been around for about 5 years), so there's lots to do. A big concern is whether or not to submit a sitemap, and if I do submit one, what's the best way to go about creating it?

    | Jane.com
    0
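
    For orientation rather than prescription: an XML sitemap is just a list of the canonical URLs you want crawled, with optional metadata, and most marketplace platforms or plugins can generate and update it automatically before it is submitted once in Search Console. A minimal hand-rolled example with placeholder URLs:

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <url>
            <loc>https://www.example-marketplace.com/</loc>
            <lastmod>2016-12-01</lastmod>
          </url>
          <url>
            <loc>https://www.example-marketplace.com/boutiques/some-shop</loc>
            <lastmod>2016-11-28</lastmod>
          </url>
        </urlset>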

  • Hi. I want to build a new site that is optimised for a training product that we have. We have an existing domain which I'm considering pointing at this new site.  This domain is one of the new .training TLDs. Let's call this domain foo.training where my main keyword to optimise for will be "foo training". I've also looked and can see that foo-training.com is available. I read up on best practices for domains here : https://moz.com/learn/seo/domain My question is will the .training domain be seen as "spammy" in any way? Am I better to just go ahead and register the .com?

    | rmcatalyst
    0

  • Hi All, when I check my ecommerce site in an architecture tool, my homepage interlinks with 765 pages, whereas when I check a few competitors and big brands, their homepages link to 28, 33, 47, 57 pages, etc., nothing like my site's 765. Am I doing something wrong? Can you please compare the screenshots of my site architecture and one of my competitors'? I ask because, in my view, site architecture also plays a good role in Google organic ranking.

    | pragnesh9639
    0

  • Hi, a few weeks ago we received a message saying that our website is in conflict with Google's Webmaster Guidelines. Here is the website for which we received the message from Google: http://www.gocustomized.es/. This URL redirects to https://www.gocustomized.es/. After reading some messages on this forum, I thought the problem was our website reviews, which appear on all pages of the site, and I know that website reviews shouldn't be treated like reviews of a product. But we removed the reviews and our request is still declined. Thanks in advance for your help.

    | steph_ba
    0

Got a burning SEO question?

Subscribe to Moz Pro to gain full access to Q&A, answer questions, and ask your own.



