
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO

Discuss site health, structure, and other technical SEO strategies.


  • Hello colleagues! I have a client who decided to launch a separate domain so they could offer their content translated for other countries. Each country's content (except US/English) lives in its own country folder, as follows: client.com/01/02/zh, client.com/01/02/tw, etc. The problem is that they post the US/English content on this domain too. It does NOT have its own folder, but sits right after the date (as in the example above). Also, the content is the same as on their "main" domain, so Google sometimes likes to index it instead of the original on the domain where we want the traffic to go. So, is there a way to say "hey Google, please index the US content only on the main domain, but continue to index the translated content in these folders on this totally separate domain?" Thank you so much in advance.

    | SimpleSearch
    0

  • Working with a new client. They have what I would describe as two virtual websites: the same domain, but different coding, navigation, and structure. The old virtual website pages fail the mobile-friendly test (they were not designed to be responsive and there is really no way to fix them), but they are ranking and getting traffic. The new virtual website pages pass the mobile-friendly test but are not yet SEO optimized, are not ranking, and are not getting organic traffic. My understanding is that "not mobile friendly" is a site-level designation, and although the offending pages are listed, it is not a page-level designation. Is this correct? If so, what would be the best way to hold onto the rankings and traffic generated by the old pages and resolve the "not mobile friendly" problem until the new pages have surpassed the old ones in rankings and traffic? A proposal was made to redirect any mobile traffic on the old pages to mobile-friendly pages. What would happen to SEO if this is done? The pages would pass mobile friendly because they would go to mobile-friendly pages, I assume, but what about link equity? Would they see a drop in traffic? Any thoughts? Thanks, Toni

    | Toni7
    0

  • My client has a page with the URL https://www.corptexsystems.ca/training/. Google won't index this page because it has detected a canonical at https://www.corptexsystems.ca/training (without the trailing slash). I can't find https://www.corptexsystems.ca/training without the trailing slash anywhere in our site. All the other page URLs end with the slash. How do I sort this out and convince Google that https://www.corptexsystems.ca/training/ is the version to index? Thanks in advance for your help, SEO masters! -AK

    | AndyKubrin
    0

  • Hi, I am somewhat confused about the right filter settings to consolidate all Facebook referrals into one. After going through various versions, I'm not sure which one is correct and will work. Some versions say the search string should be ^.*facebook.com$ and the replace string should be facebook.com, while according to others the search string should be .*facebook and the replace string should be facebook. Please advise which of these is correct and will work. Thanks

    | Aashish2020
    0

  • We recently found our site had 65,000 tags (yes, 65K). In an effort to consolidate these, we've started deleting them. Moz is now reporting a heap of 404 errors for the tag pages. These tag pages should not have links to them, so I'm not sure how they're being crawled. Any suggestions from experience in this area would be useful.

    | wearehappymedia
    0

  • As per Google Search Console, my LCP (Largest Contentful Paint) score is low. Please suggest how to improve it. My website URL is https://www.bigrock.in/.

    | wstodayservices
    0

  • We don't want to redirect the page to a different one, as some people still use it; we just don't want it to appear in search. I assume that means something like the noindex sketch below.
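    A rough sketch of the kind of tag I mean (an illustration only, not our actual markup): a robots noindex meta tag in the page's head, which keeps the page live for the people who still use it but asks search engines not to show it.

        <!-- page stays accessible to visitors, but is excluded from search results -->
        <meta name="robots" content="noindex">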

    | TheIDCo
    0

  • Hi, everyone. I hope you are well.
    My question is about Moz DA. I have increased the content on my website, which is related to Best Machete. I've done all the things that Google asks us to do for ranking, and I'm avoiding any wrongdoing. What else do I need to do to increase the Moz DA of my website?
    If anyone knows, please guide me; I would be grateful.

    | Lauragabriel
    0

  • I am losing 1 point of DA per month. What could it be? I have noticed I lost 50K (out of 300K) internal links after a website update; could it be related to that?

    | albertoalchieriefficio
    0

  • Hi, can anyone tell me whether using 2 cache plugins helps, or whether it causes any issues? Also, when I used the W3 cache plugin in WordPress, it flagged an inline CSS issue to be cleared. So I tried Autoptimize, but my website Soc prollect crashed in the middle of using it. Is there any solution, and can anyone tell me which plugin helps speed up the site by handling JavaScript and inline CSS at the same time?

    | nazfazy
    0

  • Hi everyone, I really don't understand why our competitor with lower DR and PA outranks us in Google.lv (Google Latvia). Our company ranks #2 for the keyword "gāzes baloni" in Google. Our DR is 24 and our PA is 26, whereas our competitor's DR is 23 and their PA is 19. The content on our page is much better too - we have a clear title, description, Q&A section, etc., whereas our competitor has very limited content, just photos of the product and titles. Any suggestions would be highly appreciated. Thank you very much in advance.

    | Intergaz
    0

  • Hello friends,
    this is my site: https://www.alihosseini.org/. In Search Console I have a soft 404 error.
    How can I fix this error?
    I use WordPress.

    | industriestaedt
    0

  • Hi, website: www.snackmagic.com. The home page drops out of Google's index for a few hours and then comes back. We are not sure why our home page is getting de-indexed temporarily. This doesn't happen with other pages on our website. It has been happening intermittently every 2-3 days. Any input would be very useful for us in debugging this issue. Thanks

    | manikbystadium
    0

  • Every single one of my posts' meta descriptions is being overwritten with the same meta description! I use Yoast and have proper meta descriptions set on my posts, but something seems to be overriding them. https://ibb.co/rcgWNXb Any idea what could be causing this? Thank you!!! Mike

    | naturalsociety
    1

  • Hey there, My site had terrible categorization. I did a redesign, and essentially decided to start over using Topics instead of categories - which appear as my site's main navigation. Now I need to assign a Topic to all my posts. Is it safe to assign posts to multiple parent Topics from an SEO point of view? I want to do it since it would be helpful for users to find them in multiple locations some of the time, but I certainly don't want any SEO issues. Also, should I de-categorize all of my posts since I'm assigning them to my new hierarchical taxonomy - Topics? This is very important to finalize. Any help or advice is greatly appreciated. Thanks, Mike

    | naturalsociety
    0

  • 40+ pages have been removed from the index, and this page has been selected as the Google-preferred canonical: https://studyplaces.com/about-us/ The pages affected include: https://studyplaces.com/50-best-college-party-songs-of-all-time-and-why-we-love-them/ https://studyplaces.com/15-best-minors-for-business-majors/ As you can see, the content on these pages is totally unrelated to the content on the about-us page. Any ideas why this is happening and how to resolve it?

    | pnoddy
    0

  • Hi, I'm creating 300+ pages in the near future, on which the content will basically be as unique as it can be. However, on every refresh, including visits from a search engine referrer, I want the actual content, such as a listing of 12 businesses, to be displayed in random order on every hit. So basically we have 300+ nearby pages with unique content, and the overview of those "listings", as I might call them, is displayed randomly. I've built an extensive script and disabled caching for these specific PHP pages, and it works. But what about Google? The content of the pages will still be what it is; it's more that the listings are shuffled randomly to give every business listing a fair shot at a click. Does anyone have experience with this? I've tried a few things in the past, like a "Last update [PHP month]" in the title, which sometimes isn't picked up very well.

    | Vanderlindemedia
    0

  • Hey guys! I am looking into a site that has multiple category pages. These have descriptions that include relevant KWs. However, the text appears only on the desktop version, not on mobile. If I inspect the page, I still see the text's HTML tags. The site has been indexed by the smartphone Googlebot. Is there a chance the text's KWs are being detected by Google? I would think adding the text to mobile is the ideal route, but I'd like to see what you think. Thanks!

    | Reprise
    0

  • Dear Experts, I want to make an online store like www.abc.com, and I plan to buy 3 more domains like www.abc.co.uk, www.abc.com.au, and www.abc.ae and redirect them all to the main domain, www.abc.com. But I want that if somebody searches from the UK, they see the www.abc.co.uk domain in the search results, and if somebody searches from the UAE, they see the www.abc.ae domain in the search results, and the same for the other extensions. How can I stay safe from duplication across multiple domains for one website? What SEO strategy should I follow? I am hoping for a positive reply. Thanks
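    From what I've read, this kind of country targeting is usually signalled with hreflang annotations on each version of a page rather than with redirects alone. A rough sketch only, using my placeholder domains (abc.com etc.), not a confirmed setup:

        <link rel="alternate" hreflang="en-us" href="https://www.abc.com/" />
        <link rel="alternate" hreflang="en-gb" href="https://www.abc.co.uk/" />
        <link rel="alternate" hreflang="en-au" href="https://www.abc.com.au/" />
        <link rel="alternate" hreflang="en-ae" href="https://www.abc.ae/" />
        <link rel="alternate" hreflang="x-default" href="https://www.abc.com/" />

    (My understanding is that the same set of tags would need to appear on every domain's version of the page - please correct me if that's wrong.)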

    | jfdagborrbg
    0

  • Excuse the basic question. I host my domain and website on Squarespace. I use a specific theme, and after doing a site crawl Moz picked up that pages and blog posts have 'Missing or Invalid H1' tags (450 issues!). I discovered that my Squarespace theme only uses H2 tags. Is this a serious issue that affects my search visibility? What would you recommend I do to fix this, if anything? I'm starting some SEO and link building, but wanted to see if this is an issue I need to consider. Thanks!!!!

    | twofourseven
    0

  • I recently acquired a business that had a website with great rankings and traffic, but it also had a few physical locations listed in Google. The physical locations closed and are marked as "Permanently Closed" in Google. The website also expired for several months before I picked it up. The site had amazing rankings and traffic previously but is really struggling to regain rankings now. I'm concerned that the site is not performing well because the business is marked as "permanently closed" in GMB, and I'm unable to do anything about that because the business no longer has any physical locations. Any tips on what to do here?

    | shags313
    0

  • Hello, I hope you guys are doing well. I published an article about 2 months ago and tried to make it rich from an SEO standpoint. I also did keyword research and found my competitors, then created a lengthy but meaningful article based on all of them. Here is the link to my article; my targeted keyword is "CSC Scholarship 2020". You can check my domain's DA & PA as well - they are higher than those of the websites ranked at the top of the page. My content is also lengthy, meaningful, and to the point, but my article is still not generating any results for me. Could you please help me investigate the main issue causing this problem?

    | HansiAliya
    1

  • Dear Experts,
    This is Julia and I am an SEO Executive at J com. I would like to ask: if we are building 150+ quality backlinks per month for a website and 30 of those 150 backlinks are deleted each month, would that be OK (natural), or would it affect the website's ranking? I am looking forward to your response.

    | jfdagborrbg
    0

  • Hello - We have a site with over 1,000 tags. We added too many and would like a fresh start, as they are creating a lot of duplicate pages on the site. What is the best way to go about deleting all of these tags without being penalized by Google? Is there a way to tell Google directly to stop crawling them? We would prefer not to have that many pages just sitting as 404 errors on the site. Thank you.

    | FamiliesLoveTravel
    0

  • I have up to 20 million unique pages, and so far I've only submitted about 30k of them in my sitemap. We had a few load-related errors during Google's initial visits, and it thought some pages were duplicates, but we fixed all that. We haven't had a crawl-related error for 2 weeks now. Google appears to be indexing fewer and fewer URLs every time it visits. Any ideas why? I am not sure how to get all our pages indexed if it's going to operate like this... I'd love some help, thanks!

    | RyanTheMoz
    0

  • Hi guys! I have a metadata problem with my home page. If I search for the brand's keyword, the SERPs don't show the metadata I configured; instead they show the URL with sitelinks. If I use the "site:" command, the home page doesn't appear at all. This happens only on the home page, not on the rest of the site, which is roughly 700 pages; those appear fine. I already have a meta title and meta description configured, which include the mentioned KW, and it used to appear correctly before. GSC shows the page as indexed. Most audit tools (configured to crawl JS) detect the metadata; Moz's On-Page tool doesn't. Could it be because of the JS configuration? Or am I missing something else? Here's the meta description code: What do you think? I'd appreciate your input. Thanks!

    | Reprise
    0

  • Our site uses canonicals to address duplicate content issues with product/facet filtering. Example: www.mysite.com/product?color=blue has rel=canonical pointing to www.mysite.com/product. However, our site also uses nofollow for all of the "filters" on a page (so all ?color= etc. links are nofollowed). What is the benefit of using nofollow on the filters if we have the rel=canonical in place? Is this an effort to save crawl budget? Are we giving up possible SEO juice by having the nofollow and not letting the crawler reach the canonical tag and subsequently reference the main page? Or is this just something we should forget about? Roughly, the setup looks like the sketch below.
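    A rough illustration of the setup I'm describing (mysite.com and ?color=blue are placeholder examples, not our real URLs):

        <!-- on www.mysite.com/product?color=blue -->
        <link rel="canonical" href="https://www.mysite.com/product" />

        <!-- filter links on the page -->
        <a href="/product?color=blue" rel="nofollow">Blue</a>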

    | Remke
    0

  • For example: keyword A has a difficulty of 40 and a monthly volume of 100; keyword B has a difficulty of 30 and a monthly volume of 500.

    | calvinkj
    0

  • Hi all, I just read some posts saying my page can get penalized for over-optimizing. I realized my page has quite a lot of H1s (6 now; it had 30) and a lot of "bold" keywords. Does the bolding affect the page's SEO or risk a penalty? The page I'm talking about is palmislander.com/dumaguete-travel-guide. Thanks

    | i3arty
    0

  • When I crawl my site through Moz, it shows lots of pages with duplicate content. The thing is, all of those pages are pagination pages. How should I solve this issue?

    | 100offdeal
    0

  • Hi, I have multiple servers across the UK and USA. I have a website that serves both areas, and I was looking at cloning my site and using GeoDNS to route visitors to the closest server to improve speed and experience. So UK visitors would connect to the UK dedicated server, North American visitors to the New York server, and so on. Is this a good approach, or would it affect SEO negatively? Cheers, Keith

    | Keith-007
    1

  • I found that the link from my YouTube channel to my website is do-follow. I checked with the MozBar, but the link isn't being indexed. Why is that?

    | knoiwtall
    0

  • I have recently redone my entire site (only a few days ago). I believe Google is still re-crawling and updating (however, the amount of movement on other searches has been significant). My buyer's guide is ranking very high for its intended keywords, as well as for the keywords of the category page. Both are at the beginning of the second page, and I wonder if that's dragging me down. What do you think I should do? Is it too early to take action, since everything has just been completely redone?

    | Code2Chil
    0

  • Hi,
    I have a website whose Domain Authority was 10, but today it has dropped to 5. I wanted to know the reason for this change.
    No backlinks to my site have disappeared - if anything there are more - yet my Domain Authority has still dropped to 5.
    Can you guide me? My website, for troubleshooting: bonianservice.com. Thank you very much.

    | 5mizo
    0

  • Hi, I was wondering whether this is negatively influencing SEO. The WooCommerce add-to-cart link is, logically, a 302. However, Moz is flagging that there is a large number of temporary redirects on my site. Do I have to act on this, or just leave it as is? I changed the nofollow to follow, but I'm not sure if this does more harm than good. I would like to hear some input regarding this issue.

    | ruevoliere
    0

  • I have a Weebly site and I put the canonical tag in the header code, but the Moz crawler still says I'm missing the canonical tag. Any tips? What I added looks roughly like the sketch below.
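    For reference, this is roughly the kind of tag I mean (example.com stands in for my real domain; it's an illustration, not my exact code):

        <link rel="canonical" href="https://www.example.com/this-page/" />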

    | ctpolarbears
    0

  • Hi all, we want to speed up our website (hosted on WordPress, traffic around 450,000 page views monthly), and we use lots of images. We're wondering about setting up Cloudflare; however, after searching a bit in Google I have seen some people say the change in IP, or possibly sharing IPs with bad neighbourhoods, can really hit search rankings. So I was wondering what the latest thinking is on this subject: would the increased speed and local server locations be a boost for SEO, more so than a potential loss of rankings from changing IP? Thanks!

    | tiromedia
    1

  • Hi, I didn't find an answer to my question in the forum. I attached an example of a content carousel; this is what I'm talking about. I understand that Google no longer has a problem with tabbed content and accordions (collapsible content). But now I'm wondering about textual carousels. I'm not talking about an image slider; I'm talking about text. Is a text carousel harder for Google to read than plain text or tabs? Of course, I'm not talking about a carousel using Flash. Let's say the code is proper... Thanks for your help.

    | Alviau
    0

  • Hi! As above, I wrote this article on my Medium blog but am now launching my site, UnderstandingJiuJitsu.com. I have the post saved as a draft because I don't want to get dinged by Google. a) How can I get a canonical tag on Medium without importing, and b) is there any issue with claiming the UnderstandingJiuJitsu.com post as the original when the Medium version was posted first? Thanks and health, Elliott

    | OpenMat
    0

  • Hi guys, I'm dealing with a client website where the hreflang tags reference both the language & country code and the language code alone, with the same URL (for French: hreflang="fr-fr" as well as hreflang="fr", both pointing to https://www.website/fr/ihr-besuch/online-tickets). Is this a problem, and should it be corrected so that either the language & country code is referenced, or only the language code? Thanks in advance!
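    In markup it looks roughly like this ("website" standing in for the client's real domain):

        <link rel="alternate" hreflang="fr-fr" href="https://www.website/fr/ihr-besuch/online-tickets" />
        <link rel="alternate" hreflang="fr" href="https://www.website/fr/ihr-besuch/online-tickets" />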

    | Julisn
    0

  • We have completed translating our important pages from English to Spanish on our website. I am confused about whether I should be adding attributes like rel="alternate" and hreflang="es" to the links. On our homepage we have links to our solution pages, and the code looks like this: <a href="https://www.membroz.com/es/club-management-software/">...</a> <a href="https://www.membroz.com/es/salon-management-software/">...</a> <a href="https://www.membroz.com/es/pre-school-management-software/">...</a> Should I add the rel & hreflang attributes to them? It would look something like this: ... <a rel="alternate" hreflang="es" href="https://www.membroz.com/es/salon-management-software/">...</a> <a rel="alternate" hreflang="es" href="https://www.membroz.com/es/pre-school-management-software/">...</a>

    | Krtya
    0

  • I"m trying to fix 4xx errors but I"m not finding the pages in my admin.  Where can I find this page? https://cracklefireplaces.com/collections/ethanol-wall-mounted-fireplaces/products/ignis-maximum-wall-mounted-ethanol-fireplace

    | carlbrekjern
    0

  • Hi all, my client suffered a malware attack a few weeks ago in which an external site somehow created 700-plus pages on my client's site with their content. I removed all of the content and redirected the pages to the home page. I then created a new temporary XML sitemap with those 700 URLs and submitted it to Google 9 days ago. Google has crawled the sitemap a few times but not the individual URLs. When I click on the crawl report for the sitemap in GSC, I see that the individual URLs still show the last crawled date from before they were removed. So in Google's eyes, that old malicious content still exists. What do I do to ensure Google knows the content is gone and redirected? Thanks!

    | sk1990
    0

  • Hi there, my website is ConcertHotels.com - a site which helps users find hotels close to concert venues. I have a hotel listing page for every concert venue on my site - about 12,000 of them, I think (and the same for nearby restaurants), e.g. https://www.concerthotels.com/venue-hotels/madison-square-garden-hotels/304484 Each of these pages lists the hotels near that concert venue. Users clicking on an individual hotel are brought through to a hotel (product) page, e.g. https://www.concerthotels.com/hotel/the-new-yorker-a-wyndham-hotel/136818 I made a decision years ago to noindex all of the /hotel/ pages, since they don't have a huge amount of unique content and aren't the pages I'd like my users to land on. The primary pages on my site are the /venue-hotels/ listing pages. I have similar pages for nearby restaurants, so there are approximately 12,000 venue-restaurants pages - again, one listing page for each concert venue. However, while all of these pages are potentially money-earners, in reality the vast majority of subsequent hotel bookings have come from a fraction of the 12,000 venues. I would say 2,000 venues are key money-earning pages, a further 6,000 have generated income at a low level, and 4,000 are yet to generate income. I have a few related questions: Although there is potential for any of these pages to generate revenue, should I be brutal and simply delete a venue if it hasn't generated revenue within a time period, and just accept that, while it "could" be useful, it hasn't proven to be and isn't worth the link equity? Or should I noindex these poorly performing pages? Should all 12,000 pages be listed in my XML sitemap? Or only the ones that are generating revenue, or perhaps just the ones that have generated significant revenue in the past and have proved to be most important to my business? Thanks, Mike

    | mjk26
    0

  • While doing text Google searches for various keywords, I have found two sites that have scraped pages from my site, which goes by an old URL of www.tpxcnex.com and a new URL of www.tpxonline.com. www.folder.com is one of the sites, and if you try to visit that site or any of the scraped Google index listings, Chrome warns you not to. How can I ask Google to deindex www.folder.com or the other scraper site, or at least deindex the URLs which have clearly scraped my content?

    | DougHartline
    0

  • Hi all. New here, so please be gentle. 🙂 I've developed a new site, where my client also wanted to rebrand from .co.nz to .nz. On the source (co.nz) domain, I've set up a load of 301 redirects to the relevant new page on the new domain (the URL structure is changing as well).
    E.g. on the old domain: https://www.mysite.co.nz/myonlinestore/t-shirt.html
    In the .htaccess on the old/source domain, I've set up 301s (using RewriteRule),
    so that when https://www.mysite.co.nz/myonlinestore/t-shirt.html is accessed, it does a 301 to
    https://mysite.nz/shop/clothes/t-shirt. All these 301s are working fine; I've checked in dev tools and a 301 is being returned. My question is: are the 301s on the source domain alone enough, in regards to starting a 'Change of Address' in Google's Search Console? Their wording indicates it's enough, but I'm concerned that maybe I also need redirects on the target domain as well. I.e. does the Search Console Change of Address process work this way?
    It looks at the source domain URL (that's already in Google's index), sees the 301, then updates the index (and hopefully passes the link juice) to the new URL. Also, I've set up both the source and target Search Console properties as Domain properties. Does that mean I no longer need to specify whether the source and target properties are HTTP or HTTPS? I couldn't see that option when I created the properties. Thanks!

    | WebGuyNZ
    0

  • Hi all, our website has undergone both a redesign (with new URLs) and a migration to HTTPS in recent years. I'm having difficulty ensuring all URLs redirect to the correct version while also preventing redirect chains. Right now everything is redirecting to the correct version, but it usually takes up to two redirects to get there. See below for an example. How do I go about addressing this, or is this not even something I should concern myself with?
    Redirects (2):
    http://www.theyoungfirm.com/blog/2009/index.html
    -> 301 -> https://theyoungfirm.com/blog/2009/index.html
    -> 301 -> https://theyoungfirm.com/blog/
    The code below is what we added to our .htaccess file. Prior to adding this, the various versions (www, non-www, http, etc.) were not redirecting properly. But ever since we added it, it has created these intermediate URLs (the middle URL above) as an extra step before resolving to the correct URL.
    RewriteEngine on
    RewriteCond %{HTTP_HOST} ^www.(.*)$ [NC]
    RewriteRule ^(.*)$ https://%1/$1 [R=301,L]
    RewriteCond %{HTTPS} !on
    RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
    Your feedback is much appreciated. Thanks in advance for your help. Sincerely, Bethany

    | theyoungfirm
    0

  • Hi all, we have recently implemented the schema.org structure for our breadcrumbs. In our breadcrumb we include a link to the homepage. Since implementation I've been receiving the following error in Google Search Console: 'Either "name" or "item.name" should be specified'. The error is being triggered because we don't have itemprop="name" defined for the homepage on each page. Our breadcrumbs look good in search, but I'm wondering if anybody else has experienced this error and what we can do to fix it. Should itemprop="name" be our brand name? Or should we define it as our root domain? A simplified example of what I mean is below. Thanks in advance
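    A simplified sketch of one breadcrumb item with itemprop="name" filled in, the way I understand the microdata (Example Brand and example.com are placeholders, not our real markup):

        <ol itemscope itemtype="https://schema.org/BreadcrumbList">
          <li itemprop="itemListElement" itemscope itemtype="https://schema.org/ListItem">
            <a itemprop="item" href="https://www.example.com/">
              <span itemprop="name">Example Brand</span>
            </a>
            <meta itemprop="position" content="1" />
          </li>
        </ol>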

    | Brando16
    0

  • My site has hundreds of keyword content landing pages that contain one or two sections of "read more" text, which work by re-requesting the page with a ChangeReadMore parameter. This causes each page currently to get indexed 5 times (see the examples below, plus two more with the anchor set to #sectionReadMore2). Google includes the first version of the page, which is the canonical version, and excludes the other 4 versions. Google Search Console says my site has 4.93K valid pages and 13.8K excluded pages. My questions are: 1. Does having a lot of excluded pages which are all copies of included pages hurt my domain authority or otherwise hurt my SEO efforts? 2. Should I add a rel="nofollow" attribute to the read-more link? If I do this, will Google reduce the number of excluded pages? 3. Should I instead add logic so the canonical tag shows the exact URL each time the page is re-displayed in another read-more mode? I assume this would increase my "included pages" and decrease my "excluded pages". Would this somehow help my SEO efforts? EXAMPLE LINKS: https://www.tpxonline.com/Marketplace/Used-AB-Dick-Presses-For-Sale.asp https://www.tpxonline.com/Marketplace/Used-AB-Dick-Presses-For-Sale.asp?ChangeReadMore=More#sectionReadMore1 https://www.tpxonline.com/Marketplace/Used-AB-Dick-Presses-For-Sale.asp?ChangeReadMore=Less#sectionReadMore1
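    For reference, this is roughly the canonical setup I'm describing on the parameterised versions (a sketch, not my exact code):

        <!-- on Used-AB-Dick-Presses-For-Sale.asp?ChangeReadMore=More#sectionReadMore1 -->
        <link rel="canonical" href="https://www.tpxonline.com/Marketplace/Used-AB-Dick-Presses-For-Sale.asp" />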

    | DougHartline
    0

  • Hello there, I'm the owner of the website https://cours-toujours.com/, dedicated to reviewing running shoes. My website is pretty young and I'm currently focused on building new reviews (I keep adding new articles week after week; I have not really focused on the rest of the website for now). Until a few days ago, I saw growing traffic on my website, everything seemed good, and I kept adding new reviews. And then suddenly traffic dropped and went to 0 in 2 days (I went from 550 impressions/day to 49 impressions/day in 2 days :/). When I look in Google Search Console, I don't see any issue: my sitemaps are submitted and the correct number of URLs is reported, I don't have any Manual Action or Security Issue, and I don't have any Removal Request. Everything seems fine... But I can barely find my website in Google search results. When I do a site search (site:cours-toujours.com), I find only 2 pages of results, mostly unimportant pages (categories, etc.). I asked in the Google Community Forums and got this reply about my pages being too similar to one another (https://support.google.com/webmasters/thread/44880689?hl=en), but I'm not really happy with this answer, as all my pages have ~1000 words of unique content (even if, of course, they share the same structure, since they are all dedicated to reviewing a running shoe). Any idea where this might come from and how I can fix the issue?

    | SimonCoursToujours
    0
