    The Moz Q&A Forum


    Moz Q&A is closed.

    After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies. More details here.

    Trailing Slashes for Magento CMS pages - 2 URLs - Duplicate content

    Intermediate & Advanced SEO
    • iamgreenminded

      Hello,

      Can anyone help me find a solution for fixing and creating Magento CMS pages so that each page uses only one URL instead of two?

      www.domain.com/testpage

      www.domain.com/testpage/

      I found a previous article that applies to my issue: using .htaccess to 301 redirect requests for Magento pages from the non-slash URL to the trailing-slash URL. I don't fully understand the .htaccess syntax, but I used the code below.

      The code below fixed the CMS page redirection but caused issues on other pages, like all my categories and products, with this error:

      "This webpage has a redirect loop

      ERR_TOO_MANY_REDIRECTS"

      # Assuming you're running at domain root. Change to working directory if needed.
      RewriteBase /

      # www check
      # If you're running in a subdirectory, then you'll need to add that in
      # to the redirected url (http://www.mydomain.com/subdirectory/$1)
      RewriteCond %{HTTP_HOST} !^www\. [NC]
      RewriteRule ^(.*)$ http://www.mydomain.com/$1 [R=301,L]

      # Trailing slash check
      # Don't fix direct file links
      RewriteCond %{REQUEST_FILENAME} !-f
      RewriteCond %{REQUEST_URI} !(.*)/$
      RewriteRule ^(.*)$ $1/ [L,R=301]

      # Finally, forward everything to your front-controller (index.php)
      RewriteCond %{REQUEST_FILENAME} !-f
      RewriteCond %{REQUEST_FILENAME} !-d
      RewriteRule .* index.php [QSA,L]
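
      For context on the loop: the "trailing slash check" above appends a slash to every extensionless URL, and if anything else in the stack (commonly Magento's own URL rewrite handling) 301s those same URLs back to the non-slash form, the browser bounces between the two redirects forever. A more defensive sketch, assuming the store runs at the domain root; the excluded path prefixes are assumptions to adjust for your store:

      # Sketch only (not the rule from the article): append the trailing
      # slash, but skip real files and directories, URLs that already end
      # in "/", URLs with a file extension, and path prefixes Magento
      # manages itself (the admin/media/skin/js list is an assumption).
      RewriteCond %{REQUEST_FILENAME} !-f
      RewriteCond %{REQUEST_FILENAME} !-d
      RewriteCond %{REQUEST_URI} !/$
      RewriteCond %{REQUEST_URI} !\.[a-zA-Z0-9]{1,5}$
      RewriteCond %{REQUEST_URI} !^/(admin|media|skin|js)/ [NC]
      RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1/ [L,R=301]

      Even with exclusions like these, the loop will persist if Magento itself strips the slash again, so it is worth checking the store's own redirect behaviour before blaming .htaccess.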

      • iamgreenminded

        301s are not difficult for me, but writing the logic to re-route requests for "URL" to "URL/" is something I don't know how to do. I can manually 301 or rel=canonical my CMS pages in Magento every time, but that defeats the purpose of the automation in .htaccess I am trying to get working.

        thanks

        • iamgreenminded (replying to kwoolf)

          Thank you, Kevin.

          This is almost the default Magento .htaccess file (out of the box); I added a couple of entries to fix some other issues. The code I just added that isn't working is in the middle of the file, in the section commented: "## slash removal re-write done by ALEX MEADE for iamgreenminded.com"

          ############################################
          ## uncomment these lines for CGI mode
          ## make sure to specify the correct cgi php binary file name
          ## it might be /cgi-bin/php-cgi

          #Action php5-cgi /cgi-bin/php5-cgi
          #AddHandler php5-cgi .php

          ############################################
          ## GoDaddy specific options

          #Options -MultiViews

          ## you might also need to add this line to php.ini
          ##     cgi.fix_pathinfo = 1
          ## if it still doesn't work, rename php.ini to php5.ini

          ############################################
          ## this line is specific for 1and1 hosting

          #AddType x-mapp-php5 .php
          #AddHandler x-mapp-php5 .php

          ############################################
          ## default index file

          DirectoryIndex index.php

          ############################################
          ## adjust memory limit

          #php_value memory_limit 64M
          php_value memory_limit 256M
          php_value max_execution_time 18000

          ############################################
          ## disable magic quotes for php request vars

          php_flag magic_quotes_gpc off

          ############################################
          ## disable automatic session start
          ## before autoload was initialized

          php_flag session.auto_start off

          ############################################
          ## enable resulting html compression

          #php_flag zlib.output_compression on

          ###########################################
          ## disable user agent verification to not break multiple image upload

          php_flag suhosin.session.cryptua off

          ###########################################
          ## turn off compatibility with PHP4 when dealing with objects

          php_flag zend.ze1_compatibility_mode Off

          <IfModule mod_security.c>
          ###########################################
          ## disable POST processing to not break multiple image upload

          SecFilterEngine Off
          SecFilterScanPOST Off
          </IfModule>

          ############################################
          ## enable apache served files compression
          ## http://developer.yahoo.com/performance/rules.html#gzip

          # Insert filter on all content
          ###SetOutputFilter DEFLATE

          # Insert filter on selected content types only
          #AddOutputFilterByType DEFLATE text/html text/plain text/xml text/css text/javascript

          # Netscape 4.x has some problems...
          #BrowserMatch ^Mozilla/4 gzip-only-text/html

          # Netscape 4.06-4.08 have some more problems
          #BrowserMatch ^Mozilla/4\.0[678] no-gzip

          # MSIE masquerades as Netscape, but it is fine
          #BrowserMatch \bMSIE !no-gzip !gzip-only-text/html

          # Don't compress images
          #SetEnvIfNoCase Request_URI \.(?:gif|jpe?g|png)$ no-gzip dont-vary

          # Make sure proxies don't deliver the wrong content
          #Header append Vary User-Agent env=!dont-vary

          ############################################
          ## make HTTPS env vars available for CGI mode

          SSLOptions StdEnvVars

          ############################################
          ## enable rewrites

          Options +FollowSymLinks
          RewriteEngine on

          ############################################
          ## slash removal re-write done by ALEX MEADE for iamgreenminded.com

          RewriteBase /
          RewriteCond %{REQUEST_FILENAME} !-f
          RewriteCond %{REQUEST_FILENAME} !-d
          RewriteCond %{REQUEST_URI} !(.*)/$
          RewriteCond %{REQUEST_FILENAME} !\.(gif|jpg|png|jpeg|css|js)$ [NC]
          RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1/ [L,R=301]
          ############################################

          ############################################
          ## you can put here your magento root folder
          ## path relative to web root

          #RewriteBase /magento/

          ############################################
          ## uncomment next line to enable light API calls processing

          #RewriteRule ^api/([a-z][0-9a-z_]+)/?$ api.php?type=$1 [QSA,L]

          ############################################
          ## rewrite API2 calls to api.php (by now it is REST only)

          RewriteRule ^api/rest api.php?type=rest [QSA,L]

          ############################################
          ## workaround for HTTP authorization
          ## in CGI environment

          RewriteRule .* - [E=HTTP_AUTHORIZATION:%{HTTP:Authorization}]

          ############################################
          ## TRACE and TRACK HTTP methods disabled to prevent XSS attacks

          RewriteCond %{REQUEST_METHOD} ^TRAC[EK]
          RewriteRule .* - [L,R=405]

          ############################################
          ## redirect for mobile user agents

          #RewriteCond %{REQUEST_URI} !^/mobiledirectoryhere/.*$
          #RewriteCond %{HTTP_USER_AGENT} "android|blackberry|ipad|iphone|ipod|iemobile|opera mobile|palmos|webos|googlebot-mobile" [NC]
          #RewriteRule ^(.*)$ /mobiledirectoryhere/ [L,R=302]

          ############################################
          ## always send 404 on missing files in these folders

          RewriteCond %{REQUEST_URI} !^/(media|skin|js)/

          ############################################
          ## never rewrite for existing files, directories and links

          RewriteCond %{REQUEST_FILENAME} !-f
          RewriteCond %{REQUEST_FILENAME} !-d
          RewriteCond %{REQUEST_FILENAME} !-l

          ############################################
          ## rewrite everything else to index.php

          RewriteRule .* index.php [L]

          ############################################
          ## Prevent character encoding issues from server overrides
          ## If you still have problems, use the second line instead

          AddDefaultCharset Off
          #AddDefaultCharset UTF-8

          ############################################
          ## Add default Expires header
          ## http://developer.yahoo.com/performance/rules.html#expires

          ExpiresDefault "access plus 1 year"

          ############################################
          ## By default allow all access

          Order allow,deny
          Allow from all

          ###########################################
          ## Deny access to release notes to prevent disclosure of the installed Magento version

          <Files RELEASE_NOTES.txt>
          order allow,deny
          deny from all
          </Files>

          ############################################
          ## If running in cluster environment, uncomment this
          ## http://developer.yahoo.com/performance/rules.html#etags

          #FileETag none

          ## Permanent URL redirect - generated by www.rapidtables.com
          Redirect 301 /thebirdword http://www.thebirdword.com
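
          One way to stop the "slash removal re-write" rule from catching category and product URLs at all is to scope it to the specific CMS pages it is meant to fix, instead of matching every request. A sketch, assuming the CMS pages live at the site root; "testpage" is the example page from the question, and the alternation would need every CMS page name:

          ## Sketch (not part of the stock file): add the trailing slash only
          ## for named CMS pages, so category/product URLs are never touched.
          ## "testpage" is the example from the question; extend as needed.
          RewriteRule ^(testpage)$ http://%{HTTP_HOST}/$1/ [L,R=301]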

          • kwoolf (replying to iamgreenminded)

            You probably have other redirects in your .htaccess, and possibly in your website code. The order of your rewrites also matters. Publish your Apache config and I'll take a look.

            FYI, there are better resources for technical issues than Moz. Most people here are not developers or IT specialists; we're more like SEO strategists and business managers.

            • iamgreenminded

              RewriteEngine On
              RewriteBase /
              RewriteCond %{REQUEST_FILENAME} !-f
              RewriteCond %{REQUEST_URI} !example.php
              RewriteCond %{REQUEST_URI} !(.*)/$
              RewriteRule ^(.*)$ http://domain.com/$1/ [L,R=301]

              I have already found both of the articles you linked; nothing is working. Any code I try gives me the same error on most of my pages:

              "This webpage has a redirect loop

              ERR_TOO_MANY_REDIRECTS"

              Still need a fix for this

              thanks
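
              When every variant produces the same loop, it helps to see which rewrite fires on each hop. A minimal debugging sketch for Apache 2.2-era setups (the log path is an assumption, and these directives belong in the server/vhost config, not .htaccess):

              ## Sketch: temporarily enable mod_rewrite's trace log to watch the
              ## loop (Apache 2.2.x syntax; on Apache 2.4 use
              ## "LogLevel alert rewrite:trace3" instead). Remove once the
              ## offending rule is found.
              RewriteLog "/var/log/apache2/rewrite.log"
              RewriteLogLevel 3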

              • kwoolf

                Yes, server redirects are necessary. Try these solutions to see which one works for you:

                http://ralphvanderpauw.com/seo/how-to-301-redirect-a-trailing-slash-in-htaccess/

                http://enarion.net/web/htaccess/trailing-slash/

                You might also want to consider moving to Nginx. You'll notice amazing speed and stability improvements with Nginx, Redis session cache, Memcached, OPcache, ngx_pagespeed, and Magento cache storage management. I can help much more with Nginx redirects and conf files; I gave up Apache years ago. Sorry I couldn't be of more help.
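
                For reference, the trailing-slash 301 this thread is chasing is a short rule in Nginx. A minimal sketch, assuming Magento's front controller lives at /index.php and that extensionless URIs are the CMS-style pages in question (the server name is a placeholder):

                # Sketch: Nginx equivalent of the trailing-slash 301. URIs that
                # already end in "/" or contain a dot (files, or ".html" catalog
                # URLs) are left alone.
                server {
                    listen 80;
                    server_name www.domain.com;  # placeholder

                    # 301 extensionless URIs that lack a trailing slash.
                    location ~ ^([^.]*[^/])$ {
                        return 301 $scheme://$host$1/;
                    }

                    # Everything else goes to Magento's front controller.
                    location / {
                        try_files $uri $uri/ /index.php?$args;
                    }
                }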
