    The Moz Q&A Forum



    Forwarded vanity domains, suddenly resolving to 404 with appended URL's ending in random 5 characters

    Intermediate & Advanced SEO
    • SS.Digital

      We have several vanity domains that forward to various pages on our primary domain.
      e.g. www.vanity.com (301) --> www.mydomain.com/sub-page (200)

      These forwards have been in place for months or even years and have worked fine. As of yesterday, we have seen the following problem. We have made no changes to the forwarding settings.

      Now, inconsistently, they sometimes resolve and sometimes they do not. When we load the vanity URL with Chrome Dev Tools open (Network pane), we see the following redirect chains, where xxxxx represents a random 5-character string of lower- and upper-case letters (e.g. VGuTD).

      EXAMPLE:
      www.vanity.com (302, Found) -->
      www.vanity.com/xxxxx (302, Found) -->
      www.vanity.com/xxxxx (302, Found) -->
      www.vanity.com/xxxxx/xxxxx (302, Found) -->
      www.mydomain.com/sub-page/xxxxx (404, Not Found)

      This is just one example; the number of redirects varies wildly. Sometimes there is only one redirect, sometimes there are as many as five.

      Sometimes the request ultimately resolves on the correct mydomain.com/sub-page, but usually it does not (as in the example above).
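
      For reference, the chain can be captured outside the browser by replaying the request with a short script and logging each hop. Below is a minimal Python sketch using the requests library; www.vanity.com is just a placeholder for one of the affected domains.

          # Minimal sketch: print every hop in the redirect chain for a vanity domain.
          # "http://www.vanity.com" is a placeholder, not a real affected domain.
          import requests

          def trace_redirects(url):
              response = requests.get(url, allow_redirects=True, timeout=10)
              # response.history holds one entry per redirect, in order,
              # similar to Chrome Dev Tools with "Preserve log" checked.
              for hop in response.history:
                  print(f"{hop.status_code} {hop.reason}: {hop.url} -> {hop.headers.get('Location')}")
              print(f"{response.status_code} {response.reason}: {response.url} (final)")

          trace_redirects("http://www.vanity.com")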

      We have cross-checked across every browser and device, private and non-private windows, with cookies cleared, on and off our network, etc. This leads us to believe the problem is not at the device or host level.

      Our registrar is GoDaddy. They have not encountered this issue before and have no idea where this 5-character string comes from. I tend to believe them because, per our analytics, we have determined that this problem only started yesterday.

      Our primary question is: has anybody else encountered this problem, either in the last couple of days or at any time in the past? We have come up with a solution that alleviates the problem, but implementing it across hundreds of vanity domains would take us an inordinate amount of time. We are really hoping to fix the cause of the problem instead of just treating the symptom.

      • SS.Digital (in reply to MikeTek)

        Yes, we have contacted GoDaddy several times.

        GoDaddy has insisted it is not their problem, and they have no advice for resolving the issue. GoDaddy support said there can be strange behavior when forwarding and masking, so we tested removing the masking, but it did not make a difference. Neither does switching between 301 and 302 redirects. I understand the latter should not be used as a workaround, since these responses have different meanings, but we did test it (which also made no difference).

        Check this link for more details:

        https://www.godaddy.com/community/Managing-Domains/My-domain-name-not-resolving-correctly-6-random-characters-are/m-p/64440#M16148

        Others are experiencing the same issue, and somewhere in the thread it was stated that GoDaddy recently rolled out a new system, which likely created this issue. We can trace the issue back to late August 2017 via Google Analytics, Search Console 404s, and testing with Chrome Dev Tools (Network pane with Preserve log checked).

        We would also like to understand why this is happening so we can address the root cause instead of relying on a workaround. This is a significant issue. Unfortunately, GoDaddy is not handling it professionally, which will impact our future business decisions involving GoDaddy.

        • MikeTek

          That's very strange behavior that I have not seen before (and I've had plenty of experience with GoDaddy and their domain forwarding).

          The query-string workaround is interesting and clever, but I'd also be inclined to sort out why this is happening at all and stop it, rather than reworking all the domain forwards around the symptom.

          Have you contacted GoDaddy's shared hosting support? I'm not the biggest GoDaddy fan overall, but their tech support team can be quite helpful in tracking issues like this down.

          • SS.Digital

            It looks like this is a GoDaddy-specific issue that many others are experiencing:

            https://www.godaddy.com/community/Managing-Domains/My-domain-name-not-resolving-correctly-6-random-characters-are/td-p/60782

            At the time of this writing, GoDaddy has not offered an explanation or a resolution. However, a workaround may be to forward the domain with a query string appended; the random six characters are then appended to the query string instead of creating a URL segment that the CMS interprets as a non-existent page and answers with a 404.

            For example, consider:

            www.vanity.com -> www.primary.com?utm_source=forward

            The GoDaddy issue should then resolve via:

            www.primary.com?utm_source=forwardxxxxxx
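
            As a rough sanity check across many vanity domains, you can follow each forward and confirm the final URL returns a 200 on the intended path, with the stray characters confined to the query string. A minimal Python sketch, assuming placeholder domains and a hypothetical vanity-to-path mapping:

                # Rough check that each forward lands on the intended page (placeholder domains).
                import requests
                from urllib.parse import urlparse

                # Hypothetical mapping: vanity domain -> expected path on the primary domain.
                forwards = {
                    "http://www.vanity.com": "/sub-page",
                }

                for vanity_url, expected_path in forwards.items():
                    final = requests.get(vanity_url, allow_redirects=True, timeout=10)
                    parsed = urlparse(final.url)
                    # With the query-string workaround, the random characters should end up
                    # in parsed.query (e.g. utm_source=forwardxxxxxx), not in parsed.path.
                    ok = final.status_code == 200 and parsed.path.rstrip("/") == expected_path
                    print(f"{vanity_url}: {final.status_code} -> {final.url} {'OK' if ok else 'CHECK'}")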

            Alternatively, the forwarding can be handled from the reverse angle if you have access to the hosting account of the primary domain: add the vanity domain as a forwarded domain in something like cPanel or Plesk, pointing it at the primary domain, and then update the GoDaddy A record to point to the primary domain's IP address (and remove any GoDaddy forwarding).

            Or migrate from GoDaddy!

