Vanity URLs and HTTP status codes
-
We have a vanity URL that, as recommended, uses a 301 redirect. However, we've since discovered that the destination URL needs to be updated, which is a problem because most browsers and search engines cache 301 redirects.
Is there a good way to decide when a vanity URL should use a 301 versus a 302/307?
If all vanity URLs should use a 301, what is the proper way to update the destination URL?
Is it a good rule of thumb to use a 302 when the vanity URL is only temporary or could point to a new destination URL down the road, and a 301 for everything else?
Cheers,
-
As you suggested in your last paragraph: if this is temporary, 302-redirect the original destination URL to the new destination URL, and redirect the vanity URL to the new destination as well. If these changes are permanent, make them 301 redirects instead of 302s.
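For anyone who wants to see the difference in practice, here is a minimal sketch of how the two status codes could be issued. Flask is used purely as an illustration (it isn't mentioned in the thread), the /hello mapping is borrowed from the example later in the thread, and the /promo route and its target are made up for the sake of the sketch:

```python
from flask import Flask, redirect

app = Flask(__name__)

# Permanent vanity URL: a 301 tells browsers and search engines they can
# cache the mapping and treat the destination as the canonical location.
@app.route("/hello")
def permanent_vanity():
    return redirect("https://www.bob.com/directory/folder/file.aspx", code=301)

# Temporary vanity URL: a 302 tells clients to keep requesting the vanity
# path, so the destination can be changed later without fighting caches.
@app.route("/promo")  # hypothetical route, not from the thread
def temporary_vanity():
    return redirect("https://www.bob.com/current-promo.aspx", code=302)  # hypothetical target

if __name__ == "__main__":
    app.run()
```

In real deployments these rules often live in the web server or CDN configuration rather than application code, but the status-code choice works the same way.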
-
That partly answers my question. How do you handle the cached redirect from the original 301, which now points to an invalid URL?
Example: www.bob.com/hello currently points to www.bob.com/directory/folder/file.aspx.
It now needs to point to www.bob.com/directory/folder2/file2.aspx.
If browsers and search engines cache the first 301 (since it is meant to be permanent), visitors who have already used the vanity URL will never be passed along to the new destination.
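As a side note, one way to confirm what the server itself returns right now (as opposed to what a browser has already cached) is to request the vanity URL without following redirects and inspect the status code and Location header. A rough sketch with Python's standard library, using the bob.com example above:

```python
import http.client
from urllib.parse import urlparse

def check_redirect(url: str) -> None:
    """Fetch a URL without following redirects and print its status and Location header."""
    parts = urlparse(url)
    conn_cls = http.client.HTTPSConnection if parts.scheme == "https" else http.client.HTTPConnection
    conn = conn_cls(parts.netloc)
    try:
        conn.request("GET", parts.path or "/")
        resp = conn.getresponse()
        print(f"{url} -> {resp.status} {resp.reason}, Location: {resp.getheader('Location')}")
    finally:
        conn.close()

check_redirect("http://www.bob.com/hello")
```

This only shows what new visitors and re-crawling bots will see; it cannot clear a 301 that a returning visitor's browser has already cached, which is exactly the problem described above.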
-
Let me make sure I understand. You have a vanity URL, something like bit.ly, that redirects to your website, say bitly.com, and that redirect is a 301 (permanent).
Are you asking what to do with the redirect if bitly.com changes? That's how I understood the question. So say bitly.com now moves to bitly.com/new, or something along those lines.
If that's the case, all you need to do is change the 301 redirect from bit.ly so it points at the new destination URL, and keep it a 301. That is, unless bitly.com/new is only a temporary URL; if it will be reverting back to bitly.com, don't do that.
Instead, redirect bitly.com to bitly.com/new with a 302, keeping the 301 from bit.ly to bitly.com intact. Hopefully that answers your question. Let me know if your scenario is different.
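To make that concrete, here is a rough sketch of the two options, again using Flask only as an illustration and the www.bob.com paths from the example above (in reality you would pick one option, and the redirects might live in your web server or CDN config rather than application code):

```python
from flask import Flask, redirect

app = Flask(__name__)

NEW_PAGE = "https://www.bob.com/directory/folder2/file2.aspx"

# Option A - the move is permanent: point the vanity 301 straight at the new page.
@app.route("/hello")
def vanity_permanent():
    return redirect(NEW_PAGE, code=301)

# Option B - the new page is temporary: leave the vanity's existing 301 to the old
# page alone, and have the old page 302 to the new one so that hop isn't cached.
@app.route("/directory/folder/file.aspx")
def old_page_temporary():
    return redirect(NEW_PAGE, code=302)
```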