Hash URLs
-
Hi Mozzers,
Happy Friday! I have a client that has created some really nice pages from their old content, and we want to redirect the old ones to the new pages. The way the web developers have built these new pages is to use hashbang URLs, for example www.website.co.uk/product#newpage
My question is: can I redirect URLs to these kinds of pages? Would I use the .htaccess file to do it?
Thanks in advance,
Karl
-
Just wanted to clear up a bit of confusion. There is a difference between what can be redirected and what will be indexed by search engines.
It is absolutely possible to redirect an old URL to a new one that includes the local anchor (hash). In this way, user experience is preserved: for example, the old "what is matcha" page can be redirected directly to the new "what is matcha" tab, landing the user exactly where they expect to be. This is done in .htaccess as normal, but take care with the # symbol in the target URL: Apache will percent-encode it unless you tell it not to.
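As a sketch (the anchor name here is hypothetical; the old and new paths are taken from later in this thread), such a redirect might look like this in .htaccess. The [NE] (noescape) flag is what stops mod_rewrite from encoding the # as %23:

```apache
RewriteEngine On

# Redirect the old standalone page to the matching tab on the new page.
# [NE] keeps the # from being percent-encoded in the Location header.
RewriteRule ^customer/pages/matcha/what-is-matcha$ /tea/matcha_shop#what-is-matcha [R=301,NE,L]
```

Note that browsers never send the fragment to the server, so this only works when the fragment is in the redirect target; you can't match a fragment on the incoming request.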
But as Schwaab says, Google will index all the tabs' content as if they were one page. If you look at the page source for any of those tabbed pages, you'll see it's actually one primary page that includes separate sections for each tab - you can use GWT's Fetch as Googlebot to confirm this. So getting the main URL indexed means all the tabs' content is indexed, just not under separate URLs.
Having separate pages each targeting different but related matcha keywords can be beneficial, but so can having a single, longer, authoritative page with many more incoming links (as would be the case if the old separate pages were redirected to one primary page, consolidating all their separate link authority). That becomes a judgment call and is where the "art of SEO" comes into play.
Hope that helps?
Paul
P.S. A little quirk of local anchor URLs: if you're adding parameters to them, such as Google Analytics tracking for incoming links, you need to add the hash after the parameters, or the local anchor won't work. e.g. mysite.com#localanchor becomes mysite.com?utm_source=foo&utm_medium=foo&utm_campaign=bar#localanchor
-
Good luck!
-
I thought that'd be the case! We're trying to get the developers to create unique pages while keeping a similar/same design; not sure if it'll be too difficult, though. Thanks for the advice, and fingers crossed we'll find a solution.
-
I misunderstood you before, I thought you meant the old URLs had the anchors.
You are correct, technically the tabs are not unique pages. You would have to redirect each of the previous pages to http://www.teapigs.co.uk/tea/matcha_shop rather than to the anchored URL.
Having content under tabs may limit your ability to rank for a variety of keywords. For example, if previously there was a page ranking for "What is Matcha?", it may now be difficult to rank for this term because there is no longer a unique page dedicated to the topic. You lose the ability to have a unique URL, Title Tag, Meta Description, H1, and so on.
-
Hi Schwaab,
Thanks for the reply. Google hasn't cached the new pages.
For example, the old page is http://www.teapigs.co.uk/customer/pages/matcha/what-is-matcha and the new content sits on http://www.teapigs.co.uk/tea/matcha_shop with the different tabs. Are we going to have to make them actual pages with static URLs for them to be crawled and indexed? Got a feeling we will!
-
Is the content technically on one page (www.website.co.uk/product) and just being displayed based on the anchor in the URL?
Has Google indexed the anchored URLs? In my experience Google does not index anchored URLs.
I'd love to see an example of how it is coded; however, if they are just anchored URLs displaying content that is all located on one page (the products page), then the products page would be the only page you can redirect. Technically, anchored URLs are not unique pages.
If the content is being generated with AJAX and your developers are using the hashbang method to serve a unique URL, I don't believe you would see the hash in the URL.
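For reference, under Google's hashbang-based AJAX crawling scheme, a URL like www.website.co.uk/product#!newpage would be requested by Googlebot as www.website.co.uk/product?_escaped_fragment_=newpage, and the server could answer with a pre-rendered snapshot. A hypothetical .htaccess sketch of that mapping (the snapshot path is an assumption, not something from this thread):

```apache
RewriteEngine On

# Googlebot requests /product?_escaped_fragment_=newpage in place of /product#!newpage.
RewriteCond %{QUERY_STRING} ^_escaped_fragment_=(.+)$
# Serve a pre-rendered HTML snapshot for that fragment.
RewriteRule ^product$ /snapshots/product-%1.html [L]
```

A plain # (without the !) never opts into this scheme, which is why plain anchored URLs aren't crawled as separate pages.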