Hash URLs
-
Hi Mozzers,
Happy Friday! I have a client that has created some really nice pages from their old content, and we want to redirect the old ones to the new pages. The way the web developers have built these new pages is to use hashbang URLs, for example www.website.co.uk/product#newpage
My question is: can I redirect URLs to these kinds of pages? Would it be done via the .htaccess file?
Thanks in advance,
Karl
-
Just wanted to clear up a bit of confusion. There is a difference between what can be redirected and what will be indexed by search engines.
It is absolutely possible to redirect an old URL to a new one that includes the local anchor (hash). This preserves the user experience: for example, the old "what is matcha" page can be redirected directly to the new "what is matcha" tab, landing the user exactly where they expect to be. This is done in .htaccess as usual, but note that mod_rewrite escapes a literal # in the redirect target (turning it into %23) unless you add the [NE] flag to the rule.
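A minimal sketch of such a rule, using the matcha pages discussed in this thread; the anchor name after the # is hypothetical, so adjust paths and anchor to the actual setup:

```apache
# Sketch only -- paths are from this thread, the anchor name is illustrative.
# [NE] (noescape) is needed so Apache sends the # literally instead of %23.
RewriteEngine On
RewriteRule ^customer/pages/matcha/what-is-matcha$ /tea/matcha_shop#what-is-matcha [R=301,NE,L]
```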
But as Schwaab says, Google will index all the tabs' content as if it were one page. If you look at the page source for any of those tabbed pages, you'll see it's actually one primary page that includes separate sections for each tab - you can use GWT's Fetch as Googlebot to confirm this. So getting the main URL indexed means all the tabs' content is indexed, just not under separate URLs.
Having separate pages each targeting different but related matcha keywords can be beneficial, but so can having a single, longer, authoritative page with many more incoming links (as would be the case if the old separate pages were redirected to one primary page, consolidating all their link authority). That becomes a judgment call, and it's where the "art of SEO" comes into play.
Hope that helps?
Paul
P.S. A little quirk of local anchor URLs: if you're adding parameters to them, such as Google Analytics tracking for incoming links, you need to add the hash after the parameters, or the local anchor won't work. e.g. mysite.com#localanchor becomes mysite.com?utm_source=foo&utm_medium=foo&utm_campaign=bar#localanchor
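The ordering rule can be sketched as a tiny helper (a hypothetical function, just to make the "fragment goes last" point concrete):

```python
from urllib.parse import urlencode

def tagged_url(base, params, fragment):
    """Append tracking parameters, then re-attach the local anchor.

    The fragment must come last: anything after the '#' is treated as
    part of the fragment by the browser and never sent to the server,
    so parameters placed after it would be lost.
    """
    query = urlencode(params)
    url = f"{base}?{query}" if query else base
    return f"{url}#{fragment}" if fragment else url

print(tagged_url(
    "http://mysite.com",
    {"utm_source": "foo", "utm_medium": "foo", "utm_campaign": "bar"},
    "localanchor",
))
# http://mysite.com?utm_source=foo&utm_medium=foo&utm_campaign=bar#localanchor
```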
-
Good luck!
-
I thought that'd be the case! We're trying to get the developers to create unique pages while keeping a similar/same design; not sure if it'll be too difficult though. Thanks for the advice, fingers crossed we'll find a solution.
-
I misunderstood you before, I thought you meant the old URLs had the anchors.
You are correct, technically the tabs are not unique pages. You would have to redirect each of the previous pages to http://www.teapigs.co.uk/tea/matcha_shop rather than to the anchored URL.
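A minimal .htaccess sketch of that, using the old and new URLs mentioned in this thread (one rule per old page, all pointing at the single primary page, with no #fragment on the target):

```apache
# Sketch only -- verify paths against the live site structure.
Redirect 301 /customer/pages/matcha/what-is-matcha /tea/matcha_shop
```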
Having content under tabs may limit your ability to rank for a variety of keywords. For example, if previously there was a page ranking for "What is Matcha?", it may now be difficult to rank for this term because there is no longer a unique page dedicated to the topic. You lose the ability to have a unique URL, Title Tag, Meta Description, H1, and so on.
-
Hi Schwaab,
Thanks for the reply. Google hasn't cached the new pages.
For example, the old page is http://www.teapigs.co.uk/customer/pages/matcha/what-is-matcha and the new content sits on http://www.teapigs.co.uk/tea/matcha_shop with the different tabs. Are we going to have to make them actual pages with static URLs for them to be crawled and indexed? Got a feeling we will!
-
Is the content technically on one page (www.website.co.uk/product) and just being displayed based on the anchor in the URL?
Has Google indexed the anchored URLs? In my experience Google does not index anchored URLs.
I'd love to see an example of how it's coded; however, if they are just anchored URLs displaying content that is all located on one page (the products page), then the products page is the only page you can redirect. Technically, anchored URLs are not unique pages.
If the content is being generated with AJAX and your developers are using the hashbang (#!) method to serve a unique URL, I don't believe you would see a plain hash in the URL.
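For context, the hashbang method relies on Google's AJAX crawling scheme (since deprecated by Google): Googlebot translates a #! URL into an _escaped_fragment_ query it can actually fetch. A rough sketch of that mapping, using an illustrative URL:

```python
from urllib.parse import quote

def escaped_fragment_url(hashbang_url):
    """Translate a #! URL into the URL Googlebot would fetch under the
    (now-deprecated) AJAX crawling scheme; plain URLs pass through."""
    if "#!" not in hashbang_url:
        return hashbang_url  # nothing to translate
    base, _, fragment = hashbang_url.partition("#!")
    sep = "&" if "?" in base else "?"
    return f"{base}{sep}_escaped_fragment_={quote(fragment, safe='')}"

print(escaped_fragment_url("http://www.website.co.uk/product#!newpage"))
# http://www.website.co.uk/product?_escaped_fragment_=newpage
```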