Hash URLs
-
Hi Mozzers,
Happy Friday! I have a client that has created some really nice pages from their old content, and we want to redirect the old ones to the new pages. The way the web developers have built these new pages is to use hashbang URLs, for example www.website.co.uk/product#newpage
My question is: can I redirect URLs to these kinds of pages? Would it be done using the .htaccess file?
Thanks in advance,
Karl
-
Just wanted to clear up a bit of confusion. There is a difference between what can be redirected and what will be indexed by search engines.
It is absolutely possible to redirect the old URL to the new one that includes the local anchor (hash). In this way, user experience is preserved: for example, the old "what is matcha" page can be redirected directly to the new "what is matcha" tab, landing the user exactly where they expect to be. This is done in .htaccess as normal, but be aware that mod_rewrite percent-encodes the # symbol by default, so you need the [NE] (no escape) flag for the fragment to survive the redirect.
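A sketch of what that rule might look like in .htaccess, using the matcha URLs from this thread (the tab anchor name "what-is-matcha" is an assumption; adjust to whatever anchor the developers actually used):

```apache
RewriteEngine On

# Send the old standalone page to the matching tab on the new page.
# [NE] stops Apache from encoding the # as %23 in the Location header;
# [R=301] makes the redirect permanent; [L] stops further rule processing.
RewriteRule ^customer/pages/matcha/what-is-matcha$ /tea/matcha_shop#what-is-matcha [R=301,NE,L]
```

Note that browsers never send the fragment to the server, so the rule only ever matches on the path; the #what-is-matcha part exists purely in the redirect target the server sends back.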
But as Schwaab says, Google will index all the tabs' content as if it were one page. If you look at the page source for any of those tabbed pages, you'll see it's actually one primary page that includes separate sections for each tab - you can use GWT's Fetch as Googlebot to confirm this. So getting the main URL indexed means all the tabs' content is indexed, just not under separate URLs.
Having separate pages each targeting different but related matcha keywords can be beneficial, but so can having a single, longer, authoritative page with many more incoming links (as would be the case if the old separate pages were redirected to one primary page, consolidating all their separate link authority). That becomes a judgment call, and is where the "art of SEO" comes into play.
Hope that helps?
Paul
P.S. Little quirk of local anchor URLs: if you're adding parameters to them, such as Google Analytics tracking for incoming links, you need to add the hash after the parameters, or the local anchor won't work. e.g. mysite.com#localanchor becomes mysite.com?utm_source=foo&utm_medium=foo&utm_campaign=bar#localanchor
-
Good luck!
-
I thought that'd be the case! We're trying to get the developers to create unique pages while keeping a similar (or the same) design - not sure if it'll be too difficult, though. Thanks for the advice; fingers crossed we'll find a solution.
-
I misunderstood you before, I thought you meant the old URLs had the anchors.
You are correct, technically the tabs are not unique pages. You would have to redirect each of the previous pages to http://www.teapigs.co.uk/tea/matcha_shop rather than to the anchored URL.
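Using the URLs from this thread, each of those redirects could be a simple one-liner in .htaccess (a sketch with mod_alias; the equivalent mod_rewrite rule would work too):

```apache
# Each old standalone content page 301s to the single tabbed page.
# The trailing path is relative to the document root.
Redirect 301 /customer/pages/matcha/what-is-matcha /tea/matcha_shop
```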
Having content under tabs may limit your ability to rank for a variety of keywords. For example, if previously there was a page ranking for "What is Matcha?", it may now be difficult to rank for this term because there is no longer a unique page dedicated to the topic. You lose the ability to have a unique URL, Title Tag, Meta Description, H1, and so on.
-
Hi Schwaab,
Thanks for the reply. Google hasn't cached the new pages.
For example, the old page is http://www.teapigs.co.uk/customer/pages/matcha/what-is-matcha and the new content sits on http://www.teapigs.co.uk/tea/matcha_shop with the different tabs. Are we going to have to make them actual pages with static URLs for them to be crawled and indexed? Got a feeling we will!
-
Is the content technically on one page (www.website.co.uk/product) and just being displayed based on the anchor in the URL?
Has Google indexed the anchored URLs? In my experience Google does not index anchored URLs.
I'd love to see an example to see how it is coded; however, if they are just anchored URLs displaying content that is all located on one page, the products page, then the products page would be the only page you can redirect. Technically, anchored URLs are not unique pages.
If the content is being generated with AJAX and your developers are using the hashbang (#!) method to serve a crawlable version of each view, Googlebot would request an "_escaped_fragment_" version of each URL rather than the one containing the hash.
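For context, Google's AJAX crawling scheme (since deprecated) mapped a hashbang URL like /product#!newpage to a query-string form, /product?_escaped_fragment_=newpage, which the server could answer with a pre-rendered HTML snapshot. A rough .htaccess sketch of that server side (the /snapshots/ path is hypothetical):

```apache
RewriteEngine On

# When the crawler requests the escaped-fragment form of a hashbang URL,
# serve a static HTML snapshot for that view. The trailing ? in the
# substitution discards the original query string.
RewriteCond %{QUERY_STRING} ^_escaped_fragment_=(.+)$
RewriteRule ^product$ /snapshots/%1.html? [L]
```

Whether this applies here depends on the developers having implemented the scheme; if they are using plain local anchors (# without the !), there is nothing for Googlebot to request separately.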