URL for offline purposes
-
Hi there,
We are going to be promoting one of our products offline. However, I don't want to use the original URL for this product page, as it's too long for users to type in, so I thought it would be best practice to use a URL that is short and easy for the consumer to remember.
My plan:
Replicate the product page and put it on this new short URL. However, this would create a duplicate content issue. Would it be best practice to use a canonical tag on the new short URL pointing to the original URL, or to use a 301 redirect?
Thanks for any help
-
I agree with Matt - as long as your primary internal links are consistent, it's OK to use a short version for offline purposes. The canonical tag is perfectly appropriate for this.
The other option would be to use a third-party shortener that has built-in tracking, like Bit.ly. It uses a 301 redirect, but also captures the data. If you're just doing a test case, this might be easier all-around.
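For reference, the tag on the short-URL page would look something like this in the page's head (the URLs below are only placeholders for your own pages):
  <!-- in the <head> of the page served at the short URL, e.g. http://www.example.com/widget -->
  <link rel="canonical" href="http://www.example.com/products/original-long-product-page-url/" />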
-
Well, I am assuming all your site's internal links already point to the original product page. In that case, as long as you don't create any internal links pointing to your duplicate offline-friendly URL, you will be fine implementing it as Dr. Pete instructs. A canonical tag should be placed on every page that is a duplicate of the target page, pointing to that target page.
-
I read this in Dr. Pete's article on SEOmoz:
Know Your Crawl Paths
Finally, an important reminder – the most important canonical signal is usually your internal links. If you use the canonical tag to point to one version of a URL, but then every internal link uses a different version, you’re sending a mixed signal and using the tag as a band-aid. The canonical URL should actually be canonical in practice – use it consistently. If you’re an outside SEO coming into a new site, make sure you understand the crawl paths first, before you go and add a bunch of tags. Don’t create a mess on top of a mess.
Would this cause an issue with the method I have used?
Also, should I use a canonical on the original URL pointing to itself?
Thanks
-
I don't think you need to remove it, Gary, if that is the case - take a look at this updated 2012 article on rel="canonical", straight from the horse's mouth:
- http://support.google.com/webmasters/bin/answer.py?hl=en&answer=139394
This might help you.
-
Hi,
IMO you can simply disallow the URL in robots.txt; I don't see a better alternative for this.
Regards,
-
Hi Matt,
I really do not want to create a 301, as I want to see stats in Analytics for this short URL.
I have actually used a canonical; do you recommend removing it and using a disallow in robots.txt instead?
Thanks.
-
I would create a 301 redirect from your new short URL to your original product page, as you are essentially just creating a new path to it rather than new content.
Here is a post about canonicalisation from Matt Cutts - http://www.mattcutts.com/blog/seo-advice-url-canonicalization/
And another useful insight from SEOMoz on how to deal with duplicate content - http://www.seomoz.org/learn-seo/duplicate-content
Hope this helps
Blurbpoint is also correct - his method will also work. Blocking the page in a robots.txt file or using the noindex, nofollow meta tags will also prevent duplicate content issues. The downside is that any links your short URL acquires will not pass any link juice, unlike with 301s or canonicalization.
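If you did decide to go the 301 route later and the site runs on Apache, the redirect is a one-line rule in .htaccess - roughly like this, with /widget standing in for whatever short path you choose (both URLs are placeholders):
  # send the short offline URL to the real product page with a permanent (301) redirect
  Redirect 301 /widget http://www.example.com/products/original-long-product-page-url/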
-
By using the canonical tag we can tell Google which version of the page is the original. Dr. Pete wrote a nice post on it a few days back.
Here is the URL: http://www.seomoz.org/blog/which-page-is-canonical
Hope this solves your concern.
-
Hi there,
I have just read this post:
What is the purpose of the canonical tag in this instance if you can just block that URL in robots.txt?
Thanks
-
If you are thinking of promoting that product offline, you can block that page in your robots.txt file, or alternatively you can add a noindex, nofollow robots meta tag to that page. Search engines will not index that page as it's blocked for all bots, so no duplicate content issue will arise.
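For illustration, blocking the short URL would look roughly like this, with /widget again standing in as a placeholder for your short path - either a rule in robots.txt:
  User-agent: *
  Disallow: /widget
or a robots meta tag in the page's head:
  <meta name="robots" content="noindex, nofollow">
Bear in mind the point made above, though: a blocked or noindexed page won't pass on any link value it picks up, unlike a 301 or a canonical.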