URL for offline purposes
-
Hi there,
We are going to be promoting one of our products offline, but I don't want to use the original URL for the product page, as it's too long for users to type in. I thought it would be best practice to use a short URL that's easier for the consumer to remember.
My plan:
Replicate the product page and put it on this new short URL. However, this would create a duplicate content issue. Would it be best practice to use a canonical on the new short URL pointing to the original URL, or to use a 301?
Thanks for any help
-
I agree with Matt - as long as your primary internal links are consistent, it's OK to use a short version for offline purposes. The canonical tag is perfectly appropriate for this.
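For reference, a minimal sketch of what that tag would look like in the <head> of the short-URL page (both URLs below are placeholders, not your actual pages):

```html
<!-- On the short page, e.g. http://example.com/promo -->
<!-- Points search engines at the original, longer product URL -->
<link rel="canonical" href="http://example.com/products/original-long-product-page" />
```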
The other option would be to use a third-party shortener that has built-in tracking, like Bit.ly. It uses a 301 redirect but also captures the click data. If you're just doing a test case, this might be easier all around.
-
Well, I am assuming all your site's internal links are already pointing to the original product page, so as long as you don't create any internal links pointing to your duplicate offline-friendly URL, you will be fine implementing it as Dr. Pete instructs. The canonical tag should be placed on every page that is a duplicate of the target page, pointing at that target URL.
-
I read this in Dr. Pete's article on SEOmoz:
Know Your Crawl Paths
Finally, an important reminder – the most important canonical signal is usually your internal links. If you use the canonical tag to point to one version of a URL, but then every internal link uses a different version, you’re sending a mixed signal and using the tag as a band-aid. The canonical URL should actually be canonical in practice – use it consistently. If you’re an outside SEO coming into a new site, make sure you understand the crawl paths first, before you go and add a bunch of tags. Don’t create a mess on top of a mess.
Would this cause an issue with the method I've used?
Also, should I use a canonical on the original URL pointing to itself?
Thanks
-
I don't think you need to remove it, Gary, if that is the case - take a look here for an updated 2012 article on rel="canonical" straight from the horse's mouth
- http://support.google.com/webmasters/bin/answer.py?hl=en&answer=139394
This might help you.
-
Hi,
IMO you can simply disallow the URL with robots.txt. There is no other alternative for this.
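For anyone following along, a minimal robots.txt sketch, assuming the short page lives at /promo (a placeholder path, not the asker's actual URL):

```text
# Block all crawlers from the short offline URL
User-agent: *
Disallow: /promo
```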
Regards,
-
Hi Matt,
I really do not want to create a 301, as I want to see stats in Analytics for this short URL.
I have actually used a canonical; do you recommend removing it and using a disallow in robots.txt?
Thanks.
-
I would create a 301 redirect from your new short URL to your original product page as you are essentially just creating a new path to it and not new content.
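For example, on an Apache server this could be a one-line rule in the site's .htaccess (a sketch with placeholder paths, assuming mod_alias is enabled):

```apache
# Permanently redirect the short offline URL to the original product page
Redirect 301 /promo /products/original-long-product-page
```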
Here is a post about canonicalisation from Matt Cutts - http://www.mattcutts.com/blog/seo-advice-url-canonicalization/
And another useful insight from SEOMoz on how to deal with duplicate content - http://www.seomoz.org/learn-seo/duplicate-content
Hope this helps
Blurbpoint is also correct: his method will also work. Blocking the page in a robots.txt file or using the noindex, nofollow meta tags will also stop duplicate content issues! The downside is that any links your short URL acquires will not pass any link juice, unlike with 301s or canonicalisation.
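For completeness, the meta robots version mentioned above would sit in the <head> of the short page (a minimal sketch):

```html
<!-- Tells compliant crawlers not to index this page or follow its links -->
<meta name="robots" content="noindex, nofollow" />
```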
-
By using the canonical tag we can tell Google which version of a page is the original. Dr. Pete wrote a nice post on this a few days back.
Here is the URL: http://www.seomoz.org/blog/which-page-is-canonical
Hope this solves your concern.
-
Hi there,
I have just read this post:
What is the purpose of the canonical tag in this instance if you can just block that URL in robots.txt?
Thanks
-
If you are thinking of promoting that product offline, you can block that page in your robots.txt file, or alternatively put a noindex, nofollow robots tag on that page. Search engines will not index that page as it's blocked for all bots, so no duplicate content issue will arise.