Splitting Page Authority with two URLs for the same page.
-
Hello guys,
My website currently has two different URLs for the same page, and I am under the impression that this setup is dividing my Page Authority and link juice.
We currently have the following page available at both of the URLs below:
www.wbresearch.com/soldiertechnologyusa/home.aspx
www.wbresearch.com/soldiertechnologyusa/

Analysing the page authority and backlinks, I identified that we are splitting our backlinks (links from other sites and social media) and therefore our authority between the two:

"/home.aspx"
PA: 67
Linking Root Domains: 52
Total Links: 272

"/"
PA: 64
Linking Root Domains: 29
Total Links: 128

I am under the impression that if these were consolidated into a single URL we would maximise our backlinks and therefore our Page Authority.
My Question: How can I fix this? Should I set up a 301 redirect from the page "/" to "/home.aspx", thereby passing the authority and link juice of "/" directly to "/home.aspx"?
I'm trying to gather thoughts and ideas on this; any suggestions are much appreciated.
Thanks!
-
Great help.
Thanks both!
-
Hi Joao, yes, a 301 redirect would be preferable to a canonical here. A 301 is more "absolute": it tells search engines that the old URL has permanently moved, so they drop it from their index and pass its signals to the destination. A canonical is more like a piece of advice for search engines.
Canonicals are useful if you don't have the development skills or resources to implement a 301, and they can also be used when it's not practical to add a 301 to lots of web pages.
In short: use a 301 if practical.
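How you implement it depends on your setup. Since the URLs end in .aspx I'm assuming an IIS/ASP.NET site; if the IIS URL Rewrite module is available, a rule along these lines in web.config would 301 the bare directory URL to /home.aspx. This is only a rough sketch - the pattern and paths are assumptions and will need adjusting for your site:

```xml
<!-- Sketch only: assumes the IIS URL Rewrite module is installed and
     that /home.aspx is the version of the page you want to keep. -->
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="Consolidate home page URL" stopProcessing="true">
          <!-- Match requests for the bare directory, with or without a trailing slash -->
          <match url="^soldiertechnologyusa/?$" />
          <!-- Issue a permanent (301) redirect to the preferred URL -->
          <action type="Redirect" url="/soldiertechnologyusa/home.aspx" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```

Whichever version you keep, make sure your internal links and sitemap point at that same URL so all the signals consolidate in one place.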

-
I think it generally depends on the cause of the duplicate. If it's a system issue then you'll forever be creating 301s for your URLs. In that case it's best to avoid having to maintain the 301s and stick with a canonical. With a canonical you are telling the search engine to only index one version of the URL.
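In practice the canonical is just a single line in the head of the duplicate version, pointing at the URL you want indexed. Using your URLs as an example (protocol and exact path assumed):

```html
<!-- Placed in the <head> of the duplicate URL ("/"), pointing at the preferred version -->
<link rel="canonical" href="http://www.wbresearch.com/soldiertechnologyusa/home.aspx" />
```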
Both 301s and canonicals have their uses, but the choice should depend on the issue and what you are trying to achieve. Hope this helps!
Duke
-
Hi Duke and Alice,
Thank you for your both replies. Very helpful.
We currently have a rel="canonical" from the page "/" to "/home.aspx", which should avoid the duplicate content issue.
I have seen mixed opinions on when to use rel="canonical" versus a 301 redirect. I just found a Matt Cutts video about it (http://www.youtube.com/watch?v=zW5UL3lzBOA).
Alice - I take it that, as per the video, it might be better to do a 301 redirect than a rel="canonical". What do you think? Should I leave the rel="canonical" in place or try to move to a 301?
Cheers guys!
-
Hi Joao, some good advice from Duke here. A 301 redirect will solve this duplicate problem and help to consolidate the authority. However, it's worth investigating what caused the problem and whether it is a wider issue, in which case canonicals might be more appropriate. Good luck!
-
Hi Joao,
I think you first need to establish whether those two URLs came about due to a CMS or system issue. I ask this because some CMS systems create duplicate/different URLs for the same page, and the good ones set up a canonical to avoid duplicate content. If it is a system or CMS issue then get a canonical set up. Run a crawl with Screaming Frog to see if it picks up any other duplicate URLs. Currently, your homepage runs the risk of a duplicate content penalty.
If it's not a system-wide issue, then set up a 301 redirect. Pick the home page URL that people will remember easily and can share on social media platforms without part of it being cut off due to length.
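Once the redirect is live, it's worth a quick check that the old URL really returns a 301 and points where you expect - for example from the command line (the URL shown is just your current home page, and the exact response will depend on your setup):

```bash
curl -I http://www.wbresearch.com/soldiertechnologyusa/
# Expect something like:
#   HTTP/1.1 301 Moved Permanently
#   Location: http://www.wbresearch.com/soldiertechnologyusa/home.aspx
```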
All the best
Duke