Missing trailing slash in URL on subpages resulting in Moz PA of 1
-
Even here in the Moz community I am noticing it. Is it really a factor to have an ending slash on the page URL? Does it make a difference? Our website has a homepage PA of 63 and DA of 56, but all of our sub-pages are just 1, and they have been up for 4 months.
-
The redirect checker website is excellent. Great find!
-
Hope this helps,
Please see: https://github.com/blueprintmrk/htaccess
&
https://github.com/blueprintmrk/htaccess#redirect-using-redirectmatch
Removing "/." from .PHP URLs "win-win"
Alias “Clean” URLs
This snippet lets you use "clean" URLs -- those without a PHP extension, e.g. example.com/users instead of example.com/users.php.

RewriteEngine On
RewriteCond %{SCRIPT_FILENAME} !-d
RewriteRule ^([^.]+)$ $1.php [NC,L]
Remove Trailing Slash
This snippet will redirect paths ending in slashes to their non-slash-terminated counterparts (except for actual directories), e.g. http://www.example.com/blog/ to http://www.example.com/blog. That is important for SEO, since it's recommended to have a canonical URL for every page.

RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_URI} (.+)/$
RewriteRule ^ %1 [R=301,L]
Force HTTPS
RewriteEngine on
RewriteCond %{HTTPS} !on
RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

# Note: It's also recommended to enable HTTP Strict Transport Security (HSTS)
# on your HTTPS website to help prevent man-in-the-middle attacks.
# See https://developer.mozilla.org/en-US/docs/Web/Security/HTTP_strict_transport_security
<IfModule mod_headers.c>
Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains"
</IfModule>
Force HTTPS Behind a Proxy
Useful if you have a proxy in front of your server performing TLS termination.
RewriteCond %{HTTP:X-Forwarded-Proto} !https
RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
PS: check out https://www.nginx.com/products/ -- they are great!

Tom
-
<code class="language-htaccess" style="padding: 2px 6px; border: 0px; border-image-source: initial; border-image-slice: initial; border-image-width: initial; border-image-outset: initial; border-image-repeat: initial; margin: 0px; border-radius: 3px; text-shadow: #ffffff 0px 1px; word-break: normal; word-wrap: normal; tab-size: 4; background-image: initial; background-attachment: initial; background-color: initial; background-size: initial; background-origin: initial; background-clip: initial; background-position: 0px 0px; background-repeat: initial;">Glad I can help Try useing this to check it http://www.redirect-checker.org/index.php </code>
# removes trailing slash if not a directory
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.+)/$ /$1 [R=301,L]

or

https://css-tricks.com/snippets/htaccess/remove-file-extention-from-urls/

Take the / off the end of this: https://regex101.com/r/oK8xL9/3 -- like this: ^((?:\w+/\w+)+)$

Search & replace might be needed.
<code class="language-htaccess" style="padding: 2px 6px; border: 0px; border-image-source: initial; border-image-slice: initial; border-image-width: initial; border-image-outset: initial; border-image-repeat: initial; margin: 0px; border-radius: 3px; text-shadow: #ffffff 0px 1px; word-break: normal; word-wrap: normal; tab-size: 4; background-image: initial; background-attachment: initial; background-color: initial; background-size: initial; background-origin: initial; background-clip: initial; background-position: 0px 0px; background-repeat: initial;">Hope that helps, Tom</code>
-
Brilliant!
Thank you so much Thomas!!! I will see what I can do about cleaning this all up!
I believe I have located the issue. The redirects are occurring after a base rewrite rule:
# Rewrite URLs to / from .html. SEO friendly. Added by David Turner 12/26/15
RewriteBase /

# Rewrite requests for index.php to directory to avoid 500 errors when added to paths. Added by David Turner 12/30/15
RewriteCond %{THE_REQUEST} ^.*/index.php
RewriteRule ^(.*)index.php$ /$1 [R=301,L]

# remove the .html extension
RewriteCond %{THE_REQUEST} ^GET\ (.*).html\ HTTP
RewriteRule (.*).html$ $1 [R=301]

# remove index and reference the directory
RewriteRule (.*)/index$ $1/ [R=301]

# remove trailing slash if not a directory
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_URI} /$
RewriteRule (.*)/ $1 [R=301]

# forward request to html file, but don't redirect (bot friendly)
RewriteCond %{REQUEST_FILENAME}.html -f
RewriteCond %{REQUEST_URI} !/$
RewriteRule (.*) $1.html [L]

Moving the 301s above these and cleaning these up a bit should restore the 301 redirects properly and regain Moz PA.
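For illustration, a minimal sketch of that reordering, assuming the specific legacy-URL 301 (the hosting-dedicated.php redirect discussed later in this thread) lives in the same .htaccess; it is written as a RewriteRule so its position relative to the other mod_rewrite rules actually matters:

# Sketch only: specific one-hop 301s first, generic cleanup rules after.
RewriteEngine On
RewriteBase /

# Legacy .php URL straight to the final non-slash HTTPS URL in a single hop
# (URL taken from the discussion below; adjust to your real rules)
RewriteRule ^hosting-dedicated\.php$ https://www.ultrawebhosting.com/dedicated-servers [R=301,L]

# Generic cleanup rules follow (.html removal, trailing-slash removal, etc.)
RewriteCond %{THE_REQUEST} ^GET\ (.*)\.html\ HTTP
RewriteRule (.*)\.html$ $1 [R=301,L]

RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_URI} /$
RewriteRule (.*)/ $1 [R=301,L]

RewriteCond %{REQUEST_FILENAME}.html -f
RewriteCond %{REQUEST_URI} !/$
RewriteRule (.*) $1.html [L]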
-
Found it: your non-HTTPS .php URL has backlinks, and you are 301 redirecting it to the "/" URL, which then redirects to the non-/ URL, creating a redirect chain.

You need to redirect the non-HTTPS .php URL directly to the non-/ version of the URL. This will give you the domain and page authority that you are missing.
I confirmed the back links using majestic.com
Result:

http://www.ultrawebsitehosting.com/hosting-dedicated.php (has backlinks)
301 Moved Permanently ->
https://www.ultrawebhosting.com/dedicated-servers/
301 Moved Permanently ->
https://www.ultrawebhosting.com/dedicated-servers
200 OK

The final URL lost PA because it was redirected so many times.

HTTP Headers
301 Moved Permanently
| Status: | 301 Moved Permanently |
| Code: | 301 |
| Server: | UltraSpeed Hosting by UltraWebHosting.com |
| Date: | Sun, 03 Apr 2016 23:42:41 GMT |
| Content-Type: | text/html; charset=iso-8859-1 |
| Content-Length: | 258 |
| Connection: | close |
| Location: | https://www.ultrawebhosting.com/dedicated-servers/ |
301 Moved Permanently
| Status: | 301 Moved Permanently |
| Code: | 301 |
| Server: | UltraSpeed Hosting by UltraWebHosting.com |
| Date: | Sun, 03 Apr 2016 23:42:47 GMT |
| Content-Type: | text/html; charset=iso-8859-1 |
| Content-Length: | 257 |
| Connection: | close |
| Location: | https://www.ultrawebhosting.com/dedicated-servers |
| X-Cache: | HIT from Backend |
200 OK
| Status: | 200 OK |
| Code: | 200 |
| Server: | UltraSpeed Hosting by UltraWebHosting.com |
| Date: | Sun, 03 Apr 2016 23:42:48 GMT |
| Content-Type: | text/html |
| Content-Length: | 40741 |
| Connection: | close |
| Vary: | Accept-Encoding |
| Last-Modified: | Fri, 01 Apr 2016 02:03:57 GMT |
| Access-Control-Allow-Origin: | * |
| X-Cache: | HIT from Backend |
| Accept-Ranges: | bytes |

Features
This Redirect Checker supports several features like:

- Select different User Agents, like:
  - Desktop browsers (Chrome, Internet Explorer, Safari, Firefox, ...)
  - Mobile devices (iPad, iPhone, Android, Windows Phone, Kindle, Nokia, ...)
  - Search engine bots (GoogleBot, Google Mobile Bot, Yandex, BingBot, Baidu, Yahoo Slurp, Naver, ...)
- Checking 302 and 301 redirects
- Supports & checks HTTPS redirects
- Checks meta refresh redirects
- Analysis of common JavaScript redirects
- Check and show redirect chains
- Check HTTP headers like Status Code, X-Robots-Tag, rel canonical header tag "Link:"
-
I did not show what the PA was when I dropped the /: it's 0, but when I add the slash the PA is 28. See and try it for yourself.

Video of what I'm saying: http://cl.ly/faXF
-
The 301 has to point to the / URL; that version shows PA.

I'm about to grab dinner. When I get back I will do a deep crawl of your site and find the problem for you, because it's definitely not a hard issue to figure out, and I will dedicate some time to it.
-
The 301 redirect has existed for 4 months and a day. Why has the page not picked up PA in Moz?
-
It's because there are backlinks pointing to the URLs that you redirected (to /dedicated-servers, for instance). The others have no backlinks, therefore they do not have any page rank.
-
The original question is whether it is a factor for the trailing slash to not exist, as I am seeing Moz PAs of 1 on these pages after four months.

I appreciate all the rewrites, but this is all common knowledge to me.
-
Were you not able to fix the problem? If not, you may want to force a / using a /?$ pattern, so the slash is only forced when needed; a rough sketch follows below.
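A minimal, hedged sketch in .htaccess (assuming Apache mod_rewrite; the /?$ simply makes an existing trailing slash optional so the rule does not double it):

RewriteEngine On

# Only act when the URL does not already end in a slash and is not a real file
RewriteCond %{REQUEST_URI} !/$
RewriteCond %{REQUEST_FILENAME} !-f
# The optional /?$ keeps the capture clean whether or not a slash was supplied
RewriteRule ^(.+?)/?$ /$1/ [R=301,L]

Whether you force the slash on or off matters less than picking one form and redirecting the other to it in a single hop.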
Hope that helps, Tom
-
Hello Thomas,
Thank you for your time.
Redirect 301 /hosting-dedicated.php https://www.ultrawebhosting.com/dedicated-servers
has been set since 01/02/16 via .htaccess
I have removed the duplicate Access-Control-Allow-Origin header; one was applied to font extensions and the other to everything.
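Something along these lines (a hedged sketch, not the actual config; assumes mod_headers, and the extensions are illustrative) keeps a single header scoped to font files only:

<IfModule mod_headers.c>
  # Send the CORS header only for web-font requests, not for every response
  <FilesMatch "\.(woff2?|ttf|otf|eot|svg)$">
    Header set Access-Control-Allow-Origin "*"
  </FilesMatch>
</IfModule>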
-
Try:

# Rewrite to www
Options +FollowSymLinks
RewriteEngine on
RewriteCond %{HTTP_HOST} ^ultrawebhosting\.com [NC]
RewriteRule ^(.*)$ http://www.ultrawebhosting.com/$1 [R=301,NC]

# 301 redirect the old .php file (Redirect takes a path prefix, not a pattern, so use RedirectMatch)
RedirectMatch 301 ^/(.*)\.php$ /$1

See http://www.askapache.info//2.3/mod/mod_alias.html#redirectmatch
Sorry for all the duplicate stuff; everything posts that way, which is annoying. Nevertheless, you have to remove the .php from your site's URLs and redirect them correctly.

Let me know if that helps.
See below:

| Purpose | Example URL | Regex |
| Include an entire directory but nothing beneath it | http://www.yourdomain.com/shop/ | ^/shop/?$ |
| Include all subdirectories | http://www.yourdomain.com/shop/* | ^/shop/.* |
| Include a single file | http://www.yourdomain.com/shop.php | ^/shop.php |
| Include any file of a specific type | any .php file under /shop/ | ^/shop/.*.php |
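As a hedged illustration of how the first pattern above might be used in .htaccess (paths are placeholders, not from this thread):

# Redirect /shop or /shop/ (but nothing beneath it) to a hypothetical new path
RedirectMatch 301 ^/shop/?$ /store/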
-
Look at this: http://cl.ly/faXF and compare.

It is still showing up with .php. https://www.ultrawebhosting.com/hosting-dedicated.php needs to 301 to https://www.ultrawebhosting.com/dedicated-servers/

It's the .php URL (a different link, which has backlinks pointing to it) that is not properly pointing to the final page. Check it.
-
You have 2 duplicate Access-Control-Allow-Origin headers:
Access-Control-Allow-Origin: *
Access-Control-Allow-Origin: *
Server: UltraSpeed Hosting by UltraWebHosting.com
Date: Sun, 03 Apr 2016 08:06:06 GMT
Content-Type: text/html
Content-Length: 34133
Connection: keep-alive
Vary: Accept-Encoding
Last-Modified: Sat, 26 Mar 2016 05:37:53 GMT
Access-Control-Allow-Origin: *
Access-Control-Allow-Origin: *
X-Cache: HIT from Backend
Accept-Ranges: bytes
Access-Control-Allow-Origin: *
Access-Control-Allow-Origin: *
-
Thank you for providing me with that URL. I will take a look right now.
-
Unfortunately this does not quite answer the question. The structure is by design, but I am having second thoughts after reviewing Moz and seeing this occurrence. Why do all sub-pages which do not end with / have a Moz trust of 1? This even occurs here in the community forum. When the DA is 56, the pages have been around for four months, and they are all linked from the homepage, shouldn't they have a PA? Is the lack of a trailing slash a factor?

https://www.ultrawebhosting.com

Ex:

https://www.ultrawebhosting.com/about
https://www.ultrawebhosting.com/dedicated-servers

-
Are your subpages subdomains or subfolders? I'm going to assume they are subfolders.

If your domain authority changes because of your page, that would be the only thing that would make me think you're talking about a subdomain.

With a PA of 63 and DA of 56, your site will be crawled quickly because it has decent domain authority, but just because your homepage has high page authority does not mean the rest of the site will.

It is not unusual for a brand-new page to have little page authority. You can check whether your forward slash "/" is being forced using Screaming Frog, a redirect mapper, or https://varvy.com/tools/redirects/

You can then force a "/" or prevent one, depending on what you find, using regex:
Name: Redirect my contact page
Domain: www.domain.com
Source: ^/old-path/contact-us/?$
Destination: /new-path/contact-us/
Redirect type: 301 Permanent

- This redirect rule will match a URL of http://www.domain.com/old-path/contact-us or http://www.domain.com/old-path/contact-us/
- The variation is because of the regex syntax "/?$"
- The question mark "?" makes the trailing slash optional
- It will also only match the Source if it starts with a "/" (note the caret "^") and ends with either "s" or "/" (note the ending "$")
https://wpengine.com/support/regex/
http://stackoverflow.com/questions/16657152/matching-a-forward-slash-with-a-regex
This depends on your server and what language you are using, so I strongly suggest you use a tool to verify your changes before making them.
https://regex101.com/r/oK8xL9/1
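If the rule has to live in .htaccess rather than a hosting panel, a hedged mod_rewrite equivalent of the example rule above might look like this (paths are the same illustrative placeholders):

RewriteEngine On
# Match /old-path/contact-us with or without the trailing slash (the /?$ makes it optional)
RewriteRule ^old-path/contact-us/?$ /new-path/contact-us/ [R=301,L]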
I hope this helps,
Tom