Missing trailing slash in URL on subpages resulting in Moz PA of 1
-
Even here in the Moz community I am noticing it. Is a trailing slash at the end of a URL really a factor? Does it make a difference? Our website's homepage has a PA of 63 and a DA of 56, but all of our sub-pages are just PA 1, and they have been up for 4 months.
-
The redirect checker website is excellent. Great find!
-
Hope this helps,
Please see: https://github.com/blueprintmrk/htaccess
&
https://github.com/blueprintmrk/htaccess#redirect-using-redirectmatch
Removing "/." from .PHP URLs "win-win"
Alias “Clean” URLs
This snippet lets you use “clean” URLs -- those without a PHP extension, e.g. `example.com/users` instead of `example.com/users.php`.

```apache
RewriteEngine On
RewriteCond %{SCRIPT_FILENAME} !-d
RewriteRule ^([^.]+)$ $1.php [NC,L]
```
Remove Trailing Slash
This snippet will redirect paths ending in slashes to their non-slash-terminated counterparts (except for actual directories), e.g. `http://www.example.com/blog/` to `http://www.example.com/blog`. That is important for SEO, since it’s recommended to have a canonical URL for every page.

```apache
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_URI} (.+)/$
RewriteRule ^ %1 [R=301,L]
```
Force HTTPS
```apache
RewriteEngine on
RewriteCond %{HTTPS} !on
RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

# Note: It’s also recommended to enable HTTP Strict Transport Security (HSTS)
# on your HTTPS website to help prevent man-in-the-middle attacks.
# See https://developer.mozilla.org/en-US/docs/Web/Security/HTTP_strict_transport_security
<IfModule mod_headers.c>
    Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains"
</IfModule>
```
Force HTTPS Behind a Proxy
Useful if you have a proxy in front of your server performing TLS termination.
```apache
RewriteCond %{HTTP:X-Forwarded-Proto} !https
RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```
PS: Check out https://www.nginx.com/products/ -- they are great!
Tom
-
<code class="language-htaccess" style="padding: 2px 6px; border: 0px; border-image-source: initial; border-image-slice: initial; border-image-width: initial; border-image-outset: initial; border-image-repeat: initial; margin: 0px; border-radius: 3px; text-shadow: #ffffff 0px 1px; word-break: normal; word-wrap: normal; tab-size: 4; background-image: initial; background-attachment: initial; background-color: initial; background-size: initial; background-origin: initial; background-clip: initial; background-position: 0px 0px; background-repeat: initial;">Glad I can help Try useing this to check it http://www.redirect-checker.org/index.php </code>
```apache
# Removes trailing slash if not a directory
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.+)/$ /$1 [R=301,L]
```

or

https://css-tricks.com/snippets/htaccess/remove-file-extention-from-urls/

Take the "/" off the end of this https://regex101.com/r/oK8xL9/3, like this: `^((?:\w+/\w+)+)$`. Search & replace might be needed.
<code class="language-htaccess" style="padding: 2px 6px; border: 0px; border-image-source: initial; border-image-slice: initial; border-image-width: initial; border-image-outset: initial; border-image-repeat: initial; margin: 0px; border-radius: 3px; text-shadow: #ffffff 0px 1px; word-break: normal; word-wrap: normal; tab-size: 4; background-image: initial; background-attachment: initial; background-color: initial; background-size: initial; background-origin: initial; background-clip: initial; background-position: 0px 0px; background-repeat: initial;">Hope that helps, Tom</code>
-
Brilliant!
Thank you so much Thomas!!! I will see what I can do about cleaning this all up!
I believe I have located the issue. The redirects are occurring after a base rewrite rule:
```apache
# Rewrite URLs to / from .html. SEO friendly. Added by David Turner 12/26/15
RewriteBase /

# Rewrite requests for index.php to directory to avoid 500 errors when added to paths. Added by David Turner 12/30/15
RewriteCond %{THE_REQUEST} ^.*/index.php
RewriteRule ^(.*)index.php$ /$1 [R=301,L]

# Remove the .html extension
RewriteCond %{THE_REQUEST} ^GET\ (.*).html\ HTTP
RewriteRule (.*).html$ $1 [R=301]

# Remove index and reference the directory
RewriteRule (.*)/index$ $1/ [R=301]

# Remove trailing slash if not a directory
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_URI} /$
RewriteRule (.*)/ $1 [R=301]

# Forward request to html file, but don't redirect (bot friendly)
RewriteCond %{REQUEST_FILENAME}.html -f
RewriteCond %{REQUEST_URI} !/$
RewriteRule (.*) $1.html [L]
```

Moving the 301s above these and cleaning these up a bit should restore the 301 redirects properly and regain Moz PA.
-
Found it: your non-HTTPS .php URL has backlinks, and you are 301 redirecting it to the trailing-slash "/" URL. That then redirects to the non-"/" URL, creating a redirect chain.
You need to redirect the non-HTTPS .php URL directly to the final non-"/" version of the URL. This will give you the domain and page authority that you are missing.
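For illustration only, a minimal .htaccess sketch of a single-hop redirect (this assumes mod_rewrite and uses the URLs from the chain shown below; adjust to the actual setup):

```apache
# Hypothetical sketch: send the legacy .php URL straight to the final
# canonical URL in one 301 hop, skipping the intermediate "/" redirect.
RewriteEngine On
RewriteRule ^hosting-dedicated\.php$ https://www.ultrawebhosting.com/dedicated-servers [R=301,L]
```

Collapsing the chain this way passes the backlink equity through a single 301 instead of two.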
I confirmed the back links using majestic.com
Result
http://www.ultrawebsitehosting.com/hosting-dedicated.php (the URL with backlinks)
301 Moved Permanently
https://www.ultrawebhosting.com/dedicated-servers/
301 Moved Permanently
https://www.ultrawebhosting.com/dedicated-servers (lost PA from being redirected so many times)
200 OK

HTTP Headers
301 Moved Permanently
| Status: | 301 Moved Permanently |
| Code: | 301 |
| Server: | UltraSpeed Hosting by UltraWebHosting.com |
| Date: | Sun, 03 Apr 2016 23:42:41 GMT |
| Content-Type: | text/html; charset=iso-8859-1 |
| Content-Length: | 258 |
| Connection: | close |
| Location: | https://www.ultrawebhosting.com/dedicated-servers/ |
301 Moved Permanently
| Status: | 301 Moved Permanently |
| Code: | 301 |
| Server: | UltraSpeed Hosting by UltraWebHosting.com |
| Date: | Sun, 03 Apr 2016 23:42:47 GMT |
| Content-Type: | text/html; charset=iso-8859-1 |
| Content-Length: | 257 |
| Connection: | close |
| Location: | https://www.ultrawebhosting.com/dedicated-servers |
| X-Cache: | HIT from Backend |
200 OK
| Status: | 200 OK |
| Code: | 200 |
| Server: | UltraSpeed Hosting by UltraWebHosting.com |
| Date: | Sun, 03 Apr 2016 23:42:48 GMT |
| Content-Type: | text/html |
| Content-Length: | 40741 |
| Connection: | close |
| Vary: | Accept-Encoding |
| Last-Modified: | Fri, 01 Apr 2016 02:03:57 GMT |
| Access-Control-Allow-Origin: | * |
| X-Cache: | HIT from Backend |
| Accept-Ranges: | bytes |

Features
This Redirect Checker supports several features like:
- Select different user agents like:
  - Desktop browsers (Chrome, Internet Explorer, Safari, Firefox, ...)
  - Mobile devices (iPad, iPhone, Android, Windows Phone, Kindle, Nokia, ...)
  - Search engine bots (GoogleBot, Google Mobile Bot, Yandex, BingBot, Baidu, Yahoo Slurp, Naver, ...)
- Checking 302 and 301 redirects
- Supports & checks HTTPS redirects
- Checks meta refresh redirects
- Analysis of common JavaScript redirects
- Check and show redirect chains
- Check HTTP headers like Status Code, X-Robots-Tag, and the rel canonical header tag "Link:"
-
I did not show what the PA was when I dropped the "/": it's 0, but when I add it, it is PA 28. See & try it.
Video of what I'm saying http://cl.ly/faXF
-
The 301 has to point to the "/" version; that is the one that shows PA.
I'm about to grab dinner; when I get back I will do a deep crawl of your site and find the problem for you, because it's definitely not a hard issue to figure out, and I will dedicate some time to it.
-
The 301 redirect has existed for 4 months and a day. Why has it not assumed PR with Moz?
-
It's because there are backlinks pointing to the URLs that you redirected (to dedicated-servers, for instance). The others have no backlinks, therefore they do not have any page rank.
-
The original question is whether it is a factor for the trailing slash to not exist, as I am seeing Moz PAs of 1 on these pages after four months.
I appreciate all the rewrites, but this is all common knowledge to me.
-
Were you able to fix the problem? If not, you may want to force a "/" with a `/?$` pattern; that way it will only be forced if needed.
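For example, a minimal mod_rewrite sketch (my assumption, not taken from your current rules) that adds the "/" only when the request is not an existing file and does not already end in one:

```apache
# Hypothetical sketch: append a trailing slash only when it is missing
# and the request is not an existing file.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} !/$
RewriteRule ^(.*)$ /$1/ [R=301,L]
```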
Hope that helps, Tom
-
Hello Thomas,
Thank you for your time.
`Redirect 301 /hosting-dedicated.php https://www.ultrawebhosting.com/dedicated-servers`
has been set since 01/02/16 via .htaccess.
I have removed the duplicate Access-Control-Allow-Origin header, as one was scoped to font extensions and the other to everything.
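For reference, a hedged sketch of what scoping that header to font files only might look like (mod_headers assumed; the extension list is illustrative, not the exact config):

```apache
# Illustrative sketch only: emit Access-Control-Allow-Origin once,
# scoped to web-font requests instead of also being set globally.
<IfModule mod_headers.c>
    <FilesMatch "\.(eot|otf|ttf|woff|woff2)$">
        Header set Access-Control-Allow-Origin "*"
    </FilesMatch>
</IfModule>
```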
-
Try:

```apache
# Rewrite to www
Options +FollowSymLinks
RewriteEngine on
RewriteCond %{HTTP_HOST} ^ultrawebhosting.com [NC]
RewriteRule ^(.*)$ http://www.ultrawebhosting.com/$1 [R=301,NC]

# 301 Redirect Old File
Redirect 301 .php /
```

See http://www.askapache.info//2.3/mod/mod_alias.html#redirectmatch
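If the goal is to strip ".php" from every URL with mod_alias, a hedged RedirectMatch sketch (the pattern and target are my assumptions, not taken from the linked page) might look like:

```apache
# Hypothetical sketch: 301 any *.php request to the same path without
# the extension, using mod_alias RedirectMatch.
RedirectMatch 301 ^/(.*)\.php$ /$1
```

Note that this would also catch URLs like /index.php, so test it before deploying.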
Sorry for all the duplicate stuff -- everything posts that way, which is annoying. Nevertheless, you have to remove the .php from your URLs and redirect them correctly.
Let me know if that helps,
See below
| Purpose | Example URL | Regex formatting |
| --- | --- | --- |
| Include an entire directory but nothing beneath it | http://www.yourdomain.com/shop/ | ^/shop/?$ |
| Include all subdirectories | http://www.yourdomain.com/shop/* | ^/shop/.* |
| Include a single file | http://www.yourdomain.com/shop.php | ^/shop.php |
| Include any file of a specific type | any .php file under /shop/ | ^/shop/.*.php |
-
Look at this http://cl.ly/faXF
compare with
It is still showing up with .php
https://www.ultrawebhosting.com/hosting-dedicated.php needs to 301 to
https://www.ultrawebhosting.com/dedicated-servers/
It's the .php URL -- a different link, with backlinks pointing to it -- that is not redirecting properly. Check it.
-
You have two Access-Control-Allow-Origin headers in the response:
Access-Control-Allow-Origin: *
Access-Control-Allow-Origin: *
Server: UltraSpeed Hosting by UltraWebHosting.com
Date: Sun, 03 Apr 2016 08:06:06 GMT
Content-Type: text/html
Content-Length: 34133
Connection: keep-alive
Vary: Accept-Encoding
Last-Modified: Sat, 26 Mar 2016 05:37:53 GMT
Access-Control-Allow-Origin: *
Access-Control-Allow-Origin: *
X-Cache: HIT from Backend
Accept-Ranges: bytes
Access-Control-Allow-Origin: *
Access-Control-Allow-Origin: *
-
Thank you for providing me with that URL. I will take a look right now.
-
Unfortunately this does not quite answer the question. The structure is by design, but I am having second thoughts after reviewing Moz and seeing this occurrence. Why do all sub-directories which do not end with "/" have a Moz trust of 1? This even occurs here in the community forum. When the DA is 56, and the pages have been around for four months and are all linked from the homepage, shouldn't they have a PA? Is the lack of a trailing slash a factor?
https://www.ultrawebhosting.com
Ex:
https://www.ultrawebhosting.com/about
https://www.ultrawebhosting.com/dedicated-servers
-
Are your subpages subdomains? Or subfolders? I'm going to assume they are subfolders.
If your domain authority changes because of your page, that would be the only thing that would make me think you're talking about a subdomain.
With a PA of 63 and DA of 56, your site will be crawled quickly because it has decent domain authority; but just because your homepage has high page authority does not mean the rest of the site will.
It is not unusual for a brand-new page to have little page authority. You can check whether your forward slash "/" is being forced using Screaming Frog, a redirect mapper, or https://varvy.com/tools/redirects/.
You can then force a "/" or prevent one, depending on what you find, using regex:
Name: Redirect my contact page
Domain: www.domain.com
Source: ^/old-path/contact-us/?$
Destination: /new-path/contact-us/
Redirect type: 301 Permanent

- This Redirect Rule will match a URL of http://www.domain.com/old-path/contact-us -or- http://www.domain.com/old-path/contact-us/
- The variation is because of the regex syntax “/?$”
- The question mark “?” makes the trailing slash optional
- It will also only match the Source if it starts with a “/” (note the caret “^”), and ends with either “s” or “/” (note the ending “$”)
https://wpengine.com/support/regex/
http://stackoverflow.com/questions/16657152/matching-a-forward-slash-with-a-regex
This depends on your server and what language you are using, so I strongly suggest you use a tool to verify your changes before making them.
https://regex101.com/r/oK8xL9/1
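As a sketch only (the paths are the placeholder ones from the example above, not from the actual site), the same optional-trailing-slash idea expressed as an Apache RedirectMatch:

```apache
# Hypothetical sketch: "/?$" lets a single rule match both
# /old-path/contact-us and /old-path/contact-us/ and send them
# to one canonical destination.
RedirectMatch 301 ^/old-path/contact-us/?$ /new-path/contact-us/
```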
I hope this helps,
Tom