Index.php duplicate content
-
Hi, new here.
I'm looking for some help with my .htaccess file.
index.php is showing duplicate content errors.
I've managed to use the following code to remove the www part of the URL:
<IfModule mod_rewrite.c>
RewriteCond %{HTTPS} !=on
RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
RewriteRule ^ http://%1%{REQUEST_URI} [R=301,L]
</IfModule>
But how can I redirect mysite.com/index.php and mysite.com/ to mysite.com?
Please help.
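For reference, a common approach to this (an untested sketch, with `mysite.com` standing in for the real domain) is to match the literal `index.php` in `%{THE_REQUEST}`, so the rule only fires on direct browser requests and not on internal rewrites:

```apache
RewriteEngine On
# %{THE_REQUEST} holds the raw request line (e.g. "GET /index.php HTTP/1.1"),
# so this condition matches only what the browser actually asked for,
# not URLs produced by later internal rewrites.
RewriteCond %{THE_REQUEST} ^[A-Z]{3,}\s/+index\.php[\s?] [NC]
RewriteRule ^index\.php$ http://mysite.com/ [R=301,L]
```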
-
Of course! Glad I could help.
-
Great, thank you so much.
Also, another interesting person to follow on Twitter.
-
Once your trial runs out, you won't be able to respond, but at least you'll still be able to see my answer!
I think you can use the .htaccess file redirect command:
Redirect 301 /location/from/root/file.ext http://www.othersite.com/new/file/location.xyz
So, it should be:
Redirect 301 /index.php http://mysite.com
Redirect 301 / http://mysite.com
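One caveat worth noting: mod_alias's `Redirect` is prefix-based, so `Redirect 301 /` would catch every URL on the site and loop. `RedirectMatch` takes a regex instead, so the pattern can be anchored to hit only the exact path (a sketch, again using `mysite.com` as the placeholder domain):

```apache
# RedirectMatch uses a regular expression, so ^/index\.php$ matches
# only /index.php itself, not every path that starts with it.
RedirectMatch 301 ^/index\.php$ http://mysite.com/
```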
If that doesn't work, here's someone with the same problem on StackExchange.
P.S. Glad you like the Distilled course.
-
Hi Kristina,
No, unfortunately my question was misunderstood.
It wasn't the file extension I was looking to hide.
I was looking to redirect mysite.com/index.php to mysite.com/ or mysite.com via an .htaccess file.
I'm currently working through Distilled's online course (nice course, btw). I was hoping to find an answer here: http://www.distilled.net/u/technical/#technical-duplicate-content (Homepage Canonicalization), but I couldn't find one.
This is not a critical question, as I'm just tinkering around on friends' sites, but I would like to learn this.
Unfortunately my free SEOmoz trial runs out today, and I'm waiting until I have completed Distilled's courses before I renew my Moz subscription so I can make better use of the Moz tools, so I may not be able to see your reply.
Is it cool to tweet @ you?
-
Hey David,
Just wanted to follow up with you on this - did TextMarketing's method work?
-
<code>## hide .php extension
# To externally redirect /dir/foo.php to /dir/foo
RewriteCond %{THE_REQUEST} ^[A-Z]{3,}\s([^.]+)\.php [NC]
RewriteRule ^ %1 [R,L,NC]</code>
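The snippet above only handles the outbound redirect, so extensionless URLs would 404 unless something maps them back. A commonly paired rule (a sketch, not part of the original answer) internally rewrites the clean URL to the real .php file when that file exists:

```apache
# Internal companion to the external redirect above: if a .php file
# matching the extensionless request exists on disk, silently rewrite
# to it so the clean URL still serves content.
RewriteCond %{REQUEST_FILENAME}.php -f
RewriteRule ^([^.]+)$ $1.php [L]
```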
Related Questions
-
Content incorrectly being duplicated on microsite
So bear with me here, as this is probably a technical issue and I am not that technical. We have a microsite for one of our partner organisations, and recently we have detected content from our main site appearing in the URLs for the microsite, both in search results and when you click through from the SERP. However, this content does not exist on the actual website at all. Anyone have a possible explanation for this? I have tried searching the web but found nothing. I assume there is something in the setup of the microsite that is associating it with the content on the main site.
Technical SEO | Discovery_SA
-
Duplicate content on job sites
Hi, I have a question regarding job boards. Many job advertisers will upload the same job description to multiple websites e.g. monster, gumtree, etc. This would therefore be viewed as duplicate content. What is the best way to handle this if we want to ensure our particular site ranks well? Thanks in advance for the help. H
Technical SEO | HiteshP
-
Duplicate content / title caused by CAPITALS
What is the best way to stop duplicate content warnings (and Google classing them as duplicate content) when they are caused by capitals (i.e. www.domain.com/Directory and www.domain.com/directory)? I try to always use lower case (unless it's a place name, in which case I capitalise the first letter), but it looks like I have slipped up and got some mixed up, and other sites will also be linking to the capitalised versions. Thanks, Jon
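For what it's worth, one common way to force lowercase URLs (a sketch, and note it has to live in the server or virtual-host config rather than .htaccess, because `RewriteMap` isn't allowed in per-directory context) uses Apache's built-in `tolower` internal map:

```apache
RewriteEngine On
# Define a rewrite map backed by Apache's built-in lowercasing function.
RewriteMap lc int:tolower
# If the requested path contains any uppercase letter, 301-redirect
# to the lowercase version of the same path.
RewriteCond %{REQUEST_URI} [A-Z]
RewriteRule (.*) ${lc:$1} [R=301,L]
```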
Technical SEO | jonny512379
-
Index.php and 301 redirect with Joomla
Hi, I'm running Joomla 1.7 with SEF on and I'm trying to do an htaccess redirect which fails. I have approximately 100 in effect so far, all working fine, but I have one snag: index.php is not working as I need it to when it's redirected to www.myurl.com/. If I turn on the index.php redirect to root using this code:
#index.php to root
RewriteCond %{HTTP_HOST} ^myurl.com$ [OR]
RewriteCond %{HTTP_HOST} ^www.myurl.com$
RewriteRule ^index.php$ "http://www.myurl.com/" [R=301,L]
and then go to www.myurl.com/test.html, I'm redirected to the homepage. I think this is because all pages are index.php in Joomla. SEOmoz and Google both think that index.php and the root are duplicate pages. Does anyone have any advice for overcoming this? Thanks, Adam
Technical SEO | NaescentAdam
-
Link Structure & Duplicate Content
I am struggling with how I should handle the link structure on my site. Right now most of my pages are like this: Home -> Department -> Service Groups -> Content Page For Example: Home -> IT Solutions -> IT Support & Managed Services -> IT Support Home -> IT Solutions -> IT Support & Managed Services -> Managed Services Home -> IT Solutions -> IT Support & Managed Services -> Help Desk Services Home -> IT Solutions -> Virtualization & Data Center Solutions -> Virtualization Home -> IT Solutions -> Virtualization & Data Center Solutions -> Data Center Solutions This structure lines up with our business and makes logical sense but I am not sure how to handle the department and service group pages. Right now you can click them and it just brings you to a page with a small snippet for the links below. The real content is on the content pages. What I am worried about is that the snippets on those pages are just a paragraph or two of the content that's on the content page. Will this hurt me and get considered duplicate content? What is the best practice for dealing with this? Those department/service group pages have some good content on them but it's just parts of other pages. Am I okay doing this because there are not direct duplicates of other pages just parts of a few pages? Any help on this would be great. Thanks in advance.
Technical SEO | ZiaTG
-
Duplicate content error from url generated
We are getting a duplicate content error, with "online form/" being returned numerous times. Upon inspecting the code, we found we are calling an input form via jQuery, which is initially called by something like this: Opens Form. Why would this be causing it to amend the URL and be crawled?
Technical SEO | pauledwards
-
Duplicate Page title - PHP Experts!
After running crawl diagnostics I was surprised to see 336 duplicate page titles. But I am wondering whether it is true or not. Most of them are not a page at all but a .php variation. For example, the following are all the same page, just with a different limit on viewing listings (limiting your view to 5, 10, 15, 20, or 25 as you choose):
.com/?lang=en&limit=5
.com/?lang=en&limit=5&limitstart=10
.com/?lang=en&limit=5&limitstart=15
.com/?lang=en&limit=5&limitstart=20
.com/?lang=en&limit=5&limitstart=25
The same type of thing is going on all over the site, causing 228 duplicate content errors and the already mentioned 336 duplicate page titles. But is crawl diagnostics telling the truth, or is it just some PHP thing? I am not a PHP expert, so any input would be much appreciated. What should I do?
Technical SEO | nahopkin
-
The Bible and Duplicate Content
We have our complete set of scriptures online, including the Bible at http://lds.org/scriptures. Users can browse to any of the volumes of scriptures. We've improved the user experience by allowing users to link to specific verses in context which will scroll to and highlight the linked verse. However, this creates a significant amount of duplicate content. For example, these links: http://lds.org/scriptures/nt/james/1.5 http://lds.org/scriptures/nt/james/1.5-10 http://lds.org/scriptures/nt/james/1 All of those will link to the same chapter in the book of James, yet the first two will highlight the verse 5 and verses 5-10 respectively. This is a good user experience because in other sections of our site and on blogs throughout the world webmasters link to specific verses so the reader can see the verse in context of the rest of the chapter. Another bible site has separate html pages for each verse individually and tends to outrank us because of this (and possibly some other reasons) for long tail chapter/verse queries. However, our tests indicated that the current version is preferred by users. We have a sitemap ready to publish which includes a URL for every chapter/verse. We hope this will improve indexing of some of the more popular verses. However, Googlebot is going to see some duplicate content as it crawls that sitemap! So the question is: is the sitemap a good idea realizing that we can't revert back to including each chapter/verse on its own unique page? We are also going to recommend that we create unique titles for each of the verses and pass a portion of the text from the verse into the meta description. Will this perhaps be enough to satisfy Googlebot that the pages are in fact unique? They certainly are from a user perspective. Thanks all for taking the time!
Technical SEO | LDS-SEO