Htaccess - multiple matches by error
-
Hi all,
I stumbled upon an issue on my site.
We have a video section:
htaccess rule:
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^video index.php?area=video [L,QSA]

The problem is that these URLs give the same content:
www.holdnyt.dk/anystring/video
www.holdnyt.dk/whatsoever/video

Anyone with a take on what's wrong with the .htaccess line?
-Rasmus
-
Hi Rasmus,
You can try what Emanuele mentioned before. If it doesn't work, please let me know exactly which URL variants you want to rewrite and the output URLs you're looking to get, so I can help you.
Thanks!
-
Hi Rasmus, maybe I'm missing something, but it seems that you've set up a rule only for your www.domain.com/video page. The pattern ^video matches only that one, not domain.com/whateveryouwriteherewillbeignored/video.
Try adding a (.*) before it and associating that capture with a parameter. Right now the only value you're passing is that the area is video; I don't see anything in the .htaccess that uses the leading part of the URL.
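A sketch along those lines (the trailing anchor and the "section" parameter name are assumptions about what Rasmus wants, not something from his config; note that RewriteCond lines only apply to the RewriteRule immediately following them, which is why they are repeated):

```apache
RewriteEngine On

# Only rewrite /video itself; the $ anchor keeps /anything/video
# and /videos from matching.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^video/?$ index.php?area=video [L,QSA]

# If /something/video should also work, capture the first segment
# and pass it along ("section" is a hypothetical parameter name).
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^([^/]+)/video/?$ index.php?area=video&section=$1 [L,QSA]
```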
Related Questions
-
60,000 404 errors
Do 404 errors on a large scale really matter? I'm just aware that I now have over 60,000 and was wondering if the community thinks that I should address them by putting 301 redirects in place. Thanks
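For reference, bulk 301s in .htaccess can look like this (the paths here are hypothetical placeholders, not URLs from the question):

```apache
# One-off redirect for a retired page
Redirect 301 /old-page /new-page

# Pattern-based redirect for a whole retired section
RedirectMatch 301 ^/old-section/(.*)$ /new-section/$1
```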
Technical SEO | the-gate-films
-
Wordpress 404 Errors
Hi Guys, One of my clients is scratching his head after a site migration. He has moved to WordPress and now GWT is reporting weird and wonderful 404 errors. For example: http://www.allsee-tech.com/digital-signage-blog/category/clients.html. There are loads like the above, which seem to be made up out of his blog and navigation paths, yet http://www.allsee-tech.com/clients.html works! Any ideas? Is it a rogue plugin? How do we fix it? Kind Regards, Neil
Technical SEO | nezona
-
Multiple H1s for Testimonials
Hey all, I've got another HTML5 + multiple H1s question, although I feel like my situation is unique from the other discussions/questions on Moz. On our homepage at http://www.strutta.com, near the bottom of the page, we have a section for testimonials. Here, we have the names of three other organizations that have written testimonials for us. Following proper HTML5 guidelines, we have placed their company names in H1s. Could having the names of other companies in H1s potentially dilute the subject/meaning of our page? Cheers
Technical SEO | danny.wood
-
Use of Multiple Tags
Hi, I have been monitoring some of the authority sites and I noticed something with one of them. This high-authority site suddenly started using multiple tags for each post. And I mean loads of tags, not just three or four. I see that each post comes with at least 10-20 tags, and these tags don't always make sense either. Let's say there is a video for "Bourne Legacy": they list tags like bourne, bourne legacy, bourne series, bourne videos, videos, crime movies, movies, crime etc. They don't even seem to care about duplicate content issues. Let's say the movie is named The Dragon: they would include dragon and the-dragon in the tags list, and despite those two category pages (/dragon and /the-dragon) being exactly the same now, they still wouldn't mind listing both tags underneath the article. And no, they don't use a canonical tag (there isn't even a canonical meta on any page of that site). So I am curious: do they just know they have a very high DA, so they don't need to worry about duplicate content issues? Or am I missing something here? Maybe the extra tags are doing more good than harm?
Technical SEO | Gamer07
-
How to fix duplicate page content error?
SEOmoz's Crawl Diagnostics is complaining about a duplicate page error. Examples of links that have the duplicate page content error are http://www.equipnet.com/misc-spare-motors-and-pumps_listid_348855 and http://www.equipnet.com/misc-spare-motors-and-pumps_listid_348852. These are not duplicate pages; some values differ between them, like the listing #, EquipNet tag #, and price. I am not sure how to highlight the things the two pages differ on, like the Equipment Tag # and listing #. Would it resolve if I used some style attribute to highlight such values on the page? Please help me with this, as I am not really sure why the tool thinks both pages have the same content. Thanks!!!
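If two listing pages really were interchangeable, one common approach would be a canonical tag on the secondary page pointing at the preferred one; a sketch (which listing to treat as canonical is your call, and this only fits if the pages are genuinely near-duplicates, which the poster says they are not):

```html
<!-- In the <head> of the secondary listing page -->
<link rel="canonical" href="http://www.equipnet.com/misc-spare-motors-and-pumps_listid_348855" />
```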
Technical SEO | RGEQUIPNET
-
Get rid of a large amount of 404 errors
Hi all, The problem: Google pointed out to me that I have a large increase of 404 errors. In short, I had software before that created pages (automated) for long-tail search terms and fed them to Google. Recently I quit this service and all those pages (about 500,000) were deleted. Now Google GWM points out about 800,000 404 errors. What I noticed: I had a large amount of 404s before when I changed my website. I fixed it (proper 302) and as soon as all the 404s in GWM were gone I had around 200 more visitors a day. It seems that a clean site is better positioned. Anybody have a suggestion on how to tell Google that all URLs starting with www.domain/webdir/ should be deleted from cache?
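One way to tell crawlers those URLs are gone for good is to answer 410 rather than 404; a minimal .htaccess sketch, assuming the generated pages all lived under /webdir/ (the path is taken from the question, everything else is an assumption):

```apache
# Answer "410 Gone" for everything under /webdir/ so crawlers
# drop the URLs instead of retrying them as soft errors
RedirectMatch 410 ^/webdir/.*$
```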
Technical SEO | hometextileshop
-
Robots.txt pattern matching
Hola fellow SEO peoples! Site: http://www.sierratradingpost.com robots.txt: http://www.sierratradingpost.com/robots.txt Please see the following line: Disallow: /keycodebypid~* We are trying to block URLs like this: http://www.sierratradingpost.com/keycodebypid~8855/for-the-home~d~3/kitchen~d~24/ but we still find them in the Google index. 1. We are not sure if we need to tell the robot to use pattern matching. 2. We are not sure if the format is correct. Should we use Disallow: /keycodebypid*/ or /*keycodebypid/ or even /*keycodebypid~/? What is even more confusing is that the meta robots line says "noindex", yet they still show up: <meta name="robots" content="noindex, follow, noarchive" /> Thank you!
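For what it's worth, a trailing match is implicit in robots.txt, so a rule like this should cover those URLs for Googlebot (which does support * wildcards):

```
User-agent: *
# Matches /keycodebypid~8855/for-the-home~d~3/... as a prefix;
# the trailing * in the original rule was unnecessary
Disallow: /keycodebypid~
```

Also note that robots.txt blocks crawling, not indexing: if the crawler can't fetch a page, it never sees the noindex meta tag, which is one common reason blocked URLs linger in the index.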
Technical SEO | STPseo
-
Crawl Errors and Duplicate Content
SEOmoz's crawl tool is telling me that I have duplicate content at www.mydomain.com/pricing and at www.mydomain.com/pricing.aspx. Do you think this is just a glitch in the crawl tool (because obviously these two URLs are the same page rather than two separate ones), or do you think this is actually an error I need to worry about? If so, how do I fix it?
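If the .aspx variant should permanently point at the extensionless URL, a sketch of a 301 looks like this (assuming an Apache host with mod_rewrite and that /pricing is the version you want indexed; on an IIS host the equivalent would live in web.config instead):

```apache
RewriteEngine On
# Send the .aspx duplicate to the clean URL with a permanent redirect
RewriteRule ^pricing\.aspx$ /pricing [R=301,L]
```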
Technical SEO | MyNet