.htaccess - multiple matches by mistake
-
Hi all,
I stumbled upon an issue on my site.
We have a video section:
htaccess rule:
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^video index.php?area=video [L,QSA]

The problem is that these URLs give the same content:
www.holdnyt.dk/anystring/video
www.holdnyt.dk/whatsoever/video

Anyone have a take on what's wrong with the .htaccess line?
-Rasmus
-
Hi Rasmus,
You can try what Emanuele mentioned before. If it doesn't work, please let me know exactly which URL variants you want to rewrite and the output URLs you're looking to get, so I can help you.
Thanks!
-
Hi Rasmus, maybe I'm missing something, but it seems that you've set up a rule only for your www.domain.com/video page. The pattern ^video matches only that one, and not domain.com/whateveryouwriteherewillbeignored/video.
Try adding a (.*) before it and associating that with a parameter: since the only value you're passing is that the area is "video", I don't see anything specified in the .htaccess for the rest of the URL.
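A minimal sketch of what that suggestion could look like, assuming the leading path segment really should be captured and passed along (the parameter name "section" is hypothetical, not something from the original rule):

```apache
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
# Anchor the pattern at both ends so only ".../video" matches, and
# capture any leading path segment into its own query parameter.
RewriteRule ^(?:(.*)/)?video/?$ index.php?area=video&section=$1 [L,QSA]
```

With the $ anchor in place, a request like /whatsoever/video/extra no longer matches, and index.php can decide what to do with the captured prefix instead of silently ignoring it.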
Related Questions
-
How do I prevent duplicate page title errors from being generated by my multiple shop pages?
Our e-commerce shop has numerous pages within the main shop page. Users navigate through the shop via typical pagination, so while there may be 6 pages of products, it's all still under the main shop page. Moz keeps flagging my shop pages as having duplicate titles (i.e. "shop page 2"), but they're all the same page. Users aren't loading unique pages each time they go to the next page of products, and they aren't pages I can edit. I'm not sure how to prevent this issue from popping up on my reports.
Technical SEO | | NiteSkirm0 -
Error got removed by request
Hey there, I have a website which is doing well on Google, but the last 2-3 times I've checked Google Search Console I've been getting a "removed by request" error. Here is a screenshot: http://prntscr.com/iva5y0. My questions: 1. I have not requested that Google remove my homepage URL, so why am I getting this warning again and again? 2. Will this harm my site's current traffic and ranking status? 3. What steps do I need to take to stop this warning from coming back? Thanks in advance, and please help out with this query.
Technical SEO | | innovative.rohit0 -
Redirect Error
Hello, I was sent a report from a colleague containing redirect errors: The link to "http://www.xxxx.com/old-page/" has resulted in HTTP redirection to "http://www.xxxx.com/new-page". Search engines can only pass page rankings and other relevant data through a single redirection hop; using unnecessary redirects can have a negative impact on page ranking. Our site is hosted on Microsoft servers (IIS). I'm not sure what is causing these errors. Could it be the way the redirect was implemented?
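Since the site runs on IIS the real fix belongs in web.config, but the principle is the same on any server: each old URL should redirect straight to its final destination in a single hop. A minimal .htaccess-style sketch of the idea (the paths here are placeholders from the report, not real URLs):

```apache
# One direct hop: point the old URL at the final destination exactly
# as it is served (scheme, host, and trailing slash included), so the
# target does not itself trigger a second redirect.
Redirect 301 /old-page/ http://www.xxxx.com/new-page
```

Redirect chains usually appear when the target of a redirect is itself redirected (for example to add or remove a trailing slash), so checking what "/new-page" does when fetched directly is a good first step.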
Technical SEO | | 3mobileIreland0 -
Multiple sub domain appearing
Hi everyone, hope we're all well! Have a strange one: a new client's website, http://www.allsee-tech.com. Just found out it is appearing on every subdomain possible: a.allsee-tech.com, b.allsee-tech.com. I have requested the .htaccess, as this is where I think the issue lies, but he advises there isn't anything out of place there. Any ideas in case it isn't? Regards, Neil
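If the .htaccess really is clean, the usual cause is a wildcard DNS record (*.allsee-tech.com) combined with a catch-all virtual host that answers for any Host header. A minimal sketch of a canonicalizing rule, assuming Apache and that www is the preferred host:

```apache
# Send any request whose Host header is not the canonical www host
# to the same path on www.allsee-tech.com with a permanent redirect.
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.allsee-tech\.com$ [NC]
RewriteRule ^(.*)$ http://www.allsee-tech.com/$1 [R=301,L]
```

Removing the wildcard DNS record (or the catch-all server alias) would also stop the random subdomains from resolving at all.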
Technical SEO | | nezona0 -
Wordpress 4xx errors from comment re-direct
About a month ago, we had a massive jump in 4XX errors. It seems the majority are being caused by the comment tool on WordPress, which is generating a link that looks like this: "http://www.turnerpr.com/blog/wp-login.php?redirect_to=http%3A%2F%2Fwww.turnerpr.com%2Fblog%2F2013%2F09%2Fturners-crew-royal-treatment-well-sort-of%2Fphoto-2-2%2F" on every single post. We're using Akismet and haven't had issues in the past, and I can't figure out the fix. I've tried turning it off and back on; I'm reluctant to completely switch commenting systems because we'd lose so much history. Anyone seen this particular redirect behavior before? Angela
Technical SEO | | TurnerPR0 -
Are W3C Validators too strict? Do errors create SEO problems?
I ran an HTML markup validation tool (http://validator.w3.org) on a website. There were 140+ errors and 40+ warnings. It says "W3C Validators are overly strict and would deny many modern constructs that browsers and search engines understand." What a browser can understand and display to visitors is one thing, but what search engines can read has everything to do with the code. I ask this: if the search engine crawler is reading through the code and comes upon an error like this:

…ext/javascript" src="javaScript/mainNavMenuTime-ios.js"> </script>');}

The element named above was found in a context where it is not allowed. This could mean that you have incorrectly nested elements -- such as a "style" element in the "body" section instead of inside "head" -- or two elements that overlap (which is not allowed). One common cause for this error is the use of XHTML syntax in HTML documents. Due to HTML's rules of implicitly closed elements, this error can create cascading effects. For instance, using XHTML's "self-closing" tags for "meta" and "link" in the "head" section of an HTML document may cause the parser to infer the end of the "head" section and the beginning of the "body" section (where "link" and "meta" are not allowed; hence the reported error).

And this one, which the validator flags with the same explanation:

…t("?");document.write('>');}

Does this mean that the crawlers don't know where the code ends and the body text begins, and what they should and shouldn't be focusing on?
Technical SEO | | INCart0 -
Remove a directory using htaccess
Hi, Can someone tell me if there's a way using .htaccess to say that everything in a particular directory, let's call it "A", is gone (HTTP 410 code), i.e. all the links should be de-indexed? Right now, I'm using the robots file to deny access. I'm not sure if it's the right thing to do, since Google Webmaster Tools is still showing the links as indexed, with a 403 error code. Thanks.
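A minimal sketch of one way to do this with mod_rewrite, assuming the directory sits at the site root (a RedirectMatch 410 rule from mod_alias would work as well):

```apache
# Return 410 Gone for the directory "A" and everything under it,
# so crawlers learn the URLs are permanently gone.
RewriteEngine On
RewriteRule ^A(/.*)?$ - [G]
```

Note that blocking the directory in robots.txt works against this: if crawlers are denied access, they never get to see the 410 response, which is likely why the URLs are still showing as indexed.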
Technical SEO | | webtarget0 -
Redirect Multiple Domains
This is a follow-up question from one posted earlier this month. I can't link to that because it's a private question, so I'm trying to summarize it below. We have a number of domains, about 20 (e.g. www.propertysharp.com), that point to our main domain's IP address (www.propertyshark.com) and share the same content. This is no black-hat strategy whatsoever; the domains were acquired several years ago in order to help people who mistyped the website's URL reach their desired destination. The question was whether to redirect them to our main domain or not. Pros were the reportedly millions of incoming links from these domains; cons were the fact that lots of duplicate content issues could arise, and we actually saw some pages from these domains ranking in the search engines. We were recommended to redirect them, but to take it gradually. I have a simple question: what does gradually mean, one domain per week, or per month?
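As an illustration of the per-domain approach (a sketch assuming Apache; the domain names are taken from the question, and each typo domain gets its own block so the rollout can be staggered, e.g. one domain every few weeks while watching traffic and rankings):

```apache
RewriteEngine On
# Redirect one typo domain at a time; duplicate this block for each
# additional domain as the gradual migration proceeds.
RewriteCond %{HTTP_HOST} ^(www\.)?propertysharp\.com$ [NC]
RewriteRule ^(.*)$ http://www.propertyshark.com/$1 [R=301,L]
```

Keeping one block per domain makes it easy to enable the redirects incrementally rather than flipping all 20 domains at once.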
Technical SEO | | propertyshark0