Multiple values for a single URL parameter - Is that possible?
-
I have a website which supports three languages - German, Spanish and Portuguese - via a language URL parameter.
How do I configure my URL parameters so that only the Spanish and Portuguese URLs are crawled, but not the German URLs? Basically, how do I specify two values for a single parameter?
-
I know you posted this a while ago, but I've just seen it.
The simplest way would be to generate your robots meta tag based on the parameter. I code in PHP, so this is my example, but the approach will depend on your platform.
Say your URL is domain.com?lang=es, ?lang=po or ?lang=de:
$lang = $_GET['lang'];
if ($lang == "de") { echo '<meta name="robots" content="noindex,follow">'; }
else { echo '<meta name="robots" content="index,follow">'; }
I left 'follow' in the German line because you may already have pages in the index; this means Google DE has free rein over all of your site, but any pages it finds should not be indexed.
If the site's not live yet, then you can change the 'follow' in the DE line to 'nofollow'.
Hope this helps.
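A slightly fuller sketch along the same lines (my own illustration, not part of the original answer): it assumes the same lang parameter and treats anything other than Spanish or Portuguese - including a missing or unexpected value - the same way as German, since the question only wants the es and po URLs indexed.
$lang = isset($_GET['lang']) ? $_GET['lang'] : '';
$indexable = array('es', 'po'); // only the Spanish and Portuguese versions should be indexed
if (in_array($lang, $indexable, true)) {
    echo '<meta name="robots" content="index,follow">';
} else {
    // German, missing or unknown values: crawlable but kept out of the index
    echo '<meta name="robots" content="noindex,follow">';
}
Printed in the head of each page, this keeps the logic in one place if more languages are added later.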
Related Questions
-
Is it possible to deindex old URLs that contain duplicate content?
Technical SEO | ClickHub-Harry
Our client is a recruitment agency and their website used to contain a substantial amount of duplicate content, as many of the listed job descriptions were repeated and recycled. As a result, their rankings rarely progress beyond page 2 on Google. Although they have started using more unique content for each listing, it appears that old job listing pages are still indexed, so our assumption is that Google is holding down the ranking due to the amount of duplicate content present (one tool returned a score of 43% duplicate content across the website). Looking at other recruitment websites, it appears that they block the actual job listings via the robots.txt file. Would blocking the job listing pages from being indexed, either by robots.txt or by a noindex tag, reduce the negative impact of the duplicate content, but also remove any link juice coming to those pages? In addition, expired job listing URLs stay live, which is likely to be increasing the overall duplicate content. Would it be worth removing these pages and setting up 404s, given that any links to these pages would be lost? If these pages are removed, is it possible to permanently deindex these URLs? Any help is greatly appreciated!
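For reference, the two blocking methods mentioned in the question behave differently: robots.txt stops crawling (so already-indexed URLs can linger in the index and the blocked pages cannot pass equity onward), while a noindex,follow meta tag lets the page be crawled, drops it from the index and still lets its links be followed. A rough illustration, with /jobs/ as a hypothetical listing path:
# robots.txt - stops crawling of the listing pages, but URLs already indexed can remain
User-agent: *
Disallow: /jobs/
On each duplicate or expired listing page instead:
<meta name="robots" content="noindex,follow">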
-
Shortening URLs
Technical SEO | ATP
Hello again Mozzers, I am debating what could be a fairly drastic change to the company website and I would appreciate your thoughts. The URL structure is currently as follows:
Product pages: www.url.co.uk/product.html
Category pages: www.url.co.uk/products/category/subcategory.html
I am debating removing the /products/ section, as I feel it doesn't really add much and lengthens the URL with a pointless word. This does mean, however, redirecting about 50-60 pages on the website. Is this worth it? Would it do more damage than good? Or am I just being a bit OCD and it won't really have an impact? As always, thanks for the input.
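If you do go ahead, a minimal .htaccess sketch (assuming Apache, and that every /products/ URL should map to the same path with that segment removed - adjust to your setup):
RewriteEngine On
# 301 www.url.co.uk/products/category/subcategory.html -> www.url.co.uk/category/subcategory.html
RewriteRule ^products/(.*)$ /$1 [R=301,L]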
-
SEO friendly URL strategy...
Technical SEO | Immanuel
Hi guys, I just wanted your expert opinion on keywords in URLs. The example I'm giving you is in regards to an ecommerce website:
Option 1: www.example.com/shop/coffee/coffee-beans/brand-coffee-beans-500gr
Option 2: www.example.com/shop/coffee/beans/brand-coffee-beans-500gr
We sell coffee, so I'll keep the example relevant 🙂 Does it make a difference how the keywords are stacked throughout? Would the search engine combine the two keywords, e.g. .../coffee/beans/..., or would I be better off having .../coffee/coffee-beans/...? And is there a penalty for having the same phrase more than once in the URL? I hope my question makes sense... 😉 Looking forward to your opinions and ideas!
-
WWW or non-www base URL
Technical SEO | waqid
Here is the situation. A developer custom coded a Magento commerce shop for an SEO client and is having problems adding www to the URL without breaking the site. They won't be able to get this completed until a couple of months down the road. We are starting monthly SEO this June. Most directories and websites link to the www version of a site, not the non-www. What should I expect, since we are ranking for the non-www and linking to the www version? In Webmaster Tools I'm telling Google to display the URL as www.
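For reference, the usual Apache form of the non-www to www redirect, once the developer is able to add it (a generic mod_rewrite sketch, not Magento-specific):
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]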
-
URL rewriting causing problems
Technical SEO | ocelot
Hi, I am having problems with my URL rewriting to create SEO friendly / user friendly URLs. I hope you follow me as I try to explain what is happening... Since the creation of my rewrite rule I am getting lots of errors in my SEOmoz report and Google WMT reports due to duplicate content, titles, descriptions etc. For example, for a product detail page, instead of a URL parameter it creates a user friendly URL of mydomain.com/games-playstation-vita-psp/B0054QAS. However, in the Google index there is also the following friendly URL, which is the same page and which I would like to remove: domain.com/games-playstation-vita/B0054QAS. The key to the rewrite on the above URLs is the /B0054QAS appended at the end - this tells the script which product to load; the details preceding it could in effect be rubbish, i.e. domain.com/a-load-of-rubbish/B0054QAS, and it would still bring back the same page as above. What is the best way of resolving the duplicate URLs that are currently in the Google index and causing problems? The same issue is causing quite a serious 5XX error on one of the generated URLs, http://www.mydomain.com/retailersname/1 - if I click on the link, the link does work - it takes you to the retailer's site, but again it is the number appended at the end that is the key - the retailersname is just there for user friendly search reasons. How can I block this or remove it from the results? Hope you are still with me and can shed some light on these issues please. Many thanks.
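One common way to handle the "any slug works" problem is to 301 every non-canonical slug to the preferred one once the product has been loaded from the trailing code. A rough PHP sketch - $canonicalSlug and $productCode are hypothetical names standing in for however the script already stores those values:
// assume the product has been loaded from the trailing code, e.g. B0054QAS
$canonicalPath = '/' . $canonicalSlug . '/' . $productCode;
$requestPath = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);
if ($requestPath !== $canonicalPath) {
    header('Location: ' . $canonicalPath, true, 301); // redirect rubbish slugs to the real one
    exit;
}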
-
URL rewriting from subcategory to category
Technical SEO | jkundrotas
Hello everybody! I have quite a simple question about URL rewriting from subcategory to category, yet I can't find any solution to this problem (due to lack of deeper Apache programming knowledge). Here is my problem/question: we have two website URL structures that cause duplicate problems:
www.website.lt/language/category/
www.website.lt/language/category/1/
Pages 1 and 2 are absolutely the same (both also return 200 OK). What we need is a 301 redirect from 2 to 1 without any other deeper categories redirecting (like www.website.com/language/category/1/169/ redirecting to .../category/1/ or .../category/). Here are the .htaccess URL rewrite rules:
RewriteRule ^([^/]{1,3})/([^/]+)/([^/]+)/([^/]+)/([^/]+)/([^/]+)/$ /index.php?lang=$1&idr=$2&par1=$3&par2=$4&par3=$5&par4=$6&%{QUERY_STRING} [L]
RewriteRule ^([^/]{1,3})/([^/]+)/([^/]+)/([^/]+)/([^/]+)/$ /index.php?lang=$1&idr=$2&par1=$3&par2=$4&par3=$5&%{QUERY_STRING} [L]
RewriteRule ^([^/]{1,3})/([^/]+)/([^/]+)/([^/]+)/$ /index.php?lang=$1&idr=$2&par1=$3&par2=$4&%{QUERY_STRING} [L]
RewriteRule ^([^/]{1,3})/([^/]+)/([^/]+)/$ /index.php?lang=$1&idr=$2&par1=$3&%{QUERY_STRING} [L]
RewriteRule ^([^/]{1,3})/([^/]+)/$ /index.php?lang=$1&idr=$2&%{QUERY_STRING} [L]
RewriteRule ^([^/]{1,3})/$ /index.php?lang=$1&%{QUERY_STRING} [L]
There are other redirects that handle non-www to www and related issues:
RedirectMatch 301 ^/lt/$ http://www.domain.lt/
RewriteCond %{HTTP_HOST} ^domain.lt
RewriteRule (.*) http://www.domain.lt/$1 [R=301,L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} !(.*)/$
RewriteRule ^(.*)$ http://www.domain.lt/$1/ [R=301,L]
At this moment we cannot solve this problem with rel canonical (due to our CMS limits). Thanks for your help, guys! If you need any other details on our code, just let me know.
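A minimal sketch of the missing redirect, assuming the duplicate is always the literal segment 1 and that this rule sits above the internal rewrite rules shown in the question:
# 301 /language/category/1/ to /language/category/ without touching deeper paths
RewriteRule ^([^/]{1,3})/([^/]+)/1/$ /$1/$2/ [R=301,L]
Because of the trailing $, deeper URLs such as /language/category/1/169/ are left alone.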
-
URL paths and keywords
Technical SEO | mikescotty
I'm recommending some on-page optimization for a home builder building in several new home communities. The site has been through some changes in the past few months and we're almost starting over. The current URL structure is http://homebuilder.com/oakwood/features, where:
homebuilder = builder name
oakwood = Oakwood Estates, the name of the community
features = one of several sub-paths, including site plan, elevations, floor plans, etc.
The most attainable keyword phrases include the word 'home' and the town name. I want to change the URL path to: http://homebuilder.com/oakwood-estates-townname-homes/features
Is there any problem with doing this? It just seems to make a lot of sense. Any input would be appreciated.
-
Slashes in URLs
Technical SEO | gregster1000
If your CMS has created two URLs for the same piece of content that look like the following, www.domainname.com/stores and www.domainname.com/stores/, will this be seen as duplicate content by Google? Your tools seem to pick it up as errors. Does one of the URLs need to 301 to the other to clear this up, or is it not a major problem? Thanks.
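A common fix, assuming Apache and that the version without the trailing slash is the one you want to keep, is a single 301 rule; a rel=canonical pointing at one version also works if the CMS makes redirects awkward:
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-d
# 301 /stores/ to /stores (real directories are left alone)
RewriteRule ^(.+)/$ /$1 [R=301,L]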