Best way to noindex long dynamic URLs?
-
I just got a Moz crawl back and see lots of errors for overly dynamic URLs. The site is a villa rental site that lets users search by bedroom, amenities, price, etc., so I'm wondering about the best way to keep these dynamically generated pages, with URLs like /property-search-page/?location=any&status=any&type=any&bedrooms=9&bathrooms=any&min-price=any&max-price=any, from being indexed.
Any assistance will be greatly appreciated : )
-
If you have a page outside the search results that lists all the villas, then you don't lose anything by blocking that folder in robots.txt.
Still, whoever wrote the custom theme knows how to make the changes needed.
If you want, I can help you with it for free. Just PM me (I'll need FTP access).
-
Having some trouble... Because the site is WordPress, which generates these pages dynamically, there is no /property-search-page/ directory, nor is there a property-search-page.php in the theme editor files. So the only option I have is to put Disallow: /property-search-page/ in the robots.txt file, correct?
-
You guys rock! I'll try these out tomorrow. Thanks a million.
-
If you have an /all-villas/ page, then you should go ahead and noindex the search results, as Google's guidelines suggest. You can do it either on /property-search-page/ itself or via the robots.txt file.
In robots.txt, add:
User-agent: *
Disallow: /property-search-page/
The robots.txt method stops everything in that folder from being crawled (including /property-search-page/?whatever). Note, though, that it doesn't strictly guarantee de-indexing: a blocked URL can still appear in the index without a snippet if other pages link to it.
Alternatively, on the /property-search-page/ template you can add the robots noindex meta tag in the <head>, like this:
<meta name="robots" content="noindex">
Then check that the tag actually appears on the search-results pages (view the source of a couple of them).
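Since the site turned out to be WordPress with no physical template file to edit (see above), here's a rough sketch of how that tag could be added from the theme's functions.php; the URL-prefix check is an assumption based on the URLs quoted in this thread:

<?php
// Hypothetical sketch: print a robots noindex meta tag on the dynamically
// generated search-results pages. The '/property-search-page/' prefix is
// taken from the URLs discussed in this thread.
function villa_noindex_search_results() {
    // Matches /property-search-page/ with or without filter parameters.
    if ( strpos( $_SERVER['REQUEST_URI'], '/property-search-page/' ) === 0 ) {
        echo '<meta name="robots" content="noindex,follow">' . "\n";
    }
}
add_action( 'wp_head', 'villa_noindex_search_results' );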
Hope that works!
-
Yes, it will. It does look like custom code, so it depends on how the header is coded, but it should work; test it if you can. This should solve your problem relatively easily. If nothing else works, you can always add a robots.txt Disallow for the /property-search-page/?* pages, but that's not the recommended solution. Try the canonical approach first and see if it works.
-
We already have Yoast installed, but the errors are still showing up in the Moz report.
To clarify, let's assume we have another page that lists all the villas (/all-villas/). If I go to the property-search-page PHP file and add a rel=canonical pointing to /all-villas/, will that canonicalize all the /property-search-page/?whatever pages to the /all-villas/ page?
-
Well, that makes it a little easier on one side and harder on the other.
You can try installing SEO by Yoast, which will add the canonical tags for you; however, I don't think it will point the search-result pages at the canonical page that lists them all.
That might require a little coding.
If there's another page outside the /property-search-page/ folder that lists all the villas, then you can disallow that folder in the robots.txt file, and that should fix it. If there isn't, you'll need to edit the /property-search-page/ template to use a static canonical tag that points to the page listing all the villas, stripping out any kind of filtering.
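For illustration, a minimal sketch of that static canonical in WordPress, assuming /all-villas/ is the page that lists everything:

<?php
// Hypothetical sketch: give every /property-search-page/ URL, whatever its
// filter parameters, a canonical tag pointing at the single all-villas page.
// '/all-villas/' is the example slug used in this thread.
function villa_search_canonical() {
    if ( strpos( $_SERVER['REQUEST_URI'], '/property-search-page/' ) === 0 ) {
        echo '<link rel="canonical" href="' . esc_url( home_url( '/all-villas/' ) ) . '">' . "\n";
    }
}
add_action( 'wp_head', 'villa_search_canonical' );

(If an SEO plugin already prints its own canonical tag, override that one instead of adding a second; Yoast, for example, exposes a wpseo_canonical filter for this.)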
Hope that helps!
-
Thanks for the response. The site is WordPress. Is there an easy way to write some sort of rule that would canonicalize any of these types of pages to a category page? How would you go about doing that?
-
I agree with Federico one hundred percent. Figure out what your primary SEO-friendly URLs are for these kinds of pages and canonicalize them back to that page.
-
I wouldn't put a noindex meta tag on them; instead, I would consider using a canonical tag pointing to the page that lists all the villas.
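For example (example.com and /all-villas/ are placeholders; point it at whatever page lists all the villas):

<link rel="canonical" href="http://www.example.com/all-villas/">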
Anyway, what programming language are you using?
Related Questions
-
Old URLs Appearing in SERPs
Thirteen months ago we removed a large number of non-corporate URLs from our web server. We created 301 redirects, and in some cases we simply removed the content, as there was no place to redirect to. Unfortunately, all these pages still appear in Google's SERPs (not Bing's), for both the 301'd pages and the pages we removed without redirecting. When you click on the redirected pages in the SERPs you do get redirected, so we have ruled out any problems with the 301s. We have already resubmitted our XML sitemap, and when we run a crawl using Screaming Frog we do not see any of these old pages being linked to on our domain. We are considering a few different approaches to get Google to remove these pages from the SERPs and would welcome your input:
1. Remove the 301 redirect entirely so that visits to those pages return a 404 (much easier) or a 410 (would require some setup/configuration via WordPress). This of course means that anyone visiting those URLs won't be forwarded along, but Google may not drop those redirects from the SERPs otherwise.
2. Request that Google temporarily block those pages (done via GWMT), which lasts for 90 days.
3. Update robots.txt to block access to the redirecting directories.
Thank you. Rosemary
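(For reference, a rough sketch of the 410 variant of option 1 in WordPress; '/old-junk/' is a placeholder for the removed directories:)

<?php
// Hypothetical sketch: answer removed URLs with "410 Gone" instead of 404.
function send_410_for_removed_urls() {
    if ( strpos( $_SERVER['REQUEST_URI'], '/old-junk/' ) === 0 ) {
        status_header( 410 ); // set the HTTP status to 410 Gone
        nocache_headers();
        exit;
    }
}
add_action( 'template_redirect', 'send_410_for_removed_urls' );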
Technical SEO | RosemaryB
-
What is the best way to redirect visitors to certain pages of your site based on their location?
One website I manage wants to redirect users to state-specific pages based on their location. What is the best way to accomplish this? For example, a user enters the site through site.com, but they are in Colorado, so we want to direct them to site.com/colorado.
Technical SEO | Firestarter-SEO
-
Best URL format for pagination
We're currently changing the URL format of our website search. We have been discussing it a lot and cannot decide on the best way to pass the pagination parameter for SEO. We have narrowed it down to these options:
www.website.com/apples/p2
www.website.com/apples?page=2
www.website.com/apples/page/2
Which would give us the best ranking returns? What do you think?
Technical SEO | HelpSaude
-
URLs in Greek, Greeklish, or English? What is the best way to get great rankings?
Hello all, I am Greek and I have a quite strange question for you. Greek characters are generally recognized as special characters and need UTF-8 encoding. The question is about the URLs of Greek websites. According to the advice of the Google Webmasters blog, we should never put raw Greek characters into the URL of a link; we should always use the encoded version if we decide to have Greek characters, or just use Latin characters in the URL. Having Greek characters un-encoded could cause technical difficulties with some services, e.g. search engines or other URL-processing web pages. To give you an example, let's look at A) http://el.wikipedia.org/wiki/%CE%95%CE%BB%CE%B2%CE%B5%CF%84%CE%AF%CE%B1, which is the URL with the encoded Greek characters; it shows up in the browser as B) http://el.wikipedia.org/wiki/Ελβετία. The problem with A is that every time we need to copy the URL and paste it somewhere (in an email, a social bookmarking site, a social media site, etc.) the URL appears like A, full of strange characters and % signs. This link may sometimes cause broken-link issues, especially when we try to submit it to social networks and social bookmarking sites. On the other hand, Googlebot reads that URL, but I am wondering whether there is an advantage for the websites that keep the encoded URLs, in comparison to the sites that use Greeklish in the URLs. So the question is: for SEO purposes, is it better to use Greek characters (encoded, like http://el.wikipedia.org/wiki/%CE%95%CE%BB%CE%B2%CE%B5%CF%84%CE%AF%CE%B1) in the URLs, or would it be better to use just Greeklish (for example http://el.wikipedia.org/wiki/Elvetia)? Thank you very much for your help! Regards, Lenia
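(For reference, the percent-encoded form is simply the UTF-8 bytes of the Greek letters, which a quick PHP check confirms:)

<?php
// The encoded form of a Greek URL segment is just its UTF-8 bytes:
echo rawurlencode( 'Ελβετία' );
// prints: %CE%95%CE%BB%CE%B2%CE%B5%CF%84%CE%AF%CE%B1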
Technical SEO | tevag
-
Approved Word Separators in URLs
Hi there, We are in the process of revamping our URL structure, and my devs tell me they have a technical problem using a hyphen as a word separator. There are a whole lot of competing recommendations out there, and at this point I'm just confused. Does anyone have any idea what character would be next-best to the hyphen for separating words in a URL? Any reason to prefer one over another? Some links I've found discussing the topic:
This page says that "Google has confirmed that the point (.), the comma (,) and the hyphen (-) are valid word separators in URL's": http://www.internetofficer.com/seo/google-word-separator/
This page suggests the plus (+) symbol would be best: http://labs.phurix.net/posts/word-separators-in-urls
This guy says he's tested and found a whole bunch of symbols that will work as word separators: http://www.webproguide.com/articles/Symbols-as-word-separators-a-look-inside-the-search-engine-logic/
I'm leaning towards the tilde (~) or the plus (+) sign. Usage would be like so: http://www.domain.com/shop/sterling~silver OR /shop/sterling+silver etc... Thanks in advance for your help!
Technical SEO | Richline_Digital
-
What is the best way to find stranded pages?
I have a client whose site has had a number of people in charge of it, all with very different opinions about what should be on the site. When I look at their website on the server, I see pages with no obvious navigation leading to them. What is the best way to map out a site's internal linking structure and see whether these pages truly are stranded?
Technical SEO | | anjonr0 -
Products with discrete URLs for each color
Here is the issue: I have an ecommerce site whose category pages show each individual color of each product sold, and there is a distinct URL for each color. The product pages share the same content, with the only potentially differentiating factor being customer reviews (not nearly enough of these to differentiate anything). So we have URLs like:
www.domain.com/product-green
www.domain.com/product-yellow
www.domain.com/product-red
and so on. I am looking for a way to consolidate these URLs while still showing all colors on the category page. The first solution I am considering is using the hash fragment, so we would create www.domain.com/product#green, www.domain.com/product#yellow, www.domain.com/product#red; if possible, I would set the canonical tag to www.domain.com/product. The second solution would be to use the canonical tag and keep the URLs as they are. The issue I see here is that we would need to create www.domain.com/product and show that page somewhere, since it would be the URL that the color URLs canonicalize to. What would be the preferred solution? Or is there something else?
Technical SEO | rakesh_patel