Dynamic parameters
-
Our site has numerous filters, and each filtered results page carries a rel=canonical tag. So I'm not sure whether we should be concerned about the crawl stats reporting that we have a bunch of pages with more than two parameters. If so, do you have any suggestions? This URL is an example:
Thanks!
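(For context, the setup described above would look something like the snippet below on a filtered results page; the paths are invented purely for illustration.)

    <!-- Hypothetical filtered results page, e.g. /shoes/?colour=red&size=9&page=2 -->
    <head>
        <link rel="canonical" href="https://www.example.com/shoes/" />
    </head>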
-
Sorry, I didn't read that properly. Yes, you should still block those pages regardless. Internal search result pages will cause you issues in one way or another, whether it's circular navigation or competing for keywords, so I would block them anyway.
-
I thought the rel canonical tag took care of the duplicate content issue.
-
That's not the actual product page itself; it's a search result page. You should block those pages anyway: not only is there the parameter problem, but they also duplicate the content of the product pages.
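As a rough sketch of what blocking could look like in practice (the /search/ path is an assumption, since the real URL structure isn't given here), you can either disallow the internal search results in robots.txt or mark them noindex:

    # robots.txt - assuming the internal search results live under /search/
    User-agent: *
    Disallow: /search/

    <!-- or, in the <head> of each search result page -->
    <meta name="robots" content="noindex, follow">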
Related Questions
-
Which is better? One dynamically optimised page, or lots of optimised pages?
For the purpose of simplicity, we have 5 main categories on the site - let's call them A, B, C, D, E. Each of these categories has sub-category pages, e.g. A1, A2, A3. The main area of the site consists of these category and sub-category pages. But as each product comes in different woods, it's useful for customers to see all the products that come in a particular wood, e.g. walnut. So many years ago we created 'woods' pages. These pages replicate the categories and sub-categories but only show what is available in that particular wood - and of course, they're optimised much better for that wood. All well and good, until recently: these specialist pages seem to have dropped through the floor in Google. It could be temporary, I don't know, and it's only been a fortnight - but I'm worried.

Now, because the site is dynamic, we could do things differently. We could still have landing pages for each wood, but instead of spinning off to their own optimised, wood-specific sub-category pages, they could link to the primary sub-category page with a ?search filter in the URL. This way, the customer still gets to see what they want.

Which is better: one page per sub-category, dynamically filtered by search, or lots of specific sub-category pages? I guess at the heart of this question is: does having lots of specific sub-category pages lead to a large overlap of duplicate content, and is it better to keep that authority juice on a single page, even if the URL changes (with a query in the URL) to enable whatever filtering we need to do?
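To picture the two options being compared, a minimal sketch (the URLs and the wood parameter are invented for illustration):

    <!-- Option A (hypothetical): one primary sub-category page, filtered by query string -->
    <!-- https://www.example.com/chairs/?wood=walnut -->
    <link rel="canonical" href="https://www.example.com/chairs/" />

    <!-- Option B (hypothetical): a dedicated, fully optimised wood sub-category page -->
    <!-- https://www.example.com/walnut/chairs/ -->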
On-Page Optimization | pulcinella2uk
-
Google is indexing URLs with parameters despite canonical
Hello Moz, Google is indexing lots of URLs on my site despite the canonical. Those URLs are linked all over the site with parameters like ?, and it looks like Google is indexing them despite the canonical. Is Google deciding to index those URLs because they are linked all over the site? The canonical tag is well implemented.
On-Page Optimization | Red_educativa
-
Long Meta Titles on Dynamic Pages
What should we do with long meta titles on press release pages? Unlike other pages on the site, press release pages are not static, physical pages; they are dynamically created by pulling data from the database. I notice such pages automatically use the URL/H1 as the meta title and meta description. How can we shorten these meta titles and descriptions? Do such errors (related to dynamically created pages) matter? Tanveer
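A minimal sketch of one way to shorten such titles at the template level, assuming PHP and an invented field name (nothing below comes from the original site):

    <?php
    // Hypothetical: trim a database headline to a title-friendly length (~60 chars).
    $headline  = $pressRelease['headline'];   // field name invented for illustration
    $metaTitle = mb_strlen($headline) > 60
        ? rtrim(mb_substr($headline, 0, 57)) . '...'
        : $headline;
    ?>
    <title><?php echo htmlspecialchars($metaTitle); ?></title>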
On-Page Optimization | Sequelmed
-
Do parameters in a URL make a difference from an SEO point of view?
We use a number of different parameters in our URLs to track how the user has navigated to the page. For example, www.example.com/product/?banner shows that the user reached the page from the banner, as opposed to www.example.com/product/?footer, which shows that the user reached it from the footer. Do search engines treat these as the same page or as different pages? Thanks
On-Page Optimization | cbarron
-
Static content vs. dynamically changing content
We have collected a lot of reviews and want to use them on our category pages. We are going to update the top 6 reviews per category every 4 days, and there will be another page for seeing all of the reviews. Is there any advantage to keeping the reviews static for 1 or 2 weeks versus pulling unique new ones from the database every time the page is refreshed? We know there is a long-tail advantage to keeping them on the page permanently; however, we have created a new page with all of the reviews that visitors can go to.
On-Page Optimization | DoRM
-
Does Google index dynamically generated content/headers, etc.?
To avoid duplicate content, we are moving away from a model where we have 30,000 pages, each with a separate URL that looks like /prices/<product-name>/<city><state>, often with duplicate content because the product overlaps from city to city, and it's hard to keep 30,000 pages unique when sometimes the only distinction is the price and the city/state. We are moving to a model with around 300 unique pages, where some of the info that used to be in the URL will move onto the page itself (headers, etc.) to cut down on duplicate content across those 300 pages. My question is this: if we have 300 unique-content pages with unique URLs, and we then put some dynamic info (year, city, state) into the page itself, will Google index this dynamic content? The question behind this one is: how do we continue to rank for searches for that product in the city/state being searched without having that info in the URL? Any best practices we should know about?
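For what it's worth, a minimal sketch of the 'info on the page itself' approach, assuming a PHP template with invented names: values rendered server-side like this reach the crawler as ordinary HTML in the page source.

    <?php
    // Hypothetical: one static URL (e.g. /prices/widget/) whose headings are filled server-side.
    $city  = 'Austin';     // would come from the database record for this page
    $state = 'TX';
    $year  = date('Y');
    ?>
    <h1>Widget prices in <?php echo htmlspecialchars("$city, $state"); ?> (<?php echo $year; ?>)</h1>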
On-Page Optimization | editabletext
-
Would removing highly dynamic pages through nofollow help or hurt?
We have a sub-domain that is hosted by a third party. Its pages are highly dynamic (they change daily or more often) because they are product search results. Unfortunately they are raising several errors and warnings, including duplicate page content, missing or empty titles, long URLs, and overly dynamic URLs. Would putting nofollow on the links to this sub-domain help, hurt, or not affect page rank? As an example, links in the middle of this page (prices), http://targetvacations.ca, go to a page such as http://travel.targetvacations.ca/cgi-bin/resultadv.cgi?id=16294922&code_ag=tgv&alias=tgv, which is then redirected to a dynamic URL and presents the results.
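For reference, 'putting nofollow on the links' would just mean adding the rel attribute to those price links, roughly as in the sketch below (the anchor text is invented):

    <a href="http://travel.targetvacations.ca/cgi-bin/resultadv.cgi?id=16294922&amp;code_ag=tgv&amp;alias=tgv"
       rel="nofollow">See prices</a>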
On-Page Optimization | TSDigital
-
Prevent indexing of dynamic content
Hi folks! I discovered a bit of an issue with a client's site. The site consists primarily of static HTML pages; however, within one page (a car photo gallery), a line of PHP code dynamically generates 100 or so pages comprising the photo gallery, all with the same page title and meta description. The photo gallery script resides in the /gallery folder, which I attempted to block via robots.txt, to no avail. My next step will be to include a robots noindex meta tag within the head section of the HTML page, but I am wondering whether this will stop the bots dead in their tracks, or whether they will still be able to pick up on the pages generated by the call to the PHP script residing a bit further down on the page? Dino
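A minimal sketch of the two mechanisms described above; note that, in general, a URL disallowed in robots.txt is not fetched at all, so crawlers never see any meta tag on it:

    # robots.txt - block crawling of the gallery script's output
    User-agent: *
    Disallow: /gallery/

    <!-- or, emitted in the <head> of each generated gallery page -->
    <meta name="robots" content="noindex">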
On-Page Optimization | SCW