Hey guys, I have this issue on my crawling report. What should I do to exclude the pages?
-
Overly-Dynamic URL
Although search engines can crawl dynamic URLs, search engine representatives have warned against using over 2 parameters in a given URL. Search engines may also see dynamic versions of the same URL as unique URLs, creating duplicate content.
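As a rough illustration of what the report is flagging, a short script can count query parameters and flag URLs that exceed the two-parameter threshold Moz describes (the example URLs are hypothetical):

```python
from urllib.parse import urlparse, parse_qs

def is_overly_dynamic(url, max_params=2):
    """Return True when a URL carries more query parameters than max_params."""
    return len(parse_qs(urlparse(url).query)) > max_params

urls = [
    "https://example.com/shoes?color=red",
    "https://example.com/shoes?color=red&size=9&sort=price&page=3",
]
# Only the second URL trips the threshold: it carries four parameters.
flagged = [u for u in urls if is_overly_dynamic(u)]
```

Running the same check over an exported crawl report would give you the exact list of pages behind the warning.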
-
Understood, thanks.
-
So you use dynamic URLs with specific parameters, and this warning is generated because you use more than two of them. According to the Moz data it is a warning (not an error), and the fix is to reduce the number of parameters. You want to know how to exclude this page from your crawl report? Fix the issue instead.

More important: is this page ranked in Google? Why use parameters at all? Do you have the ability to change the URLs to static versions? If so, use 301s to redirect the old URLs. If you have a duplicate-content issue caused by the parameters, perhaps rel="canonical" can solve it for you. It all depends on your exact problem with the parameters and to what end you want it solved.

If the pages are not ranked in Google and you don't want them to rank in Google, Bing or any other search engine, use noindex, nofollow or disallow them in robots.txt. If you do want them to rank, it would probably be wiser to switch to static URLs with a nice keyword in them.
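To make those options concrete, here are a few illustrative snippets; the domains, paths and patterns are hypothetical, so adapt them to your own URLs before using any of them:

```text
# robots.txt - keep parameterised URLs out of the crawl.
# NOTE: "/*?" blocks EVERY URL with a query string, which is usually
# too broad; narrow the pattern to the parameters you actually mean.
User-agent: *
Disallow: /*?

# HTML <head> - point duplicate parameterised versions at one canonical URL
<link rel="canonical" href="https://www.example.com/category/page/" />

# HTML <head> - keep a page out of the index but let bots follow its links
<meta name="robots" content="noindex, follow" />

# .htaccess (Apache, mod_alias) - 301 a dynamic URL to a static replacement
RedirectMatch 301 ^/page\.php$ https://www.example.com/static-page/
```

Pick one mechanism per page: a page that is redirected does not also need a canonical, and a page blocked in robots.txt cannot have its noindex tag seen by crawlers.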
Hope this helps some. If not, please explain your situation as clearly as possible so we can assist you with this.
regards
Jarno
-
On my crawling report in my Moz campaign I have this issue: Overly-Dynamic URL. I have a page with a dynamic URL that generates different versions of itself; it is a lead form from QuinStreet. Should I use noindex for this page?
-
Adulter,
can you form this into one solid question? I don't see what you are asking here...
regards
Jarno
Related Questions
-
Removing a massive number of "noindex, follow" pages that are not crawled
Hi, We have stackable filters on some of our pages (i.e. ?filter1=a&filter2=b etc.). Those stacked-filter pages are "noindex, follow". They were created in order to facilitate the indexation of the items listed on them. After analysing the logs we know that the search engines do not crawl those stacked-filter pages. Would blocking those pages (by loading their links in AJAX, for example) help our crawl rate or not? In other words, does removing links that are already not crawled help the crawl rate of the rest of our pages? My assumption here is that search engines see those links but discard them because those pages are too deep in our architecture, and that by removing them we would help search engines focus on the rest of our pages. We don't want to waste our efforts removing those links if there will be no impact. Thanks
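One way to test that assumption before investing in removal is to measure crawler hits on the stacked-filter URLs directly from the access logs. A rough sketch, assuming combined log format and the filter1/filter2 parameter naming from the question:

```python
import re
from urllib.parse import urlparse, parse_qs

BOT_PATTERN = re.compile(r"Googlebot|bingbot", re.IGNORECASE)

def is_stacked_filter(path):
    """A stacked-filter URL carries two or more filterN parameters (assumed naming)."""
    params = parse_qs(urlparse(path).query)
    return sum(1 for k in params if k.startswith("filter")) >= 2

def bot_hits_on_stacked_filters(log_lines):
    """Count crawler requests that landed on stacked-filter URLs.

    Assumes combined log format, where the request path is the 7th
    whitespace-separated field ('"GET /path HTTP/1.1"').
    """
    hits = 0
    for line in log_lines:
        if not BOT_PATTERN.search(line):
            continue
        fields = line.split()
        if len(fields) > 6 and is_stacked_filter(fields[6]):
            hits += 1
    return hits
```

If the count stays at or near zero over a representative window, the logs support the claim that blocking those links would change little.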
Intermediate & Advanced SEO | | Digitics0 -
How can we optimize interlinking by increasing the internal link count of chosen landing pages and decreasing it for less important pages within the site?
We have pulled our internal link counts (internal links only, not outbound links) from Google Webmaster Tools and discovered that our most significant pages have fewer internal links than our less significant pages. Our objective is to reverse this: increase the internal link count of important pages and reduce it for less important ones, so that maximum link juice is transferred to the right pages, thereby increasing SEO traffic.
Intermediate & Advanced SEO | | vivekrathore0 -
Sitemaps and dynamic pages
Hi all, I have a gigantic website and they are adding another subdirectory to it. My question is about HTML sitemaps for better optimisation:
1. Should a keyword-focused front-end (HTML) sitemap be made for all the dynamic URLs, or
2. Should a category-focused front-end (HTML) sitemap be made for all the dynamic URLs?
What would be your approach to a sitemap with thousands of pages and a structure like Directory > Subdirectory > Subdirectory > Files?
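Either way, a category-style HTML sitemap can be generated by grouping the URLs on their leading path segments. A rough sketch, assuming the Directory > Subdirectory > Files structure described in the question (domain hypothetical):

```python
from collections import defaultdict
from urllib.parse import urlparse

def group_urls_by_category(urls, depth=2):
    """Group URLs by their first `depth` path segments, so each
    category group can become its own HTML sitemap page."""
    groups = defaultdict(list)
    for url in urls:
        segments = [s for s in urlparse(url).path.split("/") if s]
        key = "/".join(segments[:depth]) or "root"
        groups[key].append(url)
    return dict(groups)
```

Each resulting group can then be rendered as one sitemap page, keeping any single page from listing thousands of links.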
Intermediate & Advanced SEO | | Malika10 -
Client has moved to secure (https) webpages but non-secure (http) pages are still being indexed in Google. Is this an issue?
We are currently working with a client that relaunched their website two months ago to serve hypertext transfer protocol secure (https) pages across their entire site architecture. The problem is that their non-secure (http) pages are still accessible and being indexed in Google. Here are our concerns:
1. Are co-existing non-secure and secure webpages (http and https) considered duplicate content?
2. If these pages are duplicate content, should we use 301 redirects or rel canonicals?
3. If we go with rel canonicals, is it okay for a non-secure page to have a rel canonical to the secure version? Thanks for the advice.
Intermediate & Advanced SEO | | VanguardCommunications0 -
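For a protocol move like this, a sitewide 301 is the usual answer to concern 2, since it removes the http versions from the index rather than merely consolidating signals. A minimal Apache sketch, assuming the site runs Apache and the domain is hypothetical:

```apache
# Force every http:// request to its https:// equivalent with a 301
<VirtualHost *:80>
    ServerName www.example.com
    Redirect permanent / https://www.example.com/
</VirtualHost>
```

A canonical from http to https (concern 3) is valid and harmless, but it is a hint rather than a directive, so the redirect is the stronger fix.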
Would you rate-control Googlebot? How much crawling is too much crawling?
One of our sites is very large - over 500M pages. Google has indexed 1/8th of the site and tends to crawl between 800k and 1M pages per day. A few times a year, Google will significantly increase its crawl rate, overnight hitting 2M pages per day or more. This creates big problems for us: at 1M pages per day Google consumes 70% of our API capacity, and the API overall is at 90% capacity. At 2M pages per day, 20% of our page requests are 500 errors. I've lobbied for an investment / overhaul of the API configuration to allow for more Google bandwidth without compromising user experience. My tech team counters that it's a wasted investment, as Google will crawl to our capacity whatever that capacity is. Questions for enterprise SEOs:
* Is there any validity to the tech team's claim? I thought Google's crawl rate was based on a combination of PageRank and the frequency of page updates. That suggests there is some upper limit, which we perhaps haven't reached, but which would stabilize once reached.
* We've asked Google to rate-limit our crawl rate in the past. Is that harmful? I've always looked at a robust crawl rate as a good problem to have. Is 1.5M Googlebot API calls a day desirable, or something any reasonable enterprise SEO would seek to throttle back?
* What about setting a longer refresh rate in the sitemaps? Would that reduce the daily crawl demand? We could increase it to a month, but at 500M pages Google could still have a ball at the 2M pages/day rate. Thanks
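On the sitemap question, the relevant fields are lastmod and changefreq; both are hints rather than directives, so a longer refresh rate may only modestly reduce recrawl demand. A fragment of the sitemaps.org format, with a hypothetical URL:

```xml
<url>
  <loc>https://www.example.com/listing/12345</loc>
  <lastmod>2016-01-15</lastmod>
  <changefreq>monthly</changefreq>
</url>
```

In practice, an accurate lastmod tends to carry more weight with crawlers than changefreq, which search engines may ignore entirely.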
Intermediate & Advanced SEO | | lzhao0 -
Page 1 Reached: Further Page Improvements and What Next?
Moz, I have a particularly tricky competitive keyword for which I have finally climbed our website to the 10th position of page 1. I am particularly pleased about this, as all of the website and content is German, which I have little understanding of, and I have little support on this. I am pleased with the content and layout of the page, and I am monitoring all Google Analytics values very closely, as well as the SERP positions. So as far as further progression with this page and hopefully climbing further up page 1, where do you think I should focus my efforts? Page speed optimization? Building links to this page? Blogging on this topic (with links)? Mobile responsive design (more difficult)? Further improvements to pages and content linked from this page? Further improvements to the website in general? Further effort on tracking visitors and user experience monitoring (like setting up Crazy Egg or something)? Any other ideas would be greatly appreciated. Thanks all, James
Intermediate & Advanced SEO | | Antony_Towle0 -
Optimize the category page or a content page?
Hi, We wish to start ranking on a specific keyword ("log house prices" in Italian). We have two options for which page to optimize for this keyword:
1. A long content page (1000+ words, with images).
2. The log houses category page, optimized for the keyword (we have 50+ houses on this page, together with a short price summary).
I would think that we have a better chance of ranking with option 2, but then we can't use that page to rank for a more short-tail keyword (like "log houses"). What would you suggest? Is there maybe a third option for this?
Intermediate & Advanced SEO | | JohanMattisson0 -
Canonical category pages
A couple of years ago I used to receive a lot of traffic via my category pages, but now I don't receive as much. In the past year I've added canonical tags to the category pages. I have 15 genres for the category pages. Other than most-recent sorting there is no sorting available to users on the category pages, so a recently added image link can drop off to page 2 of the category over time. For example:
mysite.com/cat-page1.html = 100 image links per page with numbered page navigation, category pages 1-23. A new image link can drop off to page 2.
mysite.com/dog-page1.html = 100 image links per page with numbered page navigation, category pages 1-53. A new image link can drop off to page 2.
mysite.com/turtle-page1.html = 100 image links per page with numbered page navigation, category pages 1-2. A new image link can drop off to page 2.
On the first page (e.g. mysite.com/cat-page1.html) I've set rel canonical to mysite.com/cat-page1.html itself. One thing I have noticed is that the unique popup short-description tooltips on the image links only appear in Google for the first page of each category; it seems to ignore the other pages. In view of this, am I right in applying the canonical, or should I just treat them as normal pages? Thanks
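If the goal is to keep the deeper paginated pages (and the tooltips on them) eligible for indexing, one common pattern is for each paginated page to self-canonicalise rather than point at page 1. A hedged sketch, using the filenames from the examples above:

```html
<!-- On mysite.com/cat-page2.html: a self-referencing canonical, so the
     page consolidates its own parameter/tracking variants without
     telling search engines it is a duplicate of page 1 -->
<link rel="canonical" href="https://mysite.com/cat-page2.html" />
```

Canonicalising every paginated page to page 1 can cause search engines to drop pages 2+ from consideration, which would match the tooltip behaviour described.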
Intermediate & Advanced SEO | | Flapjack0