Duplicate URL Parameters for Blog Articles
-
Hi there,
I'm working on a site that uses parameter URLs for category pages listing blog articles.
The content on these pages changes constantly as new posts are frequently added. A category might be 'Health Articles' and list 10 blog posts (snippets from the blog). With filtering applied, the URLs can look like this:
-
www.domain.com/blog/articles/?taxonomy=health-articles&taxon=general
-
www.domain.com/blog/articles/?taxonomy=health-articles&taxon=general&year=2016
-
www.domain.com/blog/articles/?taxonomy=health-articles&taxon=general&year=2016&page=1
-
All of these pages currently have the same meta title and description due to limitations with the CMS, and they are also not in our XML sitemap.
I don't believe we should focus on ranking these pages, since their content comes from blog posts (which we do want to rank for on the individual posts), but there are 3,000 duplicates and they need to be fixed.
Below are the options we have so far:
Canonical URLs
Have all parameter pages within the category canonicalize to www.domain.com/blog/articles/?taxonomy=health-articles&taxon=general and generate dynamic page titles (I know it's fine to use parameter URLs as canonical URLs).
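For reference, this would mean every filtered variation carries a canonical tag in its head pointing at the base category URL — a sketch using the example URLs above (note the ampersands must be escaped as &amp;amp; inside an HTML attribute):

```html
<!-- On /blog/articles/?taxonomy=health-articles&taxon=general&year=2016&page=1 -->
<!-- and every other filtered variation of this category page: -->
<head>
  <link rel="canonical"
        href="https://www.domain.com/blog/articles/?taxonomy=health-articles&amp;taxon=general" />
</head>
```

One caveat worth checking: canonicalizing deeper pagination pages (page=2, page=3…) to page 1 is generally discouraged when the pages list different posts, since Google may then ignore the content on the later pages.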
WMT Parameter tool
Tell Google all extra parameter tags belong to the main pages (e.g. www.domain.com/blog/articles/?taxonomy=health-articles&taxon=general&year=2016&page=3 belongs to www.domain.com/blog/articles/?taxonomy=health-articles&taxon=general).
Noindex
Noindex all the blog category pages. I'm not sure how Google would react if we removed 3,000 pages from the index (we have roughly 1,700 unique pages).
We're very limited in what we can do to these pages, so any feedback or suggestions would be much appreciated.
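If you went the noindex route, the usual implementation is a robots meta tag on each parameter page (or the equivalent X-Robots-Tag HTTP header) — a sketch:

```html
<!-- In the <head> of each filtered category page: -->
<!-- "follow" keeps Google crawling through to the individual blog posts -->
<meta name="robots" content="noindex, follow" />
```

Using `noindex, follow` rather than `noindex, nofollow` matters here, since the category pages are how crawlers reach the older posts you do want indexed.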
Thanks!
-
Unfortunately, it's hard to say these days whether they respect content loaded on scroll like that.
-
Thanks Martijn,
That sounds like a good idea. We were also considering a JavaScript option where we remove the pagination and load content on scroll, but I'm still 50/50 on whether hidden content like that is crawled or ignored.
-
Thanks Anthony,
We are using rel=prev/next on the pagination for these blog pages, which does reduce duplication, but because of the parameter filters we still have thousands of duplicates.
That's a good point about the indexing of older blogs!
-
I would simply set up rel=next/prev on the paginated series and not worry too much about duplicate title tags or canonical tags. You want to make sure Google continues to crawl deep into your blog pagination and can access older blog posts.
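For anyone following along, the rel=next/prev markup goes in the head of each page in the paginated series — a sketch for page 2 of the asker's example category (URLs illustrative):

```html
<!-- On /blog/articles/?taxonomy=health-articles&taxon=general&page=2 -->
<link rel="prev"
      href="https://www.domain.com/blog/articles/?taxonomy=health-articles&amp;taxon=general&amp;page=1" />
<link rel="next"
      href="https://www.domain.com/blog/articles/?taxonomy=health-articles&amp;taxon=general&amp;page=3" />
```

The first page of the series gets only rel=next, and the last page only rel=prev.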
-
Hi,
What I would do is use both the canonical URLs and the Google Search Console parameter settings. The canonicals make sure the pages won't be seen as duplicates, and in addition you might want to make sure Google isn't visiting these pages at all, to save your crawl budget for the more important pages on your site.
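As a sketch of the crawl-budget piece, blocking the filter parameters in robots.txt would look something like this (hypothetical directives matching the example URLs above — test against your own URL patterns before deploying):

```text
# robots.txt — stop crawling of year/page filter combinations
User-agent: *
Disallow: /*&year=
Disallow: /*&page=
```

Note the trade-off: once a URL is blocked in robots.txt, Google can no longer see its canonical or noindex tag, so you would typically let those signals get picked up first before blocking crawl.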
Martijn.