Best URL format for pagination
-
We're currently changing the URL format of our website search. We've been discussing it a lot and can't decide on the best way to pass the pagination parameter for SEO.
We've narrowed it down to two options:
- www.website.com/apples?page=2
- www.website.com/apples/page/2
Which would give us the best ranking returns? What do you think?
-
Honestly, none of these URL structures is going to have an impact on your rankings, but my preference would be for www.website.com/apples?page=2.
The more important thing is to make sure that you have your canonical tags and rel=next/prev implemented properly.
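As a rough sketch of what that markup might look like in the head of page 2 (the URLs here are placeholders based on the question, not a real site):

<link rel="canonical" href="https://www.website.com/apples?page=2">
<link rel="prev" href="https://www.website.com/apples">
<link rel="next" href="https://www.website.com/apples?page=3">

Note that each paginated page canonicals to itself rather than to page 1; pointing every page's canonical at page 1 would tell Google to ignore the deeper pages.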
Related Questions
-
Old URLs Appearing in SERPs
Thirteen months ago we removed a large number of non-corporate URLs from our web server. We created 301 redirects, and in some cases we simply removed the content because there was no place to redirect to. Unfortunately, all of these pages still appear in Google's SERPs (not Bing's), both the 301'd pages and the pages we removed without redirecting. When you click on the redirected pages in the SERPs you do get redirected, so we have ruled out any problems with the 301s. We have already resubmitted our XML sitemap, and when we run a crawl using Screaming Frog we do not see any of these old pages being linked to on our domain. We have a few different approaches we're considering to get Google to remove these pages from the SERPs and would welcome your input:
1. Remove the 301 redirects entirely so that visits to those pages return a 404 (much easier) or a 410 (would require some setup/configuration via WordPress). This of course means that anyone visiting those URLs won't be forwarded along, but Google may not drop those pages from the SERPs otherwise.
2. Request that Google temporarily block those pages (done via GWMT), which lasts for 90 days.
3. Update robots.txt to block access to the redirecting directories.
Thank you. Rosemary
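For what it's worth, a minimal sketch of option 1's 410 route in Apache, assuming the removed pages live under directories like the hypothetical /old-section/ below (the [G] flag makes Apache answer 410 Gone):

RewriteEngine On
# Hypothetical path; substitute the directories that were actually removed
RewriteRule ^old-section/ - [G]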
Technical SEO | RosemaryB
-
Special characters in URL
Will a registered trademark symbol within a URL be bad? I know some special characters are unsafe (#, >, etc.) but cannot find anything that mentions the registered trademark symbol. Thanks!
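For reference, browsers and crawlers percent-encode non-ASCII characters like ® using their UTF-8 bytes, so a hypothetical URL would be requested in its encoded form:

www.website.com/acme®-widgets  ->  www.website.com/acme%C2%AE-widgets

The domain and path here are made up for illustration; the encoding (%C2%AE for ®) is the part that matters.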
Technical SEO | bonnierSEO
-
404 Best Practices
Hello All, About two months ago there was a massive spike in the number of crawl errors on my site according to Google Webmaster Tools. I handled this by sending my webmaster a list of the broken pages along with working pages that they should 301 redirect to. Admittedly, when I looked back a couple of weeks later, the number had gone down only slightly, so I sent another list to him (I didn't realize that you could 'Mark as fixed' in Webmaster Tools). So when I sent him more, he 301 redirected them again (with many duplicates), as he was told, without really digging any deeper. Today, when I talked about more redirects, he suggested that 404s do have a place: if they are actually pages that don't exist anymore, then a ton of 301 redirects may not be the answer. So my two questions are:
1. Should I continue to relentlessly try to get rid of all 404s on my site, and if so, do I have to be careful not to be lazy and just send most of them to the homepage?
2. Are there any tools or really effective ways to remove duplicate 301 redirect records from my .htaccess (because the size of it at this point could very well be slowing down my site)? See the sketch below for one quick check.
Any help would be appreciated, thanks
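On question 2, a hedged sketch: if the redirects are written one per line as Redirect 301 directives, standard Unix tools will list the exact duplicates without editing anything:

grep "^Redirect 301" .htaccess | sort | uniq -d

This only catches byte-identical lines; RewriteRule-style redirects or lines that differ in whitespace would need a closer look.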
Technical SEO | CleanEdisonInc
-
Temporary Redirect on a Nonexistent URL
I'm getting a Temporary Redirect issue flagged: http://www.luckygemstones.com/botswana-legends.htm is returning a 302 redirect to http://www.luckygemstones.com/page-not-found.htm. Yet there is no such page on my site. I believe I had one once, but it has been corrected for a while now. Why is SEOmoz picking this up as an error, and how can I fix it? Kathleen http://www.luckygemstones.com
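One common cause of exactly this pattern, offered here as a guess since the server config isn't shown: an Apache ErrorDocument pointing at a full URL, which makes Apache issue a 302 redirect to the error page instead of serving it with a 404 status. Using a local path keeps the 404:

# Serves the error page directly with a 404 status (no redirect)
ErrorDocument 404 /page-not-found.htm
# A full URL here would instead trigger the 302 behavior:
# ErrorDocument 404 http://www.luckygemstones.com/page-not-found.htm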
Technical SEO | spkcp111
-
I need help defining the best friendly URL structure
Hi, I need some help defining the best friendly URL structure for my new project. I'm in doubt about some cases; could anyone help me decide which would be the best way?
- domain.com/buy-online/0-1,this-cool-model
- domain.com/buy-online/this-cool-model,0-1
- domain.com/buy-online/0-1/this-cool-model
- domain.com/buy-online/this-cool-model/0-1
- domain.com/buy-online/this-cool-model_0-1
- domain.com/buy-online/this-cool-model?Model=0&OtherParam=1
Thanks! Best Regards,
Leonardo Lima
Technical SEO | LeonardoLima
-
Best Practice to Remove a Blog
Note: re-posting since I accidentally marked this as answered. Hi, I have a blog with thousands of URLs; the blog is part of my site. I would like to retire the blog, and I think the best choices are:
1. 404 them: the problem is the large number of 404s. I know this is OK, but it makes me hesitant.
2. Add a noindex/nofollow meta tag: this would be great, but the question is that they are already indexed.
Thoughts? Thanks. PS: A 301 redirect to the main page would be flagged as a soft 404.
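For reference, the meta tag discussed in option 2 would look like this in the head of each blog page (a sketch; whether nofollow belongs alongside noindex is a judgment call):

<meta name="robots" content="noindex, nofollow">

One caveat worth hedging: Google has to be able to crawl a page to see this tag, so the blog URLs shouldn't also be blocked in robots.txt while waiting for them to drop out of the index.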
Technical SEO | Bucky
-
URL rewriting from subcategory to category
Hello everybody! I have quite a simple question about URL rewriting from subcategory to category, yet I can't find any solution to this problem (due to my lack of deeper Apache knowledge). Here is my problem/question: we have two website URL structures that cause duplicate problems:
1. www.website.lt/language/category/
2. www.website.lt/language/category/1/
Pages 1 and 2 are exactly the same (both also return 200 OK). What we need is a 301 redirect from 2 to 1, without redirecting any deeper categories (like www.website.com/language/category/1/169/ redirecting to .../category/1/ or .../category/). Here are the .htaccess URL rewrite rules:
RewriteRule ^([^/]{1,3})/([^/]+)/([^/]+)/([^/]+)/([^/]+)/([^/]+)/$ /index.php?lang=$1&idr=$2&par1=$3&par2=$4&par3=$5&par4=$6&%{QUERY_STRING} [L]
RewriteRule ^([^/]{1,3})/([^/]+)/([^/]+)/([^/]+)/([^/]+)/$ /index.php?lang=$1&idr=$2&par1=$3&par2=$4&par3=$5&%{QUERY_STRING} [L]
RewriteRule ^([^/]{1,3})/([^/]+)/([^/]+)/([^/]+)/$ /index.php?lang=$1&idr=$2&par1=$3&par2=$4&%{QUERY_STRING} [L]
RewriteRule ^([^/]{1,3})/([^/]+)/([^/]+)/$ /index.php?lang=$1&idr=$2&par1=$3&%{QUERY_STRING} [L]
RewriteRule ^([^/]{1,3})/([^/]+)/$ /index.php?lang=$1&idr=$2&%{QUERY_STRING} [L]
RewriteRule ^([^/]{1,3})/$ /index.php?lang=$1&%{QUERY_STRING} [L]
There are other redirects that handle non-www to www and related issues:
RedirectMatch 301 ^/lt/$ http://www.domain.lt/
RewriteCond %{HTTP_HOST} ^domain.lt
RewriteRule (.*) http://www.domain.lt/$1 [R=301,L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} !(.*)/$
RewriteRule ^(.*)$ http://www.domain.lt/$1/ [R=301,L]
At the moment we cannot solve this problem with rel=canonical (due to our CMS limits). Thanks for your help, guys! If you need any other details on our code, just let me know.
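A hedged sketch of one rule that might do what's being asked, placed above the internal rewrites; the {1,3} language-code pattern is borrowed from the existing rules, and this is untested against the actual setup:

# 301 /lang/category/1/ to /lang/category/, leaving deeper paths like /lang/category/1/169/ untouched
RewriteRule ^([^/]{1,3})/([^/]+)/1/$ /$1/$2/ [R=301,L]

Because both captured segments use [^/]+ and the pattern is anchored at the end, deeper URLs don't match and keep flowing through the existing rules.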
Technical SEO | jkundrotas
-
Best blocking solution for Google
Posting this for Dave Sottimano. Here's the scenario: you've got a set of URLs indexed by Google, and you want them out quickly. Once you've managed to remove them, you want to block Googlebot from crawling them again, for whatever reason. Below is a sample of the URLs you want blocked, but you only want to block /beerbottles/ and anything past it:
www.example.com/beers/brandofbeer/beerbottles/1
www.example.com/beers/brandofbeer/beerbottles/2
www.example.com/beers/brandofbeer/beerbottles/3
etc.
To remove the pages from the index, should you:
1. Add the meta noindex,follow tag to each URL you want de-indexed
2. Use GWT to help remove the pages
3. Wait for Google to crawl again
If that's successful, to block Googlebot from crawling again, should you add this line to robots.txt:
DISALLOW */beerbottles/
or add this line:
DISALLOW: /beerbottles/
"To add the * or not to add the *, that is the question" Thanks! Dave
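For what it's worth, a hedged sketch of the form that matches the sample URLs: Googlebot supports the * wildcard in Disallow paths (the original robots.txt standard did not), and the colon after the directive is required, so a rule covering /beerbottles/ at any depth might look like:

User-agent: Googlebot
Disallow: /*/beerbottles/

The leading /* lets the rule match /beers/brandofbeer/beerbottles/... without naming the parent directories.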
Technical SEO | goodnewscowboy