URL parameters causing duplicate content errors
-
My ISP implemented product reviews. In doing so, each page has a possible parameter string of ?wr=1.
I am now receiving duplicate page content and duplicate page title errors for all my product URLs. The report shows both the base URL and the base URL?wr=1.
My ISP says that the search engines won't have a problem with the parameters, and a check of Google Webmaster Tools for my site shows no errors and recommends against configuring URL parameters.
How can I get SEOmoz to stop reporting these errors?
-
So far I have not found a way to exclude that parameter from the SEOmoz crawl of the website.
The good news is that GWT doesn't show duplicate content errors the way SEOmoz does. If anything, you can send a feature request to see if that can be taken care of.
-
You always have to remember that how the search engines view your site matters more than what your SEO reporting tools say.
If you see the search engines having any problems with duplicate pages being indexed, take these steps:
-
Use Google Webmaster Tools to Ignore the Parameters (?wr=1)
-
Use a canonical tag on the pages with this parameter, pointing to the regular (base) URL
-
Make sure you link internally and externally to the canonical version of your page
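The canonical step above can be sketched as markup (a minimal example using a hypothetical product URL, not one from the report):

```html
<!-- Placed in the <head> of the parameterized page,
     e.g. http://www.example.com/product?wr=1 -->
<link rel="canonical" href="http://www.example.com/product" />
```

With this in place, crawlers that honor rel=canonical treat the ?wr=1 variant as a duplicate of the base URL rather than a separate page.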
-
Got a burning SEO question?
Subscribe to Moz Pro to gain full access to Q&A, answer questions, and ask your own.
Browse Questions
Explore more categories
-
Moz Tools
Chat with the community about the Moz tools.
-
SEO Tactics
Discuss the SEO process with fellow marketers
-
Community
Discuss industry events, jobs, and news!
-
Digital Marketing
Chat about tactics outside of SEO
-
Research & Trends
Dive into research and trends in the search industry.
-
Support
Connect on product support and feature requests.
Related Questions
-
Copy partial content to other pages?
One of our clients is looking to redesign their website. Since we're redesigning the whole website, we thought it would be a good idea to separate the services into individual pages, so every service will have its own page (currently there is one page that describes all of the services). What we're planning to do is write unique content for each service page (about 300-400 words), but we also want to reuse some of the existing content, which explains the process of the provided services. So here I need your help! What would be the best practice for using the same part of the existing content on every service page without getting penalized for duplicate content? Here is how we want to structure the page with h1 and h2:

<main>
Service name (same as page title)
Subline
New and unique content, about 300-400 words
Part of the old content, which is going to be placed on every service page
</main>

Any help would be much appreciated!
Web Design | | MozPro30 -
Help with error: Not Found The requested URL /java/backlinker.php was not found on this server.
Hi all, We've had this error for almost a month now. Until now we were outsourcing the web design and optimization; now we are doing it in house, and the previous company did not give us all the information we should know. We've been trying to find and fix this error with no result. Have you encountered this issue before? Does anyone know a solution? Also, would this affect our website in terms of SEO and in general? We would be very grateful to hear from you. Many thanks. Here is what appears at the bottom of the site (www.manvanlondon.co.uk): Not Found The requested URL /java/backlinker.php was not found on this server. <address>Apache/2.4.7 (Ubuntu) Server at 01adserver.com Port 80</address>
Web Design | | monicapopa0 -
W3C: My site has 157 errors and 146 warnings. Is it an issue?
Is having this number of W3C errors and warnings an issue, and will it impact my site's performance? When the site was built 6 months ago, my developers told me that it "was nothing to worry about", but I have read that any errors aren't good, let alone the huge number my site has. Your advice please. Thanks, Ash
Web Design | | AshShep10 -
How to handle international duplicate content?
Hi, We have multiple international e-commerce websites. Usually our content is translated and the sites don't interfere with each other, but how do search engines react to duplicate content on different TLDs? We have copied our Dutch (NL) store for Belgium (BE), and I'm wondering if we could be inflicting damage onto ourselves... Should I use: for every page? Are there other options so we can be sure that our websites aren't conflicting? Are they conflicting at all? Alex
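For reference, the documented way to tell search engines that two TLDs carry the same content for different regions is hreflang annotations; a minimal sketch, assuming hypothetical example.nl and example.be storefronts:

```html
<!-- Placed in the <head> of both the NL and BE versions of a page;
     each variant lists every variant, including itself -->
<link rel="alternate" hreflang="nl-NL" href="http://www.example.nl/pagina/" />
<link rel="alternate" hreflang="nl-BE" href="http://www.example.be/pagina/" />
```

This signals that the pages are regional alternates rather than conflicting duplicates.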
Web Design | | WebmasterAlex0 -
Could our drop in organic rankings have been caused by improper mobile site set-up?
Site: 12-year-old financial service 'information' site with a lead-gen business model. Historically it has held top-10 positions for its top keywords and phrases.

Background: Organic traffic from Google has fallen to 50% of what it was over the past 4 months, compared to the same months last year. While several potential factors could be responsible or contributing (not least my proactive removal of a dozen old emat links that may be perceived as unnatural, despite no warning), this drop coincides with the period in which the mobile site was launched. Because I admittedly know the least about this potential cause, I am turning to the forum for assistance.

Because the site is ~200 pages and contains many 'custom' pages with financial tables, forms, data pulled from third parties, and custom/different layouts, we opted to create a mobile site of only the top 12 most popular pages/topics just to have a mobile presence (instead of re-coding the entire site to make it responsive using a mobile CSS).

-These mobile pages were set up on an "m." subdomain.
-We used bi-directional tagging, placing a rel=canonical tag on the mobile page and a rel=alternate tag on the desktop page. This created a loop between the pages, as advised by Google.
-Some mobile pages used content from a sub-page, not the primary desktop page for a particular topic. This may have broken the bi-directional 'loop': the rel=canonical on the mobile page would point to a subpage, while the rel=alternate would point to the primary desktop page, even though the content did not come from that page. The primary desktop page is the one that ranks for related keywords. In these cases, the "loop" would be broken. Is this a cause for concern? Could the authority held by the desktop page not be transferred to the mobile version, or could the mobile page 'pull away' or disperse the strength of the desktop page if that 'loop' was not connected?

Could not setting up the bi-directional tags correctly cause a drop in the organic rankings?

-Our developer verified the site is set up according to Google's guidelines for identifying device screen size and serving the appropriate version of the page.
-Are there any tools or utilities that I can use to identify issues and/or verify everything is configured correctly?
-Are we missing anything important in the set-up/configuration?
-Could the use of a brand new 'm.' subdomain in and of itself be causing issues?
-Have I identified any negative SEO practices or pitfalls? Am I missing or overlooking something?

While I would have preferred maintaining a single, responsive site with a mobile CSS, it was not realistic given the various layouts and the owner's desire to only offer the top pages in mobile format. The mobile site may have nothing to do with the organic drop, but I'd like to rule it out if so, and I have so many questions. If anyone could address my concerns, it would be greatly appreciated. Thanks! Greg
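The bi-directional tagging described in the question can be sketched as follows (hypothetical URLs; this mirrors the separate-mobile-URLs annotation pattern Google documents):

```html
<!-- On the desktop page, e.g. http://www.example.com/page -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://m.example.com/page" />

<!-- On the corresponding mobile page, e.g. http://m.example.com/page -->
<link rel="canonical" href="http://www.example.com/page" />
```

The loop is only intact when both tags reference the same desktop/mobile pair; if the mobile page canonicalizes to a subpage while the primary desktop page points its rel=alternate at that mobile page, the annotations no longer agree.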
Web Design | | seagreen0 -
Tips on a website redesign for a site with messy URLs?
So I've inherited quite a messy website. It was in Drupal and the owner wants it in WordPress. One of the problems is the link paths. Should I try to recreate them exactly (i.e. something/somethingelse/page/) or use redirects (which I'm not confident in doing)? Also, some of the pages end in .html, others in a trailing slash, and others without slashes; there's no consistency. Do you have any tips in general? I remember an older SEOmoz blog post about successful website relaunches (with press releases and mass emails and stuff being sent out on launch to boot). Thanks!
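If the WordPress permalinks end up differing from the old Drupal paths, 301 redirects from old to new are the usual approach; a minimal .htaccess sketch with hypothetical paths (assuming Apache, which WordPress's own rewrite rules already rely on):

```apache
# Map an old Drupal path to its new WordPress permalink
Redirect 301 /something/somethingelse/page.html /new-section/page/

# Normalize inconsistent endings: redirect any .html URL
# to its extensionless, trailing-slash form
RewriteEngine On
RewriteRule ^(.*)\.html$ /$1/ [R=301,L]
```

Placing such rules above the WordPress block in .htaccess ensures they are evaluated before WordPress's own rewrites.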
Web Design | | seonubblet0 -
Need help with website URL structure
I have been working on a brand new website. Currently it is live, but I have disallowed Googlebot temporarily as I didn't want any negative impact. The business of the site is to generate leads; they install and sell stairlifts and used stairlifts. There are two main categories, New Stairlifts and Reconditioned Stairlifts. Currently the URL for new stairlifts is http://willowstairlifts.co.uk/stairlifts/ and for reconditioned stairlifts it is http://willowstairlifts.co.uk/reconditioned-stairlifts/. My concern is that the word "stairlifts" is mentioned twice in the URLs, so is it going to have a negative impact or a Panda penalty? I am thinking of changing them to http://willowstairlifts.co.uk/new/ and the product pages to http://willowstairlifts.co.uk/new/brooks/ (currently it's http://willowstairlifts.co.uk/stairlifts/brooks/). Same with reconditioned stairlifts: I'd like to change it to http://willowstairlifts.co.uk/reconditioned and its product pages to http://willowstairlifts.co.uk/reconditioned/brooks/, as currently it's http://willowstairlifts.co.uk/reconditioned-stairlifts/brooks/. Thanks
Web Design | | conversiontactics0 -
Is there any difference in using an underscore vs. a dash in the directory portion of the url?
A friend who is a software developer asked this question regarding the directory portion of the URL: is it better to use dashes or underscores? I know that in the domain name Matt Cutts recommends dashes, but what about the directory portion?
Web Design | | RobertFisher0