Duplicate Content Errors
-
OK, I'm an old fat-client developer who is new to SEO, so I apologize if this is obvious.
I have four errors in one of my campaigns: two are duplicate content and two are duplicate title.
Here is the duplicate title error:
Rare Currency And Old Paper Money Values and Information.
http://www.antiquebanknotes.com/
http://www.antiquebanknotes.com/Default.aspx
So, my question is... what do I need to do to make this right? They are the same page. In my Page_Load for Default.aspx I have this:
this.Title = "Rare Currency And Old Paper Money Values and Information.";
And it occurs only once...
-
They offer IIS7, but I didn't see any point to it...
Now it's dawning on me I may need to reconsider.
-
I don't think it works on IIS6.
There is your problem.
You can fix your www problem in code (see the tutorials), but fixing the default page is not so easy, as in ASP.NET you cannot detect the difference between domain.com and domain.com/default.aspx.
The best thing you can do is make sure all your internal links point to domain.com and not domain.com/default.aspx. Any external links pointing to default.aspx will be wasted, but since no internal links point to default.aspx you are unlikely to get any more external links pointing to it.
GoDaddy must be using old servers.
-
It's IIS6 hosted on GoDaddy. I will see if I can get my IIS restarted.
-
I can't see anything wrong with your web.config.
I should have asked before: are you using IIS7?
If so, try to debug with this article: http://blogs.iis.net/ruslany/archive/2008/10/30/debug-and-troubleshoot-rewrite-rules-easily.aspx
I have the same code as you in many sites, all working fine.
You may need to restart IIS.
-
Nor is the non-www redirected to the www.
The code I gave you should be inside the configuration tag:
<configuration>
  <!-- place the rewrite code here -->
</configuration>
And you should only have one <system.webServer> tag; see the skeleton below.
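As a rough illustration of the layout (a sketch only; your real web.config will have its own existing sections, shown here as comments):

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <!-- existing sections such as <appSettings> and <system.web> stay where they are -->
  <system.webServer>
    <rewrite>
      <rules>
        <!-- both rewrite rules go inside this single <rules> element -->
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```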
If you want, you can send me your web.config and I'll have a look.
-
OK, I have that in place... if I'm testing right, the default.aspx redirect isn't working. Maybe it's a hosting issue?
-
What this line is saying is: if the host is not antiquebanknotes.com, then redirect (note the negate="true").
<add input="{HTTP_HOST}" pattern="^.antiquebanknotes.com$" negate="true"></add>
What you should have is:
<system.webServer>
  <rewrite>
    <rules>
      <rule name="CanonicalHostNameRule1">
        <match url="(.*)" />
        <conditions>
          <add input="{HTTP_HOST}" pattern="^www.antiquebanknotes.com$" negate="true" />
        </conditions>
        <action type="Redirect" url="http://www.antiquebanknotes.com/{R:1}" />
      </rule>
      <rule name="Default Page" enabled="true" stopProcessing="true">
        <match url="^default.aspx$" />
        <conditions logicalGrouping="MatchAll">
          <add input="{REQUEST_METHOD}" pattern="GET" />
        </conditions>
        <action type="Redirect" url="/" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
Try that; let me know when you have it in place and I will test it for you.
-
I thought that pattern was OK for me because I am looking to take AntiqueBanknotes.com and make it into www.AntiqueBanknotes.com, etc.
I think my hostname redirect is working OK... it's the default.aspx redirect that isn't working. I have it in the same rules section.
Does this look OK?
<rule name="Default Page" enabled="true" stopprocessing="true"><match url="^default.aspx$"><conditions logicalgrouping="MatchAll"><add input="{REQUEST_METHOD}" pattern="GET"></add></conditions>
<action type="Redirect" url="/"></action></match></rule> -
You have an error above; you should have:
pattern="^www.antiquebanknotes.com$"
not pattern="^antiquebanknotes\.com$"
The other rule does work; I just tested it.
When adding it, place it straight under the last rule inside the same rules tag; there should only be one rules tag.
-
Sorry, I forgot to add that I put that rule in before, but it doesn't seem to be working for me. Not sure why, but sadly it's time for my day job, so I will take a look tonight.
-
Add the rule on this page to it:
http://perthseocompany.com.au/seo/tutorials/how-to-fix-canonical-issues-involving-the-default-page
It should work, no worries.
-
Ah, I see... I have added the snippets to the web.config for the default page and for www vs. non-www pages. It seems like the www issue is working correctly. I will have to work on the default issue, as it seems to ignore my rule.
This is what I have done on that score.
<rule name="CanonicalHostNameRule1"><match url="(.*)"><conditions><add input="{HTTP_HOST}" pattern="^antiquebanknotes.com$" negate="true"></add></conditions>
<action type="Redirect" url="<a href=" http:="" www.antiquebanknotes.com="" {r:1"="">http://www.AntiqueBanknotes.com/{R:1}" />
</action></match></rule>Thanks for all your help!
-
Using IIS writes to the web.config file for you; the tutorials show you the code it writes, so you can just copy it into the web.config.
There are also some tutorials on how to make an HttpModule that works much like the global.asax file but intercepts the request before it reaches the website, giving much better performance.
Also, I checked your non-www and it resolved to this:
http://www.antiquebanknotes.com/antiquebanknotes/default.aspx
I notice above you don't have the quotes on the first URL; is that just a typo?
This tutorial uses the HttpModule; it resolves the other way, from www to non-www, but you can alter it:
http://perthseocompany.com.au/seo/tutorials/using-ihttpmodule-c-sharp
Or you can just edit the web.config as I suggested above.
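If you go down the HttpModule route, this is roughly the shape of it (a sketch only: the class name is made up, it redirects non-www to www rather than the tutorial's direction, and you still need to register it in web.config):

```csharp
using System;
using System.Web;

// Register in web.config: <system.web><httpModules> on IIS6,
// or <system.webServer><modules> on IIS7 integrated mode.
public class CanonicalHostModule : IHttpModule
{
    public void Init(HttpApplication context)
    {
        context.BeginRequest += OnBeginRequest;
    }

    private static void OnBeginRequest(object sender, EventArgs e)
    {
        HttpApplication app = (HttpApplication)sender;
        Uri url = app.Context.Request.Url;

        // Redirect the bare domain to the www host, keeping the path and query string.
        if (url.Host.Equals("antiquebanknotes.com", StringComparison.OrdinalIgnoreCase))
        {
            string target = "http://www.antiquebanknotes.com" + url.PathAndQuery;
            app.Context.Response.Status = "301 Moved Permanently";
            app.Context.Response.AddHeader("Location", target);
            app.Context.Response.End();
        }
    }

    public void Dispose() { }
}
```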
-
Hi Alan,
I appreciate the links, and now I know what the problem is... sadly, though, I am not able to access IIS as this is running on GoDaddy... I guess I will have to implement something in the global.asax file?
I have this code in the global.asax to handle the www and non-www page issue:
string request = HttpContext.Current.Request.Url.ToString().ToLower();
if (request.Contains("http://antiquebanknotes.com"))
{
    HttpContext.Current.Response.Clear();
    HttpContext.Current.Response.Status = "301 Moved Permanently";
    HttpContext.Current.Response.AddHeader("Location", request.Replace (http://antiquebanknotes.com", "http://www.antiquebanknotes.com));
}
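For completeness, a corrected sketch of that snippet (assumptions: it sits in Application_BeginRequest, the missing quotes noted above are restored, and Response.End() is added so nothing else runs after the redirect):

```csharp
void Application_BeginRequest(object sender, EventArgs e)
{
    string request = HttpContext.Current.Request.Url.ToString().ToLower();

    // Only the bare domain matches; "http://www.antiquebanknotes.com" does not contain this substring.
    if (request.Contains("http://antiquebanknotes.com"))
    {
        HttpContext.Current.Response.Clear();
        HttpContext.Current.Response.Status = "301 Moved Permanently";
        HttpContext.Current.Response.AddHeader("Location",
            request.Replace("http://antiquebanknotes.com", "http://www.antiquebanknotes.com"));
        HttpContext.Current.Response.End();
    }
}
```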
-
They are seen as two different pages by search engines, splitting your rank.
Here is a tutorial just for this problem:
http://perthseocompany.com.au/seo/tutorials/how-to-fix-canonical-issues-involving-the-default-page
You may also want to have a look at this one for your www and non-www pages:
http://perthseocompany.com.au/seo/tutorials/how-to-fix-canonical-domain-name-issues