Duplicate Content - Captcha on Contact Form
-
I am going to be working on a site where the contact form is being flagged as duplicate content. The URL is the same apart from having:
- /contact/10119
- /contact/31010
...at the end of it.
The only difference in the content of the pages that I can see is the Captcha numbers.
Is there a way to overcome this to stop duplicate content?
Thanks in advance
-
You're welcome. Good luck with your project.
-
Thank you I was thinking something like that but I wasn't sure...
Thanks again!
-
The rel=canonical tag should fix the issue. Something like this in the head section of the page.
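The example snippet didn't come through in this thread; a canonical link element for the contact-form case would look something like this (example.com is a placeholder for the real domain), placed in the head of each /contact/XXXXX variant:

```html
<!-- In the <head> of /contact/10119, /contact/31010, etc. -->
<!-- All variants point search engines at one preferred URL -->
<link rel="canonical" href="https://www.example.com/contact/" />
```

With that in place, the engines should consolidate the /contact/10119-style URLs into the single canonical page rather than treating each as duplicate content.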
Here is the Google Article about Rel Canonical.
Hope this helps,
Don
Related Questions
-
International SEO And Duplicate Content Within The Same Language
Hello,
Currently, we have a .com English website serving an international clientele. As is the case, we do not currently target any countries in Google Search Console. However, the UK is an important market for us and we are seeing very low traffic (almost entirely US). We would like to increase visibility in the UK, but currently for English speakers only.
My question is this: would geo-targeting a subfolder have a positive impact on visibility/rankings, or would it create a duplicate content issue if both pieces of content are in English? My plan was:
1. Create a geo-targeted subfolder (website.com/uk/) that copies our website (we currently cannot create new unique content)
2. Go into GSC and geo-target the folder to the UK
3. Add the following to the /uk/ page to try to negate duplicate issues.
Additionally, I can add a rel=canonical tag if suggested; I just worry that, as an already international site, this will create competition between pages. However, as we are currently only targeting a location and not the language at this very specific point, would adding a ccTLD be advised instead? The threat of duplicate content worries me less here, as this is a topic Matt Cutts has addressed and said is not an issue. I prefer the subfolder method to ccTLDs because it allows for more scalability, as in the future I would like to target other countries and languages. Ultimately, right now, the goal is to increase UK traffic. Outside of UK backlinks, would any of the above URL geo-targeting help drive traffic? Thanks
Technical SEO | | Tom3_150 -
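The snippet referred to by "Add the following to the /uk/ page" did not survive in this thread. A typical hreflang setup for a US/UK English split (using the website.com placeholder from the question) might look like this:

```html
<!-- Hypothetical hreflang annotations, placed in the <head> of BOTH
     the original page and its /uk/ copy -->
<link rel="alternate" hreflang="en-us" href="https://website.com/some-page/" />
<link rel="alternate" hreflang="en-gb" href="https://website.com/uk/some-page/" />
<!-- Fallback for English speakers in other countries -->
<link rel="alternate" hreflang="en" href="https://website.com/some-page/" />
```

The annotations must be reciprocal: every page in the set lists every variant, including itself.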
Duplicate Page Content and Titles from Weebly Blog
Anyone familiar with Weebly that can offer some suggestions? I ran a crawl diagnostics on my site and have some high priority issues that appear to stem from Weebly Blog posts. There are several of them and it appears that the post is being counted as "page content" on the main blog feed and then again when it is tagged to a category. I hope this makes sense, I am new to SEO and this is really confusing. Thanks!
Technical SEO | | CRMI0 -
Duplicate Content?
My site has been archiving our newsletters since 2001. It's been helpful because our site visitors can search a database for ideas from those newsletters. (There are hundreds of pages with similar titles: archive1-Jan2000, archive2-feb2000, archive3-mar2000, etc.) But, I see they are being marked as "similar content." Even though the actual page content is not the same. Could this adversely affect SEO? And if so, how can I correct it? Would a separate folder of archived pages with a "nofollow robot" solve this issue? And would my site visitors still be able to search within the site with a nofollow robot?
Technical SEO | | sakeith0 -
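For an archive like the one described above, a robots meta tag is the usual fix. Note that it is "noindex" (keep the page out of the index), not "nofollow", that addresses duplicate/similar-content warnings, and an internal site search is unaffected either way. A sketch, with a hypothetical archive path:

```html
<!-- In the <head> of each archived newsletter page,
     e.g. /archive/archive1-Jan2000.html -->
<meta name="robots" content="noindex, follow" />
```

The "follow" keeps link equity flowing through the archive even though the pages themselves are excluded from the index.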
Duplicated content on subcategory pages: how do I fix it?
Hello Everybody,
I manage an e-commerce website and we have a duplicated content issue for subcategories. The scenario is like this:
- /category1/subcategory1
- /category2/subcategory1
- /category3/subcategory1
A single subcategory can fit multiple categories, so we have 3 different URLs for the same subcategory with the same content (except for the navigation links). What are the best practices to avoid this issue? Thank you!
Technical SEO | | uMoR0 -
How to fix duplicate page content error?
SEOmoz's Crawl Diagnostics is complaining about a duplicate page error. Examples of links with the duplicate page content error: http://www.equipnet.com/misc-spare-motors-and-pumps_listid_348855 and http://www.equipnet.com/misc-spare-motors-and-pumps_listid_348852. These are not duplicate pages; some values differ between them, like the listing #, EquipNet tag #, and price. I am not sure how to highlight the things that differ between the two pages, like the Equipment Tag # and listing #. Would it help if I used some style attribute to highlight such values on the page? Please help me with this, as I am not really sure why the tool thinks both pages have the same content. Thanks!!!
Technical SEO | | RGEQUIPNET0 -
Duplicate content
I'm getting an error showing that two separate pages have duplicate content. The pages are:
- Help System: Domain Registration Agreement - Registrar Register4Less, Inc. (http://register4less.com/faq/cache/11.html)
- Help System: Domain Registration Agreement - Register4Less Reseller (Tucows) (http://register4less.com/faq/cache/7.html)
These are both registration agreements, one for us (Register4Less, Inc.) as the registrar, and one for Tucows as the registrar. The pages are largely the same, but are in fact different. Is there a way to flag these pages as not being duplicate content? Thanks, Doug.
Technical SEO | | R4L0 -
Thin/Duplicate Content
Hi Guys, So here's the deal: my team and I just acquired a new site using some questionable tactics. Only about 5% of the entire site is actually written by humans; the rest of the 40k+ pages (increasing by 1-2k auto-generated pages a day) are all auto-generated, thin content. I'm trying to convince the powers that be that we cannot continue to do this. Now I'm aware of the issue, but my question is: what is the best way to deal with this? Should I noindex these pages at the directory level? Should I 301 them to the most relevant section where actual valuable content exists? So far it doesn't seem like Google has caught on to this yet, and I want to fix the issue while not raising any more red flags in the process. Thanks!
Technical SEO | | DPASeo0 -
Duplicate content?
I have a question regarding a warning that I got on one of my websites: it says duplicate content. I'm using canonical URLs and am also blocking Google from the pages that it's warning me about. The pages are not indexed by Google, so why do I get the warnings? Thanks for the great SEO tools!
Technical SEO | | bnbjbbkb0