Duplicate Content Reports
-
Hi
Duplicate content reports for a new client are showing very high numbers (8,000+). Many of them seem to be for sign-in, register, and login type pages. Is this a scenario where the best course of action is likely to be the parameter handling tool in GWT?
Cheers
Dan
-
Cool - many thanks Kurt!
All Best
Dan
-
You don't absolutely have to do both, but by doing the parameter handling you are sending Google another signal about what you want them to do (keep in mind that both canonicals and parameters are only treated as suggestions by Google). It's pretty simple to set up the parameter handling, so if you are really concerned about the duplicate content issues, why not do both?
Also, technically, the canonical tag tells Google which of the duplicate URLs they've crawled should be given prominence, whereas my understanding is that parameter handling (when Google follows your suggestions) actually prevents Google from even crawling URLs with those parameters. In other words, canonical tags tell Google what to do with URLs they've already crawled, and parameter handling tells Google which URLs not to even crawl.
-
Thanks Kurt
And what about the parameter handling tool? If the canonical tag method you mention will deal with this, is there any need to do anything with the parameter handling tool?
cheers
dan
-
I would answer the same as Kurt for the install. You put the noindex tag in the header of the core page, and so when all the other pages are generated with the parameters, it will be added to those pages automatically. Once you get the pages out of the index, I would then nofollow links to those pages, or use robots.txt to keep the bots out of them in the first place.
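As a quick sketch of what that looks like (assuming the sign-in pages are served by a single core page, e.g. a hypothetical login.php, so every parameterized variant shares the same head):

```html
<!-- In the <head> of the core page (e.g. login.php) — this is served
     for every generated variant (login.php?visitor=123, etc.),
     so each one automatically carries the noindex -->
<meta name="robots" content="noindex">
```

Then, once the pages have dropped out of the index, a robots.txt rule along these lines keeps the bots out (the paths here are placeholders for your actual sign-in/register URLs):

```
User-agent: *
Disallow: /login.php
Disallow: /register.php
```

Note the order matters: if you block the pages in robots.txt before they are deindexed, Google can't crawl them to see the noindex tag.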
-
Hi Dan,
I mean both. The canonical tag will help with duplicate content issues and the parameter handling will help with indexing.
Setting up the canonical tag shouldn't be an issue. If the same page content is being displayed and the only difference is that the URL has some parameters in it, then the canonical tag should naturally be included with the rest of the page's code. Since the canonical tag doesn't change, it should work perfectly.
For example, if you have a page, login.php, and that page always has a parameter, ?visitor=### (where ### is a random number), then you simply put the canonical tag, pointing at login.php, in the head of the login.php page. That canonical tag will always be in the login.php page no matter whether the URL is login.php?visitor=123 or login.php?visitor=56, etc. It will always tell the search engines that the original page is login.php.
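Concretely, using that hypothetical login.php example (the domain here is just a placeholder for your own), the head of the page would contain something like:

```html
<head>
  <!-- Always points at the clean URL, no matter which
       ?visitor=### variant was actually requested -->
  <link rel="canonical" href="https://www.example.com/login.php" />
</head>
```

Because the tag is hard-coded into login.php itself, every generated variant carries the same canonical automatically.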
-
Thanks Clever PHD
So is there a way of setting a general rule to apply noindex to all of these duplicates? Or do you mean applying it to the main sign-in/login pages, so that it carries over to all the new, session-specific duplicate versions when they are generated?
Cheers
Dan
-
Hi Kurt
Do you mean both, or one or the other?
Isn't setting up canonical tags on all the possible dynamically generated login, sign-up, and registration type pages impossible? Or can you set up some sort of rule that applies to those unpredictable pages (since we don't know what they are until they are generated by a user session, etc.)?
Cheers
Dan
-
You can also noindex those pages to simply take them out of the index and then later nofollow links to them.
-
You can use the parameter handling and set up canonical tags on the pages.