Webmaster > HTML Improvements > Duplicate title tags
-
Hi,
Webmaster > HTML Improvements > Duplicate title tags shows as in the screenshot. How can I set this right when it is a single page that is being reported as a duplicate?
Regards
-
-
First - I would noindex them with a robots meta noindex tag - and not use the robots.txt disallow. The whole point is to keep them out of the index. A robots.txt disallow prevents crawling but does not remove pages from the index - and if Google can't crawl a page, it can't see the noindex tag on it either. So noindex the archives and remove the robots.txt disallow.
Then - just wait. WMT data can take months to catch up. I would not worry about the data in WMT so much though if you know you've got the right settings.
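For reference, the meta noindex tag goes in the <head> of each archive page. The "noindex, follow" pairing shown here is the conventional choice (it drops the page from the index while still letting crawlers follow its links), not something specific to this site:

```html
<meta name="robots" content="noindex, follow">
```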
-Dan
-
Kindly follow the format I mentioned and check the results over the next 3 days.
-
Disallow: /archive/ does not work for URLs like /archive?page=333
I suggest blocking it as Christopher described above, or like this:
Disallow: /archive?*
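As a quick check of the prefix behaviour, here is a sketch using Python's standard urllib.robotparser. Note that robotparser only does plain prefix matching and does not understand Googlebot-style * wildcards, so it can demonstrate the first two rules but not the wildcard one; the example.com URLs are placeholders:

```python
import urllib.robotparser

def blocked(disallow_rule, path):
    """Return True if the given Disallow rule blocks the path (prefix matching)."""
    rp = urllib.robotparser.RobotFileParser()
    rp.parse(["User-agent: *", f"Disallow: {disallow_rule}"])
    return not rp.can_fetch("*", f"http://example.com{path}")

# "Disallow: /archive/" is a prefix match, so it misses /archive?page=333
print(blocked("/archive/", "/archive?page=333"))  # False - not blocked
# "Disallow: /archive" (no trailing slash) catches it, query string and all
print(blocked("/archive", "/archive?page=333"))   # True - blocked
print(blocked("/archive", "/archive/2012/05"))    # True - blocked
```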
-
The screenshot is from Webmaster Tools.
I do have a robots.txt with
Disallow: /archive/ but it still shows as duplicates in Webmaster Tools.
-
I simply suggest you use the robots.txt file.
Create a robots.txt file like this:
User-agent: *
Disallow: /archive
Upload it to the root folder of the site. The other way to deal with duplicates is to change the title of each duplicated page. For more details you can read this detailed article: [http://www.seomoz.org/learn-seo/title-tag](http://www.seomoz.org/learn-seo/title-tag)
Related Questions
-
Duplicates - How to know if trailing slashes are creating duplicate pages?
Hi, How do you determine whether trailing slashes are creating duplicate pages? Search Console is showing both /about and /about/, for example, but how do I know whether this is a problem? Thanks James
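One common way to take the decision out of Google's hands is to 301 one form to the other. Here is a sketch for Apache with mod_rewrite, assuming the trailing-slash version is preferred (the rule and the preference are illustrative assumptions, not a recommendation for this specific site):

```apache
# .htaccess sketch: 301 any request that is not an existing file
# and lacks a trailing slash over to the slashed form
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*[^/])$ /$1/ [R=301,L]
```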
Technical SEO | | CamperConnect140 -
Are duplicate page titles fixed by the canonical tag
Google Webmaster Tools is saying that some of my pages have duplicate page titles because of pagination. However, I have implemented the canonical tag on the paginated pages, which I thought would keep my site from being penalized for duplicate page titles. Is this correct? Or does the canonical tag only relate to duplicate content issues?
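For reference, a canonical link element on a paginated page looks like this; the URL is illustrative, and where it should point for pagination is exactly the judgment call being asked about:

```html
<link rel="canonical" href="http://www.example.com/widgets/">
```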
Technical SEO | | Santaur0 -
Webmaster tools
Hello, My sites are showing odd "links to your site" data in WMT. It's not showing any links to the homepages, and shows reduced links for other pages. Anyone else seeing this? Penguin refresh, maybe?
Technical SEO | | jwdl0 -
Duplicate Page Title for multilingual wordpress site
Hello all, I have received my first crawl reports and I find a lot of duplicate page title errors. On the WordPress site I use the qtranslate plugin in order to have the site in 2 languages. I also use the Yoast SEO plugin in order to set titles, descriptions, and keywords for each web page. By looking deeply into the duplicate page title errors, I think the problem is that every web page takes the same SEO title for each language. But I am not 100% sure. I tried to use some shortcodes of the qtranslate plugin like the following ABOUT [:en]About in order to give different titles per language for one web page, but that doesn't seem to work. Has anybody here experienced the same problem? Do you have any suggestions about how to resolve the duplicate page title problem? I can give you the URL of the website if you need it to have a look. Thank you in advance for your help. I really appreciate it. Regards, Lenia
Technical SEO | | tevag0 -
Title Tags & Url Structure
So I'm working on a website for a client in the tourism industry. We've got a comprehensive list of museums and other attractions in a number of cities that have to go online, and we have to come up with the correct URL structure, title tags, and obviously content. My current line of thought was to work the URLs in the following way: http://domain.com/type-of-attraction/city/name-of-attraction/ This is mainly because we think that the type of attraction is far more important than the city (SEO-wise), as the country as a whole receives more searches; however, we require a city in the URL to make it unique, because some attractions across cities happen to share names and we don't want the names of attractions littered with city names. However, for title tags I wanted to go the other way around, again due to the attraction type being more important than the city: Name of Attraction - Type of Attraction - City - Brand Name or Name of Attraction - Type of Attraction in City - Brand Name I am quite confident in working it this way; however, I would appreciate some feedback on this structure - whether you think it's good or would make any suggestions or alterations. One last thing: there's the possibility of many URLs ending up with the same city names (one for each type of attraction). I would think that just providing a list of links and duplicate text is not enough; would you suggest a canonical pointing to a page containing just information on the city, and using the other pages for user navigation only? Or should I set variables in the text which are replaced by the types of attraction so that the text looks different for each one?
Technical SEO | | jonmifsud0 -
I have a ton of "duplicated content", "duplicated titles" in my website, solutions?
Hi and thanks in advance. I have a Jomsocial site with 1000 users. It is highly customized, and as a result of the customization some of the pages have 5 or more different types of URLs pointing to the same page. Google has indexed 16,000 links already and the crawling report shows a lot of duplicated content. These links are important for some of the functionality; they are dynamically created and will continue growing. My developers offered to create rules in the robots file so a big part of these links don't get indexed, but a Google Webmaster Tools post says the following: "Google no longer recommends blocking crawler access to duplicate content on your website, whether with a robots.txt file or other methods. If search engines can't crawl pages with duplicate content, they can't automatically detect that these URLs point to the same content and will therefore effectively have to treat them as separate, unique pages. A better solution is to allow search engines to crawl these URLs, but mark them as duplicates by using the rel="canonical" link element, the URL parameter handling tool, or 301 redirects. In cases where duplicate content leads to us crawling too much of your website, you can also adjust the crawl rate setting in Webmaster Tools." Here is an example of the links: http://anxietysocialnet.com/profile/edit-profile/salocharly http://anxietysocialnet.com/salocharly/profile http://anxietysocialnet.com/profile/preferences/salocharly http://anxietysocialnet.com/profile/salocharly http://anxietysocialnet.com/profile/privacy/salocharly http://anxietysocialnet.com/profile/edit-details/salocharly http://anxietysocialnet.com/profile/change-profile-picture/salocharly So the questions are: is this really that bad? What are my options? Is it really a good solution to set rules in robots.txt so big chunks of the site don't get indexed? Is there any other way I can resolve this? Thanks again! Salo
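To illustrate what the rel="canonical" route implies, here is a minimal sketch that collapses the listed profile URL variants onto one canonical URL. The chosen canonical form and the regex are assumptions for illustration, not Jomsocial's actual routing:

```python
import re

# Hypothetical: collapse every profile-related URL variant onto one canonical URL.
PROFILE_VARIANT = re.compile(
    r"^https?://anxietysocialnet\.com/"
    r"(?:profile/(?:edit-profile|preferences|privacy|edit-details|"
    r"change-profile-picture)/(?P<user>[\w-]+)"
    r"|profile/(?P<user2>[\w-]+)"
    r"|(?P<user3>[\w-]+)/profile)$"
)

def canonical_url(url):
    """Map any profile URL variant to a single canonical form, else return it unchanged."""
    m = PROFILE_VARIANT.match(url)
    if not m:
        return url
    user = m.group("user") or m.group("user2") or m.group("user3")
    return f"http://anxietysocialnet.com/profile/{user}"

print(canonical_url("http://anxietysocialnet.com/salocharly/profile"))
# http://anxietysocialnet.com/profile/salocharly
```

Each page variant would then emit `<link rel="canonical" href="...">` with the mapped URL, so Google can consolidate the duplicates without any robots.txt blocking.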
Technical SEO | | Salocharly0 -
Duplicate Content Question
Just signed up for Pro and did my first diagnostic check - it came back with something like 300 duplicate content errors, which surprised me because every page is unique. Turns out my pages are listed as both www.sportstvjobs.com and just sportstvjobs.com. Does that really count as duplicate? And if so, does anyone know what I should be doing differently? I thought it was just a canonical issue, but best I can tell I have the canonical in there, and this still came up as a duplicate error... maybe I did the canonical wrong, or it's some other issue? Thanks Brian Clapp
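If the www version is the preferred one, a common companion to the canonical tag is a site-wide 301 from the bare domain. A sketch for Apache with mod_rewrite follows; the host and the www preference are assumptions about this site's setup:

```apache
# .htaccess sketch: 301 the bare domain to the www host
RewriteEngine On
RewriteCond %{HTTP_HOST} ^sportstvjobs\.com$ [NC]
RewriteRule ^(.*)$ http://www.sportstvjobs.com/$1 [R=301,L]
```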
Technical SEO | | sportstvjobs0 -
Duplicate content and tags
Hi, I have a blog on Posterous that I'm trying to rank. SEOmoz tells me that I have duplicate content pretty much everywhere (4 articles written, 6 errors at the last crawl). The problem is that I tag my posts, and apparently SEOmoz flags them as duplicate content only because I don't have many posts yet, so the tag pages end up being very similar to each other. What can I do in this situation?
Technical SEO | | ngw0