Duplicate Page Title
-
Virtually all of my pages are coming up with a "Duplicate Page Title" error even though the page titles are different. I assume this is down to the end of each page title having the company name. Is this the reason, and is it a problem to have a page title like the one below?
"Page title description - Company Name"
-
Hi Justin
I have now resolved the duplicate page title issue as mentioned.
However, I am still getting lots of "duplicate page content" errors on all my WordPress tag pages. Any ideas why, and how to resolve this?
Thanks
Pete
-
Facepalm. Such a good call. Can't believe I didn't think of that.
-
It isn't possible to change a campaign from a root domain to a subdomain; the only way of doing this is to create a new campaign, although you will then start from scratch on historical data such as rankings.
I would strongly recommend you keep the campaign set up as is and use 301s to resolve the duplicate issues; if you don't, your rankings are likely to be hurt by duplicate content.
I'm sure you are aware already, but if not, here is the info for setting up a 301 redirect.
For Apache:
http://www.isitebuild.com/301-redirect.htm
For Windows:
http://thatsit.com.au/seo/tutorials/how-to-fix-canonical-domain-name-issues
As I say, you really should fix this issue as opposed to working around it.
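To give a rough idea of the Apache route, here is a minimal .htaccess sketch (domain.com is just a placeholder for your own domain, and it assumes mod_rewrite is available on your server):

RewriteEngine On
# Redirect any request for the bare domain to the www version, keeping the path
RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]

Once it's in place, requesting domain.com/any-page should return a 301 and land on http://www.domain.com/any-page.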
-
Thanks for the reply. You're right. I can't see the option in my settings to change from a root domain to a subdomain. Any ideas?
-
I assume here that you are referring to a "Duplicate Page Title" error showing up in the SEOmoz web app?
The most likely issue is that you have set your campaign up as a root domain (i.e. with no www prefix). Root domains are fine; however, if the links on your site point to both www.domain.com and domain.com, the same pages will show up as duplicates.
This behaviour is by design in SEOmoz, as Google will also be seeing your pages as duplicates, which will most likely harm your SERPs.
The best solution to this problem is to 301 or rel=canonical to the preferred domain.
i.e. if a user types in domain.com, you 301 them to www.domain.com.
Make sure you do this for all domains/subdomains you have pointing to your site, and add the trailing slash to the domain (i.e. http://www.domain.com/), as this gets added anyway.
This will stop the pages reporting as duplicates, and Google will only see one version of your pages, which should give you a ranking bonus.
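For example (a sketch only — www.domain.com/ stands in for whatever your preferred version is), the rel=canonical option is just a single tag in the head of each duplicate URL pointing at the version you want indexed:

<link rel="canonical" href="http://www.domain.com/" />

The 301 does the same job at the server level and is generally the stronger option, since visitors and crawlers are actually moved to the preferred URL.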
Hope that helps
Justin
-
It depends on the tool you are using, but I would expect "duplicate page title" to mean an exact match on the page title...
-
Where are you seeing this error appear?
And no, if all your page titles are simply being appended with your company name, it's unlikely that this alone would flag them as duplicate page titles.
If your website is dynamic, it's possible that your page titles are being rewritten but something is delaying the rewrite, so any bot/crawler is detecting them as duplicates... Possibly.
-
Thanks, but I'm still unsure.
Does "Duplicate Page Title" mean the whole page title is used on another page or can it be just certain words are repeated as I am getting lots of errors even thought page title is different?
-
If the words are different, it is highly unlikely that this is the issue. It might be possible if the titles were, for example, "Sugar - Company Name", "Sugars - Company Name" and "Sugary - Company Name".
But even then it would be a stretch. Are you sure that your website platform hasn't dynamically created mirror pages? This happened to us in Magento recently. The developer didn't know any better and we had to set him straight.