How Can I Prevent Duplicate Page Title Errors?
-
I am working on a website that has two different sections, one for consumers and one for businesses. The products and product pages are essentially the same, but, of course, the pricing and quantities may differ. We just have different paths based on the kind of customer.
We also get content feeds from the manufacturers, so it's difficult to change the content.
We want Google to index both sections of the site but we don't want to get hammered for duplicate page titles and content.
Any suggestions?
Thanks!
-
Thanks for your reply Nakul.
I actually want Google to index both - one for each audience. We had "nofollow" on one of the sections until yesterday. But now that it's been removed, I'm concerned about the duplicate content and titles.
-
Do you have one version of the pages that you'd prefer your "Natural Search" / SEO visitors to see? Let's say you have the same page in two versions, v1 and v2, and you want v1 to be the one seen by users coming from Google. In v2, add a canonical tag pointing to v1. This way Google will crawl both sets of pages but only show v1 in the SERPs. Does that work?
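As a rough sketch (the /consumer/ and /business/ paths and domain here are hypothetical placeholders, not the poster's actual URLs), the business version of a product page would carry a canonical tag in its head pointing at the consumer version:

```html
<!-- Placed in the <head> of the page you do NOT want shown in the SERPs (v2) -->
<!-- The domain and paths below are made-up examples; substitute your own URLs -->
<link rel="canonical" href="https://www.example.com/consumer/widget-pro" />
```

Google treats this as a strong hint (not a directive) about which URL to consolidate ranking signals to, so both pages can stay crawlable and usable by visitors while only the canonical one is surfaced in search results.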
Related Questions
-
Sizable decrease in amount of pages indexed, however no drop in clicks, impressions, or ranking.
Hi everyone, I've run into a worrying phenomenon in GSC and I'm wondering if anyone has come across something similar. Since August, I have seen a steady decline in the number of pages indexed from my site, from 1.3 million down to about 800,000 in two months. Interestingly, my clicks/impressions continue to increase gradually (at the same pace they have for months) and I see no other negative side effects resulting from this drop in coverage. In total I have 1.2 million URLs that fall into one of three categories: "Crawled - currently not indexed", "Crawl anomaly", and "Discovered - currently not indexed". Some other notes - all of my valid, error, and excluded pages are https://www. , so I don't believe there is an issue with different versions of the same site being submitted. Also, my rankings have not changed, so I tentatively believe this is unrelated to the Medic Update. If anyone else has experienced this or has any insight into the problem, I would love to know. Thanks!
Algorithm Updates | Jason-Reid -
How often should I update the content on my pages?
My rankings have started dropping - due to lack of time after having a baby. I'm still managing to blog, but I'm wondering: if I update the content on my pages, will that help? All my meta tags and page descriptions were updated over a year ago - do I need to update these too? We ranked in the top spots for a good few years, but we're slowly falling 😞 Please give me any advice to keep us from falling even further. I have claimed all my listings and try to add new links once a month. I share my blog to all social sites and work hard to get Google reviews; we have 53, which is higher than any of our competitors. Any other ideas? Have I missed something that Google is looking for nowadays? Many thanks 🙂
Algorithm Updates | Lauren1689 -
Who else is noticing a shift in deeper pages ranking?
Without mentioning names, we're noticing a shift in many of our clients' ranking pages. Previously many of them held page 1 positions with their home page. We've been building brand-only anchor text to these pages for some time now, and there's a noticeable change in visibility for the domain as a whole displayed in GWT, plus an uplift in organic traffic. It just happens that some of our clients already had pages in the root directory that were very optimised for their head terms, but all of a sudden these sub-pages, with very few inbound links, have started ranking in place of the home pages. I've attached a screenshot of the landing page organic traffic. The pages in question have been there for at least 8-10 months. These inner pages would not normally have been able to hold their ground in this position, and I'm concerned that this is a temporary change. I can see this going one of two ways: (i) the home page begins to outrank the sub-page as before, or (ii) the sub-page loses ranking ability and the home page rank does not come back. My questions to the community are therefore: Has anyone else noticed this shift in ranking behaviour? What are everyone's thoughts - will it remain this way? From this query I can ask another, wider question. Good advice across the internet says we should be building strong brand links and citations to our clients' domains. Typically brand links go to the homepage, which should provide the homepage (and, to a lesser extent, the domain) with a ranking/traffic/visibility uplift. However, as I'm noticing other pages now picking up ranking boosts as a result of this: Should we still be trying to gain links to these more commercial landing pages? How are others building high-quality links to pages full of commercial copy? I hope this can spark a little bit of a debate. I look forward to hearing everyone's thoughts. Thanks
Algorithm Updates | tomcraig86 -
On-page Optimization
Hi, I have two campaigns and neither has any statistics for on-page optimization. Am I doing something wrong, or how do I make these stats appear? I would like to improve my website. Thank you in advance for any pointers or shared experience you may give me!
Algorithm Updates | Pixeltistic -
Changing the # of results per page in Google search settings displays totally different results. Why is this?
Curious what's going on here - this is the first time I've seen it. What's happening is this: in Google, I search for "mobile apps orange county" and get a standard list of 10 results. I go to Google's search settings in the top right corner of the page (the button is grey with a gear) and change the number of results per page from 10 to 50 (I also tried 100). When I go back to Google and search again for "mobile apps orange county", I get a much larger list but with completely different results. This time the top 10-12 are dominated by the same website (ocregister.com). What's going on here that Google would now show different results? Why is this one website all of a sudden dominating the first 12 results? Thanks everyone! ByteLaunch
Algorithm Updates | ByteLaunch -
Google Page Rank?
We have had a quality website for 12 years now, and it seems no matter how many more links we get and how much new content we add daily, we have stayed at PR3 for the past 10 years or so. Our SEOMoz domain authority is 52. We have over 950,000 pages linking to us from 829 unique root domains. Is this in line with PR3 or should we be approaching PR4 soon? We do daily blog posts with all unique, fresh quality content that has not been published elsewhere. We try to do everything with 'white hat' methods, and we are constantly trying to provide genuine content and high quality products, and customer service. How can we improve our PR and how important is PR today?
Algorithm Updates | applesofgold -
Any ideas why our category pages got de-indexed?
Hi all, I work for evenues, a directory website that provides listings of meeting rooms and event spaces. Things seemed to be chugging along nicely with our link building effort (mostly through guest blogging using a variety of anchor text). Then I woke up on Monday morning to find that our city pages have been de-indexed. This page: http://www.evenues.com/Meeting-Spaces/Seattle/Washington used to be at the top of page #2 in the SERPs for the keyword "Meeting Rooms in Seattle". I doubt that we got de-indexed because of our link building efforts, as it was only a few blog posts and links from profile pages on community websites. My guess is that since we did a recent 2.0 release of the site, there are now several "filter" or subcategory pages with latitude and longitude parameters in the URL, plus different page titles based on the categories, like: "Meeting Rooms and Event Spaces in Seattle" (main page), "Meeting Rooms in Seattle", "Classroom Venues in Seattle", "Party Venues in Seattle". There was a bit of pushback when I suggested that we do a rel="canonical" on these babies, because ideally we'd like to rank for all 4 queries (Meeting Rooms, Party Venues, Classrooms, in City). These are new changes, and I have a sneaking suspicion this is why we got de-indexed. We're presenting generally the same content. Thoughts?
Algorithm Updates | eVenuesSEO -
Gifts.com - Multiple domain pages in SERPs
One of our big natural search competitors for gift keywords is Gifts.com. We are competing for many keywords like "teen gifts", "gifts for him", and "gifts for her". For many of these, the Google SERP has multiple Gifts.com pages on the first page. I have never seen more than one of our pages (uncommongoods.com) on a SERP page. Any clue how or why Gifts.com has multiple pages in search results? Thanks!
Algorithm Updates | znotes