Title tag solution for a mid-sized site
-
It's the same old story; we all know it well. I have a client with a 20k+ page site (not too big) and traffic of around 450k visits/month.
Now we've identified 15 pages with various conversion points, great backlink metrics, etc. that we're going to explicitly target in the first round of recommendations. However, we're also looking at about 18,000 duplicate title tags that I'd like to clean up.
The site is not on a CMS, and in the past I've had the dev team write a script to adopt the H1 tag or the name of the page as the title tag. That can cause problems when some of the pages being found in long-tail search lose their positions. I'm more hesitant than ever to make this move with this client because they get a ton of long-tail traffic spread over a ton of original content they wrote.
How does everyone else usually handle this? Thoughts?
Thanks in advance Mozzers!
-
How about having your developers script something that scrapes the H1, H2, and H3 from each of the 18,000 articles and stores them in a database? Finding dupes would then be a piece of cake, even for a less experienced developer. You could easily export all your duplicates to CSV and then manually rename them based on their content.
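To make that concrete, here's a minimal sketch of the scrape-and-dedupe step in Python (standard library only). The table name, URLs, and HTML snippets are all hypothetical stand-ins for the crawled pages; this is a sketch of the approach, not the client's actual setup:

```python
import sqlite3
from html.parser import HTMLParser

class HeadingParser(HTMLParser):
    """Collects the text of any h1/h2/h3 tags on a page."""
    def __init__(self):
        super().__init__()
        self._in_heading = False
        self.headings = []

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3"):
            self._in_heading = True

    def handle_endtag(self, tag):
        if tag in ("h1", "h2", "h3"):
            self._in_heading = False

    def handle_data(self, data):
        if self._in_heading and data.strip():
            self.headings.append(data.strip())

def store_page(db, url, html):
    """Parse one page and record its first heading as the candidate title."""
    parser = HeadingParser()
    parser.feed(html)
    candidate = parser.headings[0] if parser.headings else ""
    db.execute("INSERT INTO pages (url, candidate_title) VALUES (?, ?)",
               (url, candidate))

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE pages (url TEXT PRIMARY KEY, candidate_title TEXT)")

# Hypothetical crawled pages -- two share the same H1.
store_page(db, "/routers/x100", "<h1>X100 Router Specs</h1><p>...</p>")
store_page(db, "/routers/x100-manual", "<h1>X100 Router Specs</h1>")
store_page(db, "/switches/s20", "<h1>S20 Switch Specs</h1>")

# Duplicates fall straight out of a GROUP BY.
dupes = db.execute(
    "SELECT candidate_title, COUNT(*) FROM pages "
    "GROUP BY candidate_title HAVING COUNT(*) > 1"
).fetchall()
print(dupes)  # → [('X100 Router Specs', 2)]
```

From there, dumping the duplicate rows to CSV for manual review is one `csv.writer` call.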
Dev time: about 1 day max. (I've developed a lot of software myself, and IMHO a good developer should get this up and running within 4 hours.)
If you don't have too many duplicate tags, correcting the ones in question shouldn't take too long either.
Once you've done your chores, you can reimport your corrected title tags into the database. In the meantime, your developer can write a script that sets each page's title tag according to the one stored in the database.
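That serving-side lookup is tiny. A sketch, again assuming a table of hand-corrected titles (the table, URLs, and fallback string are made up for illustration):

```python
import sqlite3

def title_for(db, url, fallback="Acme Hardware"):
    """Return the hand-corrected title for a URL, or a site-wide fallback."""
    row = db.execute(
        "SELECT corrected_title FROM titles WHERE url = ?", (url,)
    ).fetchone()
    return row[0] if row else fallback

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE titles (url TEXT PRIMARY KEY, corrected_title TEXT)")
db.execute("INSERT INTO titles VALUES ('/routers/x100', 'X100 Router Specs | Acme')")

print(title_for(db, "/routers/x100"))  # → X100 Router Specs | Acme
print(title_for(db, "/routers/x200"))  # → Acme Hardware
```

The fallback matters: any page you haven't corrected yet keeps a sane default instead of an empty title.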
Hope that helped! If you have further questions on this, just go ahead. I had a similar problem with 25k+ pages for a major health insurer, and we found that the best way to prevent problems was to do most of the work manually rather than with a script. That helped us a lot to stay within the budget and the given timeframe.
-
This is sound advice. Test out a percentage of pages before rolling out the change site-wide.
I also agree that 18K duplicate titles isn't helping the site.
One thing I would do is review analytics, define the top X% of pages, and hand-optimize those. The balance can be optimized via rules using the system you outlined. As to whether to use the H1, the file name, or some other element, I'd lean towards the H1, as it would likely describe the content accurately and not be truncated or contain stop words.
-
-
Can you implement it on one section, or on a percentage of your pages, first? Then you can test the effect without risking your whole catalogue.
-
Ryan - excellent points! Adding a CMS to this site would bring real benefits, at the very least by providing some grounds for moving forward on a unified platform.
-
The 18k is a hard piece of the puzzle to wrap your mind around...I'd like to give more details there but can't, currently. Hopefully when this campaign starts to show results they'll let me write a case study for it...I'll be sure to share.
There is a "templating system" for various sections. However, as I mentioned, many developers have had their hands in it and didn't follow a standardized system.
I am considering EGOL's comment.
Thanks!
-
Is there any form of standardization? I can't imagine 18k pages that were each independently developed.
There should be a templating system or some logic which controls code common to all pages. Most pages should share the same header, footer and sidebar, along with standards for things like a canonicalization tag, title and meta description.
If that is not the case, then EGOL's comment should be considered. It is not reasonable to maintain a site which lacks standards.
-
It's possible that putting a reasonably intelligent human on the job for a couple of months could pay back big time. I'll bet a good title tag job would pull in thousands of dollars worth of sales every month.
-
Hey Ryan,
Thanks for the response!
There were 18k duplicate title tags, but the top content that I can tell is being found in search is about 1,500 pages. It's not a forum site or a site with UGC. It's a very successful tech hardware company that has put out a lot of great unique content over time.
Determining the logic is the tough part because there isn't a lot of consistency throughout the site; different developers have had their hands in it over time.
-
What kind of site is it?
With 18k+ pages I will take a guess that it is a forum site. Definitely check with your forum software provider. There should be some form of "page container" which is used as a template for all the site's pages. If you can determine the logic you want to use, such as go with the post title or H1 tag, then you can modify the template according to your logic and take care of your entire site quickly and easily.
-
Thanks for the response! I should rephrase my question...
I'm either looking for tricks/tips others use in this situation or messages like yours that will give me the confidence to go for it haha.
I think we've all experienced the fear of doing what we know is technically correct and risking being at the mercy of the algo. I've gone this route a lot in the past, but I've never done it on a site that gets search traffic so deep into so many pages.
Have you ever gone the script route? If so, what did you have it pull to use as the title tag? As I mentioned above, I've usually used H1s in the past.
-
If a lot of traffic is coming in through 18,000 pages that have duplicate title tags I am willing to bet that there will be a huge increase in the amount of traffic that those pages pull when unique and relevant title tags are put in place.
So, although there is a small chance that traffic will go down, I think there is a much higher chance that traffic will immediately shoot up spectacularly, and the quality of that traffic might also improve.
I would archive the site, run a script to replace the title tags, and see what happens. You can always put the old title tags back if this doesn't work, but I bet it works great.