Title tag solution for a medium-sized site
-
It's the same old story; we all know it well. I have a client whose site has 20k+ pages (not too big) and traffic of around 450k/month.
We have identified 15 pages with various conversion points, strong backlink metrics, etc. that we are going to explicitly target in the first round of recommendations. However, we are also looking at about 18,000 duplicate title tags that I'd like to clean up.
The site is not on a CMS, and in the past I've had the dev team write a script to adopt the h1 tag or the name of the page as the title tag. This can cause problems when pages that were being found in long-tail search lose their positions. I'm more hesitant than ever to make this move with this client because they get a ton of long-tail traffic spread over a ton of original content they wrote.
How does everyone else usually handle this? Thoughts?
Thanks in advance, Mozzers!
-
How about having your developers script something that scrapes the h1, h2, and h3 for each of the 18,000 articles and stores them in a database? Finding dupes would then be a piece of cake, even for a less experienced developer. You could easily export all your duplicates to CSV and then manually rename them based on their content.
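To make that concrete, here is a minimal sketch of what such a script could look like, assuming Python with requests and BeautifulSoup; the urls.txt file, database name, and table layout are all made up for illustration:

```python
# A rough sketch, not production code: crawl a list of known URLs,
# pull the title and first h1/h2/h3 from each page, and store them in
# SQLite so duplicates can be found with one query. The urls.txt file,
# database name, and table layout are assumptions.
import csv
import sqlite3

import requests
from bs4 import BeautifulSoup

def first_text(soup, tag):
    """Text of the first matching tag, or an empty string."""
    el = soup.find(tag)
    return el.get_text(strip=True) if el else ""

conn = sqlite3.connect("titles.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS pages "
    "(url TEXT PRIMARY KEY, title TEXT, h1 TEXT, h2 TEXT, h3 TEXT)"
)

with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else ""
    conn.execute(
        "INSERT OR REPLACE INTO pages VALUES (?, ?, ?, ?, ?)",
        (url, title, first_text(soup, "h1"),
         first_text(soup, "h2"), first_text(soup, "h3")),
    )
conn.commit()

# Export every title that appears on more than one URL to a CSV
# for manual renaming.
dupes = conn.execute(
    "SELECT title, url FROM pages WHERE title IN "
    "(SELECT title FROM pages GROUP BY title HAVING COUNT(*) > 1) "
    "ORDER BY title"
)
with open("duplicates.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["title", "url"])
    writer.writerows(dupes)
```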
Dev time: about one day max. (I've developed a lot of software myself, and IMHO a good developer should get this up and running within four hours.)
If you don't have too many duplicate tags, correcting those in question shouldn't take too long either.
Once you have done your chores, you can re-import your corrected title tags into the database. In the meantime, your developer could write a script that sets the title tag of each page according to the title tag stored in your database.
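The serving side could be as simple as a lookup with a fallback; again just a sketch, with the table and column names matching the scraper above (both are assumptions):

```python
# Sketch of the other half: at render time, look up the corrected
# title for a page and fall back to whatever is there now. The table
# and column names match the scraper sketch above and are assumptions.
import sqlite3

conn = sqlite3.connect("titles.db")

def title_for(url: str, fallback: str) -> str:
    """Return the hand-corrected title for a URL, or the fallback."""
    row = conn.execute(
        "SELECT title FROM pages WHERE url = ?", (url,)
    ).fetchone()
    return row[0] if row and row[0] else fallback
```

Each page template would then call something like title_for(current_url, existing_title) when rendering the title element.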
Hope that helped! If you have further questions on this, just go ahead. I had a similar problem with 25k+ pages for a major health insurer, and we figured out that the best way to prevent problems was to do most of the work manually rather than with a script. That helped us a lot to stay within the budget and the given timeframe.
-
This is sound advice. Test out a percentage of pages before rolling out the change site-wide.
I also agree that 18K duplicate titles aren't helping the site.
One thing I would do is review analytics, define the top X% of pages, and hand-optimize those. The balance can be optimized via rules using the system you outlined. As to whether to use the H1, the file name, or some other element, I'd lean towards the H1, as it would likely describe the content accurately and not be truncated or contain stop words.
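If it helps, here's a rough sketch of what that rule could look like; the function name is hypothetical, and the 60-character cap is just a common rule of thumb, not a hard limit:

```python
# Hypothetical rule for the non-hand-optimized pages: prefer the H1,
# fall back to a cleaned-up file name, and cap the length.
import re

MAX_LEN = 60  # rough ceiling before SERPs truncate titles; an assumption

def rule_based_title(h1: str, filename: str) -> str:
    """Prefer the H1; fall back to a cleaned-up file name."""
    title = h1.strip()
    if not title:
        # e.g. "blue-widget-specs.html" -> "Blue Widget Specs"
        stem = re.sub(r"\.\w+$", "", filename)
        title = stem.replace("-", " ").replace("_", " ").title()
    return title[:MAX_LEN].rstrip()

# Example: rule_based_title("", "blue-widget-specs.html")
# returns "Blue Widget Specs"
```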
-
Can you implement on a section or on a percentage of your pages first? Then you can test the effect without risking your whole catalogue.
-
Ryan - excellent points! The benefits of adding a CMS to this site would be significant, at the very least for providing some grounds for moving forward on a unified platform.
-
The 18k is a hard piece of the puzzle to wrap your mind around... I'd like to give more details there but can't, currently. Hopefully, when this campaign starts to show results, they will let me write a case study on it... I'll be sure to share.
There is a "templating system" for various sections. However, as I mentioned, many developers have had their hands in it and didn't follow a standardized system.
I am considering EGOL's comment.
Thanks!
-
Is there any form of standardization? I can't imagine 18k pages which were independently developed.
There should be a templating system or some logic which controls code common to all pages. Most pages should share the same header, footer and sidebar, along with standards for things like a canonicalization tag, title and meta description.
If that is not the case, then EGOL's comment should be considered. It is not reasonable to maintain a site which lacks standards.
-
It's possible that putting a reasonably intelligent human on the job for a couple of months could pay back big time. I'll bet a good title tag job would pull in thousands of dollars' worth of sales every month.
-
Hey Ryan,
Thanks for the response!
There were 18k duplicate title tags, but the top content that I can tell is being found in search is about 1,500 pages. It's not a forum site or a site with UGC. It's a very successful tech hardware company that has put out a lot of great unique content over time.
Determining the logic is the tough part because there isn't a lot of consistency throughout the site...different developers have had their hands in it over time.
-
What kind of site is it?
With 18k+ pages I will take a guess that it is a forum site. Definitely check with your forum software provider. There should be some form of "page container" which is used as a template for all the site's pages. If you can determine the logic you want to use, such as go with the post title or H1 tag, then you can modify the template according to your logic and take care of your entire site quickly and easily.
-
Thanks for the response! I should rephrase my question...
I'm either looking for tricks/tips others use in this situation or messages like yours that will give me the confidence to go for it haha.
I think we've all experienced the fear of doing what we know is technically correct and risking being at the mercy of the algo. I've gone this route a lot in the past, but I've never done it on a site that gets traffic so deep into so many pages from search.
Have you ever gone the script route? If so, what did you have it pull to use as a title tag? As I mentioned above, I've usually used h1s in the past.
-
If a lot of traffic is coming in through 18,000 pages that have duplicate title tags I am willing to bet that there will be a huge increase in the amount of traffic that those pages pull when unique and relevant title tags are put in place.
So, although there is a small chance that traffic will go down, I think there is a much higher chance that traffic will immediately shoot up spectacularly, and the quality of that traffic might also improve.
I would archive the site, run a script to replace the title tags, and see what happens. You can always put the old title tags back if this doesn't work, but I bet it works great.