Need Some Quality Vs. Quantity SEO Advice
-
We have a gallery here with our main categories of patches. https://www.stadriemblems.com/gallery/
If you click on one, say Fire Patches, you'll be taken to a page of just fire patches. https://www.stadriemblems.com/fire-patches/
But here's the kicker: if you look at the fire patch page, you'll notice it has sub-categories of its own. So if you click on, say, Fire Rescue, you're taken one level deeper. https://www.stadriemblems.com/fire-patches/fire-rescue-patches/
I'm redoing this entire site (a project over five years overdue), and I'm wondering if it's really worth keeping these sub-pages three levels deep. I originally created them with long-tail SEO in mind, so that we'd be the only ones who come up when people search for very specific patches. But redoing all of them is a big undertaking, and I'm not sure they're really adding any value.
-
It is not at all uncommon for a category taxonomy to run three levels deep like this.
What page should I land on if I search for "Volunteer Firefighter Patches"? If it's /fire-patches/, you may have trouble targeting "Volunteer" while also targeting Ambulance, Hazmat...
I think pages specifically targeting topics like "EMS Patches" are necessary for this site, and I would not recommend removing them.
While you're rethinking the taxonomy, though, I would change /fire-patches/ to /first-responders/, because at the moment the site has EMS and Ambulance under "Fire". Really, all of these professionals are "First Responders". Then /first-responders/fire-patches/ could even continue another level deeper, such as /first-responders/fire-patches/volunteer/.
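If you do restructure, each old URL should 301-redirect to its new home so existing rankings and links carry over. A minimal .htaccess sketch, assuming Apache and the hypothetical /first-responders/ paths suggested above (the more specific rule has to come before the catch-all):

```apache
# Hypothetical 301 map for the restructure discussed above (Apache mod_rewrite).
# The /first-responders/ paths are the suggestion above, not live URLs.
RewriteEngine On
RewriteRule ^fire-patches/fire-rescue-patches/(.*)$ /first-responders/fire-patches/fire-rescue/$1 [R=301,L]
RewriteRule ^fire-patches/(.*)$ /first-responders/fire-patches/$1 [R=301,L]
```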
-
I'm leaning toward quality over quantity. Looking at those individual pages, they don't seem to be offering much in the way of unique content.
-
I have to make this decision within the week. I guess I can't ask people to advise me when I have no data to give them.
-
If it is possible to add the analytics code to those pages, I would do that and let it run for a couple of months. Then you can make the decision with full data.
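For reference, the standard Google Analytics snippet is pasted into the head of every page template; in its current gtag.js form it looks like this (G-XXXXXXXXXX below is a placeholder measurement ID, not a real one):

```html
<!-- Google Analytics (gtag.js) — G-XXXXXXXXXX is a placeholder measurement ID -->
<script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXXXXX"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'G-XXXXXXXXXX');
</script>
```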
-
Well, this has brought to my attention that somehow the CMS has neglected to add the analytics code to those pages, probably because they're in a subfolder, so this is a bigger shot in the dark than I thought.
-
Are these pages pulling in any traffic? Can you trace that traffic through the shopping cart?
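Tracing that traffic through the cart is what ecommerce tracking is for. As a hedged sketch (the product IDs, names, and values below are invented), a completed purchase would be reported to today's GA4 like this, after which purchases can be segmented by landing page:

```html
<script>
  // Illustrative GA4 ecommerce purchase event — all IDs and values are made up.
  gtag('event', 'purchase', {
    transaction_id: 'T-1001',
    value: 49.95,
    currency: 'USD',
    items: [{
      item_id: 'FIRE-RESCUE-01',   // hypothetical SKU
      item_name: 'Fire Rescue Patch',
      item_category: 'Fire Patches',
      price: 49.95,
      quantity: 1
    }]
  });
</script>
```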
Related Questions
-
Filter By Category bad for SEO?
Hello everyone! I know that a single product should not have a filter-by-color option, since it will create duplicate content, and you have to use canonical tags to solve it. BUT how about sorting through products via category/brands? Filtering by category changes the URL of the general shop page (ex: hello.com/Shop/Category1022039). This page only displays the products within; it has no content or descriptions, unlike the original category page. Each of these categories/brands already has its own individual page (ex: hello.com/Shop/A). That is the page that will be optimized for content, FAQ, ranking etc., unlike the URL created when filtering through the categories. So technically I would have two URLs for each brand/category. Would they compete with each other? What would you suggest? Please advise me on this. Thank you.
On-Page Optimization | Safxmed
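For what it's worth, the canonical-tag fix the asker mentions is a single element in the head of each filtered URL, pointing at the page meant to rank (the hello.com URLs are the asker's own hypothetical examples):

```html
<!-- In the <head> of hello.com/Shop/Category1022039, the filtered URL -->
<link rel="canonical" href="https://hello.com/Shop/A" />
```
-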
H2s vs. Meta Description
In some of my SERP results, the H2s are showing up instead of the meta description. I have read that H2s aren't really valid anymore. Can someone clarify this for me?
On-Page Optimization | dhanson24
-
Is it SEO-wise to edit an already published article?
One of the pages on the website is #7 on the first page for a highly competitive keyword. Since I would like to improve rankings and the page is not optimized (e.g. keyword density is 0), is it SEO-wise to edit the article and do proper on-page optimization? Of course, the ultimate goal is to be in the top 3 for a specific keyword.
On-Page Optimization | zorsto
-
On-page SEO Strategy / What pages to use?
What is the best page to use for targeting your hard-to-rank keywords? The keyword phrases in question here are "Acrylic Tank Manufacturing", "Custom Aquariums" & "Acrylic Aquariums". As of right now we have created 3 separate pages, one for each of these keyword phrases: http://seaquaticaquariums.com/custom-aquariums for "Custom Aquariums", http://www.seaquaticaquariums.com/custom-aquariums/acrylic-aquariums/ for "Acrylic Aquariums", and http://www.seaquaticaquariums.com/services/acrylic-tank-manufacturing/ for "Acrylic Tank Manufacturing". Or are we better off using the home page, http://www.seaquaticaquariums.com/, for our main hard-to-rank terms? Generally speaking, I would think more people will link to our home page.
On-Page Optimization | SeaQuatic
-
SEO Title Tag Best Practice?
Hi, I would like to know if appending the site name at the end of the page title (page title - Site Name/Site URL) is a good or bad practice in terms of SEO. If it is a good practice, which is better: site name or site URL? Example: 20 Things you didn't know about zombies - coed.com. Thanks in advance.
On-Page Optimization | COEDMediaGroup
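In markup, the pattern being asked about is just the document title tag; the line below reuses the question's own illustration:

```html
<!-- Page title with the site name appended (the question's own example) -->
<title>20 Things you didn't know about zombies - coed.com</title>
```
-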
Large Site - Advice on Subdomaining
I have a large news site - over 1 million pages (have already deleted 1.5 million). Google buries many of our pages, and I'm ready to try subdomaining. http://bit.ly/dczF5y

There are two types of content - news from our contributors, and press releases. We have had contracts with the big press release companies going back to 2004/5. They push releases to us by FTP, or we pull from their server. These are then processed and published. It has taken me almost 18 months, but I have found and deleted or fixed all the duplicates I can find. There are now two duplicate-checking systems in place. One runs at the time the release comes in and handles most of them. The other runs every night after midnight and finds a few, which are then handled manually; this helps fine-tune the real-time checker. Businesses often link to their release on the site because they like us. Sometimes Google likes this, sometimes not.

The news we process is reviewed by 1, 2 or 3 editors before publishing. Some of the stories are 100% unique to us. Some are from contributors who also contribute to other news sites. Our search traffic is down by 80%. This has almost destroyed us, but I don't give up easily. As I said, I've done a lot of projects to try to fix this. Not one of them has done any good, so there is something Google doesn't like and I haven't yet worked it out. A lot of people have looked and given me their ideas, and I've tried them - zero effect.

Here is an interesting and possibly important piece of information: most of our pages are "buried" by Google. If I search, even for a headline, even one that is unique to us, quite often the page containing it will not appear in the SERP. The front page may show up, an index page may show up, another strong page may show up if that headline is in the top 10 stories for the day, but the page itself may not show up at all - UNTIL I go to the end of the results and redo the search with the "duplicates" included. Then it will usually show up on the front page, often in position #2 or #3.

According to Google, there are no manual actions against us. There are also no notices in WMT that say there is a problem we haven't fixed. You may tell me to just delete all of the PRs - but those are there for business readers, as they always have been. Google supposedly wants us to build websites for readers, which we have always done. What they really mean is: build it the way we want you to, because we know best. What really peeves me is that there are other sites that they consistently rank above us, that have all the same content as us and seem to be 100% aggregators, with ads, with nothing really redeeming them as being different. So this is (I think) inconsistent and confusing, and it doesn't help me work out what to do next.

Another thing we have is about 7,000+ US military stories, all the way back to 2005. We were one of the few news sites supporting the troops when it wasn't fashionable to do so. They were emailing the stories to us directly, most with photos. We published every one of them, and we still do. I'm not going to throw them under the bus, no matter what happens. There were some duplicates, some due to screwups because we had multiple editors who didn't see that a story was already published, and also, at one time, a system code race condition - entirely my fault; I am the programmer as well as the editor-in-chief. I believe I have fixed them all with redirects.

I haven't sent in a reconsideration for 14 months, since they said "No manual spam actions found" - I don't see any point, unless you know something I don't. So, having exhausted all of the things I can think of, I'm down to my last ideas:
1. Split all of the PRs off onto subdomains (I'm ready to pull the trigger later this week).
2. Do what the other sites do, which I believe creates little value: show only a headline, a snippet and some related info, and link back to the original page on the PR provider's website. (I really don't want to do this.)
3. Give up on the PRs and delete them all, losing another 50% of the income, which means releasing our remaining staff and upsetting all of the companies and people who linked to us (or find them all and rewrite them as stories - tens of thousands of them), and also throwing all our alliances under the bus. (I really don't want to do this either.)

There is no guarantee this is the problem, but Google won't tell me, the Google forums are crap, and nobody else has given me an idea that has helped. My thought is that splitting the PRs off onto subdomains will have a number of effects:
1. Take most of the syndicated content off the main domain.
2. Shake up the Domain Authority.
3. Create a million 301 redirects.
4. Make it obvious to the crawlers what is our news and what is PRs.
5. Make it easier for Google News to understand.

Here is what I plan to do:
1. Redirect all PRs to their own subdomain: pn.domain.com for PRNewswire releases, bw.domain.com for Businesswire releases, etc.
2. Fix all references so they use the new subdomain.

Here are my questions - and I hope you may see something I haven't considered:
1. Do you have any experience of doing this?
2. What was the result?
3. Any tips?
4. Should I put PR index pages on the subdomains too? I was originally planning to keep them on the main domain, with the individual page links pointing to the actual release on the subdomain. Obviously, I want them in only one place, but there are two types of these index pages: a) all of the releases for a particular PR company - these could certainly live on the subdomain rather than the main domain; b) various category index pages - agriculture, supermarkets, mining, etc. - which would have to stay on the main domain because they are a mixture of different PR providers.
5. Is this a bad idea?

I'm almost out of ideas. Should I add a condensed list of everything I've done already? If you are still reading, thanks for hanging in.
On-Page Optimization | loopyal
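As a hedged sketch of the redirect step in the plan above (the source path patterns here are invented, since the question doesn't give the site's real URL structure), the .htaccess-style Apache rules might look like:

```apache
# Hypothetical mod_rewrite rules sending each provider's releases to its own subdomain.
RewriteEngine On
RewriteRule ^releases/prnewswire/(.*)$ https://pn.domain.com/$1 [R=301,L]
RewriteRule ^releases/businesswire/(.*)$ https://bw.domain.com/$1 [R=301,L]
```
-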
On-page SEO Analysis within a campaign
I have set up a campaign and have a number of pages which are graded F, but that is because they are being compared against the homepage rather than the internal page I have set up. Is it possible to change which page the campaign monitor checks against? Or is it driven by the highest-ranking page for that keyword? Thanks, Andy
On-Page Optimization | iprosoftware