Getting rid of taxonomies so that articles enhance related pages' value
-
Hello,
I'm developing a website for a law firm, which offers a variety of services.
The site will also feature a blog with similarly named topics. As is customary, these topics were set up as taxonomies.
But I want the articles to enhance the value of the service pages themselves. Because the taxonomy URL /category/divorce has no relationship to the actual service page URL /practice-areas/divorce, I'm worried that, if anything, a redundantly titled taxonomy URL would dilute the value of the service page it relates to.
Sure, I could show some of the related posts on the service page, but if a visitor wants to view more, they're suddenly bounced over to a taxonomy page that steals thunder from the more important service page.
So I did away with these taxonomies altogether, and posts can now be associated with pages directly via a custom database table.
Now, if I visit the blog page, instead of a list of category terms it shows a list of the service pages. If a visitor clicks on a topic, they are directed to /practice-areas/divorce/resources (the subpages are created dynamically) and the posts are shown there.
I'll have to use custom breadcrumbs to make it all work. Just wondering if you guys had any thoughts on this. Really appreciate any you might have, and thanks for reading.
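For anyone curious what the post-to-page association could look like, here is a minimal sketch using an in-memory SQLite database. All table, column, and slug names are hypothetical; the actual site presumably uses its CMS's database, but the shape of the mapping table and the lookup for a dynamic /resources subpage would be similar:

```python
import sqlite3

# Hypothetical schema: each blog post is linked directly to the service
# page it supports, replacing the category taxonomy entirely.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE pages (id INTEGER PRIMARY KEY, slug TEXT UNIQUE);
CREATE TABLE posts (id INTEGER PRIMARY KEY, title TEXT);
CREATE TABLE post_page (
    post_id INTEGER REFERENCES posts(id),
    page_id INTEGER REFERENCES pages(id),
    PRIMARY KEY (post_id, page_id)
);
""")
conn.execute("INSERT INTO pages (id, slug) VALUES (1, 'practice-areas/divorce')")
conn.executemany("INSERT INTO posts (id, title) VALUES (?, ?)",
                 [(1, "Dividing assets"), (2, "Custody basics")])
conn.executemany("INSERT INTO post_page (post_id, page_id) VALUES (?, ?)",
                 [(1, 1), (2, 1)])

def posts_for_page(slug):
    """Return post titles to render on the dynamic /<slug>/resources subpage."""
    rows = conn.execute("""
        SELECT p.title FROM posts p
        JOIN post_page pp ON pp.post_id = p.id
        JOIN pages pg ON pg.id = pp.page_id
        WHERE pg.slug = ?
        ORDER BY p.id""", (slug,)).fetchall()
    return [title for (title,) in rows]

print(posts_for_page("practice-areas/divorce"))
# ['Dividing assets', 'Custody basics']
```

The join table means one post could support several service pages if needed, without any taxonomy URL ever existing.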
-
Thank you for taking the time to respond. Makes a lot of sense, I appreciate it.
-
It is true that having pages with the same "page-name" (the last part of a URL following the final slash; e.g. the page-name of this question is "ridding-of-taxonomies-so-that-articles-enhance-related-page-s-value") which are also topically very similar can cause 'jumpy' SERPs.
Many feel that the dangers of what is termed 'keyword cannibalisation' are over-egged. That may be true, but I have certainly seen examples of it in action myself. It usually occurs most prominently when neither page strongly eclipses the other in terms of SEO authority (e.g. inbound signals like referring domains, citations across the web and general 'buzz' associated with a given URL).
If both pages are new, with little authority (or 'popularity') bound to their unique addresses, then certainly Google can get confused. You can end up with problems such as earning a decent ranking for a relevant keyword that then hops from page to page every day or week as Google's algorithm bubbles away in the background. This can make it hard to drive traffic to the correct destination.
If both pages are very specific about the keywords they are targeting, you could turn references to those keywords on the page you don't want to rank into hyperlinks pointing to the URL which you do want to rank! (Sorry, that was a bit of a mouthful.)
Although TBPR (Toolbar PageRank) was done away with aeons ago, 'actual' PageRank is still at large within Google's ranking algorithm(s). When one page links to another with anchor text that matches a keyword, it 'gives away' some of its ranking value to the page receiving the link (for the specific keyword, collection of keywords or search entity in question). Think of links as 'votes' from one page to another. The difference between this and real voting is that, for Google, not all votes are equal: links from more authoritative pages boost the receiving pages more than links from pages that nobody cares about. Not very progressive, but still...
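The 'votes' idea above can be illustrated with a toy simplified-PageRank calculation. This is only an illustration of the mechanic, not Google's actual algorithm, and the page names are hypothetical:

```python
# Toy link graph: two blog posts each link ('vote') to the service page.
links = {
    "service-page": [],               # the page we want to rank (no outlinks)
    "blog-post-a": ["service-page"],
    "blog-post-b": ["service-page"],
}

def pagerank(links, damping=0.85, iterations=50):
    """Simplified PageRank by power iteration over a dict of outlinks."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                # A link passes an equal share of this page's value onward.
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
            else:
                # Dangling page: spread its value evenly across all pages.
                for p in pages:
                    new[p] += damping * rank[page] / n
        rank = new
    return rank

ranks = pagerank(links)
# The linked-to service page accumulates more value than either post.
assert ranks["service-page"] > ranks["blog-post-a"]
```

The point is simply that value flows along links, so internal links from the posts concentrate ranking signals on the page you want to win.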
In general, we in SEO abused this mechanic between different domains, resulting in Google's current clamp-down on EMA (exact-match anchor) linking, i.e. keyword-rich anchor text. That being said, the risk from doing the same thing internally within your own website is extremely minimal, as you are just redistributing SEO authority from one page to another along a specific axis of relevance.
That's not like doing it from one domain to another, obviously to leech authority from an external site to your own, which in most occurrences is a violation of Google's Webmaster Guidelines.
Do be careful, though; don't overdo this. If the content of the page you don't want to rank ends up stuffed full of hyperlinks, that could make the page look spammy, hurt your conversion rate and potentially earn a Panda-related algorithmic devaluation.
Just don't go mental, everything should be fine.