Ridding of taxonomies, so that articles enhance related page's value
-
Hello,
I'm developing a website for a law firm, which offers a variety of services.
The site will also feature a blog whose topics are named similarly to the services. As is customary, these topics were set up as taxonomies.
But I want the articles to enhance the value of the service pages themselves, and because the taxonomy URL /category/divorce has no relationship to the actual service page URL /practice-areas/divorce, I'm worried that a redundantly titled taxonomy URL would, if anything, dilute the value of the service page it relates to.
Sure, I could show some related posts on the service page, but if a visitor wants to view more, they're suddenly bounced over to a taxonomy page that steals thunder from the more important service page.
So I did away with these taxonomies altogether, and posts are now associated with pages directly via a custom database table.
Now, when I visit the blog page, instead of a list of category terms it is technically a list of the service pages. If a visitor clicks on a topic, they are directed to /practice-areas/divorce/resources (the subpages are created dynamically), and the related posts are shown there.
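In case it helps to picture it, here's a rough sketch of the approach (the table, function, and query-var names are just placeholders I've made up for this post, not my real code):

```php
<?php
// Rough sketch only – table, function, and query-var names are placeholders.

// 1. Custom association table linking posts to service pages directly
//    (run once, e.g. on plugin activation; flush rewrite rules afterwards).
function lf_create_association_table() {
    global $wpdb;
    require_once ABSPATH . 'wp-admin/includes/upgrade.php';
    $table = $wpdb->prefix . 'service_post_relationships';
    dbDelta( "CREATE TABLE {$table} (
        id bigint(20) unsigned NOT NULL AUTO_INCREMENT,
        service_page_id bigint(20) unsigned NOT NULL,
        post_id bigint(20) unsigned NOT NULL,
        PRIMARY KEY  (id),
        KEY service_page_id (service_page_id)
    ) {$wpdb->get_charset_collate()};" );
}

// 2. Dynamic /practice-areas/{service}/resources endpoint.
add_action( 'init', function () {
    add_rewrite_rule(
        '^practice-areas/([^/]+)/resources/?$',
        'index.php?pagename=practice-areas/$matches[1]&lf_resources=1',
        'top'
    );
} );

add_filter( 'query_vars', function ( $vars ) {
    $vars[] = 'lf_resources';
    return $vars;
} );

// 3. Swap in a dedicated template when the /resources flag is present.
add_filter( 'template_include', function ( $template ) {
    if ( get_query_var( 'lf_resources' ) ) {
        $custom = locate_template( 'page-resources.php' );
        if ( $custom ) {
            return $custom;
        }
    }
    return $template;
} );

// 4. Fetch related post IDs for the current service page.
function lf_get_related_post_ids( $service_page_id ) {
    global $wpdb;
    $table = $wpdb->prefix . 'service_post_relationships';
    return $wpdb->get_col( $wpdb->prepare(
        "SELECT post_id FROM {$table} WHERE service_page_id = %d",
        $service_page_id
    ) );
}
```

The page-resources.php template then runs a normal WP_Query over the IDs from that helper, and the breadcrumb trail gets built by hand from the service page's ancestors.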
I'll have to use custom breadcrumbs to make it all work. Just wondering if you guys had any thoughts on this. Really appreciate any you might have, and thanks for reading.
-
Thank you for taking the time to respond. Makes a lot of sense, I appreciate it.
-
It is true that having pages with the same "page-name" (the last part of the URL after the final slash; e.g. the page-name of this question is "ridding-of-taxonomies-so-that-articles-enhance-related-page-s-value") which are also topically very similar can cause 'jumpy' SERPs.
Many feel that the dangers of what is termed 'keyword cannibalisation' are over-egged. That may be true, but I have certainly seen examples of it in action myself. It usually occurs most prominently when neither page strongly eclipses the other in terms of SEO authority (e.g. inbound signals like referring domains, citations across the web, and the general 'buzz' associated with a given URL).
If both pages are new, with little authority (or 'popularity') bound to their unique addresses, then Google can certainly get confused. You can end up with problems like earning a decent ranking for a related keyword, only for it to hop from page to page every day or week as Google's algorithm bubbles away in the background. This can make it hard to drive traffic to the correct destination.
If both pages are very specific about the keywords they are targeting, you could turn references to those keywords on the page you don't want to rank into hyperlinks pointing to the URL you do want to rank (sorry, that was a bit of a mouthful).
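Since you're already writing custom WordPress code, here's a rough sketch of how you could automate that internal linking with a content filter. The keyword-to-URL map and the exact rules are purely illustrative, so treat it as a starting point rather than a drop-in solution:

```php
<?php
// Sketch only: turn the first mention of a target keyword on a page you
// DON'T want to rank into an internal link to the page you DO want to rank.
// The keyword-to-URL map below is purely illustrative.
add_filter( 'the_content', function ( $content ) {
    $internal_links = array(
        'divorce lawyer' => 'https://example.com/practice-areas/divorce/',
    );

    foreach ( $internal_links as $keyword => $url ) {
        // Rough guard: skip matches that already sit inside an <a> element.
        $pattern     = '/\b(' . preg_quote( $keyword, '/' ) . ')\b(?![^<]*<\/a>)/i';
        $replacement = '<a href="' . esc_url( $url ) . '">$1</a>';

        // Limit to ONE replacement per keyword so the page doesn't end up
        // stuffed full of links (see the caveat further down).
        $content = preg_replace( $pattern, $replacement, $content, 1 );
    }

    return $content;
} );
```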
Although TBPR (Toolbar PageRank) was done away with aeons ago, 'actual' PageRank is still at large within Google's ranking algorithm(s). When one page links to another with anchor text that matches a keyword, it 'gives away' some of its ranking value to the page receiving the link (for the specific keyword, collection of keywords, or search entity in question). Think of links as 'votes' from one page to another. The difference from real voting is that, for Google, not all votes are equal: links from more authoritative pages boost the receiving page more than links from pages nobody cares about. Not very progressive, but still...
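For the curious, the classic simplified formula from the original PageRank paper captures that 'voting' idea neatly (Google's live systems use far more signals these days, so treat this as an illustration rather than how things are actually computed now):

PR(A) = (1 - d) + d * ( PR(T1)/C(T1) + ... + PR(Tn)/C(Tn) )

where T1...Tn are the pages linking to page A, C(T) is the number of outbound links on page T, and d is a damping factor (roughly 0.85 in the original paper). The more 'PR' a linking page has, and the fewer other links it dilutes that across, the bigger its vote.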
In general, we in SEO abused this mechanic between different domains, resulting in Google's current clamp-down on EMA (exact-match anchor, i.e. keyword-matching anchor text) linking. That being said, the risk of doing the same thing internally within your own website is extremely minimal, as you are just redistributing SEO authority from one page to another along a specific axis of relevance.
That's not like doing it from one domain to another, obviously to leech authority from an external site to your own, which in most cases is a violation of Google's Webmaster Guidelines.
Do be careful, though: don't overdo this. If the content of the page you don't want to rank ends up stuffed full of hyperlinks, that could make the page look spammy and hurt your CRO (or earn a Panda-related algorithmic devaluation).
Just don't go mental, and everything should be fine.