Dynamic Links vs Static Links
-
There are under 100 pages that we are trying to rank for, and we'd like to flatten our site architecture to give them more link juice. One of the methods currently in place is a widget that dynamically links to these pages based on page popularity... the list of links can change from day to day.
We are thinking of redesigning the page to be more static, as we believe it's better for link juice to flow to those pages reliably rather than dynamically. Before we do so, we need a second opinion.
-
Agreeing with the previous post, and in addition:
If you have dynamic links, your targeted pages get a different PageRank and anchor text value each time, which can cause fluctuations in your rankings that can be quite embarrassing, I think. If you choose static links, you can justify the targeted pages, forward value to the pages that really need it, and you can place your links at the beginning of the text fields so that they forward more PageRank than links at the bottom. I would choose static, as I would always like to know which page I am promoting, how, and where I am promoting it. In addition to these technical factors, you have a better chance of guiding visitors through related material on your site.
Cheers
-
Hi there Gemma.
I'm sure that if you decide to go dynamic, then your website is changing daily with plenty of fresh new content. If you go with this option, be sure that the sitemap you upload to GWT reflects the frequency of updates to each page; otherwise you risk losing the search engines' attention and wasting all the effort.
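For reference, a sitemap entry carrying those update-frequency hints might look like the sketch below (the URL and date are placeholders, not from the original question):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- a page whose dynamic link widget changes daily -->
  <url>
    <loc>http://www.example.com/popular-products/</loc>
    <lastmod>2013-11-28</lastmod>
    <changefreq>daily</changefreq>
  </url>
</urlset>
```

Keep in mind that `changefreq` and `lastmod` are hints to the crawler, not commands; keeping `lastmod` accurate is what matters most.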
On the flip side, if you choose to go static (which I would probably strongly recommend), then you won't have to worry about automatically generated links being broken (as can and will happen in a dynamic environment), plus you get to take full control of what you want to name each page (content-related, obviously).
These are just my two cents, since we also play with both approaches from time to time. I like the dynamic option, but it can be time-consuming even if you streamline it to fewer than 100 pages; static lets you know you're ranking well when you're consistently targeting site-specific keywords...
Best of luck and Happy Turkey Day!
Related Questions
-
Base href + relative link href for canonical link
I have a site where, in the head section, we specify a base href (the domain with a trailing slash) and a canonical link whose href is relative to the domain: `<base href="http://www.domain.com/" />` and `<link href="link-to-page.html" rel="canonical" />`. I know that Google recommends using an absolute path as a canonical link, but is specifying a base href with a relative canonical link the same thing, or is it still seen as duplicate content?
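For comparison, here is a sketch of the absolute-URL form Google recommends, using the placeholder domain and page name from the question:

```html
<head>
  <!-- an absolute canonical URL leaves nothing to the crawler's
       interpretation of <base>, so it is the safer form -->
  <link rel="canonical" href="http://www.domain.com/link-to-page.html" />
</head>
```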
Intermediate & Advanced SEO | | Nobody16116990439410 -
Ecommerce: A product in multiple categories with a canonical to create a ‘cluster’ in one primary category Vs. a single listing at root level with dynamic breadcrumb.
OK – bear with me on this… I am working on some pretty large ecommerce websites (50,000+ products) where it is appropriate for some individual products to be placed within multiple categories / sub-categories. For example, a Red Polo T-shirt could be placed within:
Men’s > T-shirts > Red T-shirts
Men’s > T-shirts > Polo T-shirts
Men’s > Sale > T-shirts
Intermediate & Advanced SEO | | AbsoluteDesign
Etc. We’re getting great organic results for our general T-shirt page (for example) by clustering creative content within its structure – "Top 10 tips on wearing a T-shirt" (obviously not, but you get the idea). My instinct tells me to replicate this with products too. So, of all the locations mentioned above, make sure all polo shirts (no matter what colour) have a canonical set within Men’s > T-shirts > Polo T-shirts. The presumption is that this will help build the authority of the Polo T-shirts page – which obviously presumes “Polo T-shirts” gets more search volume than “Red T-shirts”. My reservation about this option is that it is very difficult to manage, particularly with a large inventory. And, from experience, taking the time and being meticulous when it comes to SEO is the only way to achieve success. From an administration point of view, it is a lot easier to have all product URLs at the root level and develop a dynamic breadcrumb trail – so all roads can lead to that one instance of the product. There's no need for canonicals; no need for ecommerce managers to remember which primary category to assign product types to; and keeping everything at root level also means there's no reason to worry about redirects if products move from sub-category to sub-category, etc. What do you think is the best approach? Do 1000s of canonicals and redirects look ‘messy’ to a search engine over time? Any thoughts and insights greatly received.0 -
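If the clustering route were taken, the pattern on each duplicate listing would be a sketch like this (the domain and URL paths are hypothetical, built from the category examples in the question):

```html
<!-- On /mens/sale/t-shirts/red-polo-t-shirt and every other
     category variant, point to the one chosen primary listing: -->
<link rel="canonical" href="http://www.example.com/mens/t-shirts/polo-t-shirts/red-polo-t-shirt" />
```

The root-level alternative avoids this tag entirely, since there is only ever one URL per product.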
How to build authority links, and what do they look like?
Hi, how do you build authority links, and what do they look like? Could you give a few examples so that I can see? Also, I have 9 linking root domains, which is really low. What are linking root domains, and what do I need to know about them? Thank you 🙂
Intermediate & Advanced SEO | | Ivek991 -
Published Articles + Spam Links
Can you be a victim of your own success? So you write a quality article on your website. You educate your audience and hope quality, trusted authority sites will link back to your article. Great, all those plus points adding to your SEO. On the downside, you get poor-quality sites with no real SEO value linking to your article. My question is this: what impact will poor-quality sites have on your SEO?
Intermediate & Advanced SEO | | Mark_Ch
What impact will changing the Anchor Text to something unrelated to the article content have on SEO?
Are there any other considerations?
Thanks Mark0 -
Webmaster Tools Internal Links
Hi all, I have around 400 links in the navigation menu (site-wide), and when I use Webmaster Tools to check for internal links to each page, some have as many as 250K and some as few as 200. Shouldn't the number of internal links for pages found in the navigation menu be relatively the same? Or is Google registering more internal links for pages linked closer to the top of the code? Thanks!
Intermediate & Advanced SEO | | Carlos-R0 -
How To Create Dynamic WordPress Tags
Does anyone know how to make WordPress "tag" pages automatically generate a description based on the posts included in the tag? I have a lot of tags, and most of them rank well for long-tail keywords. However, I have noticed that although they have a dynamically generated title meta tag, they do not generate a description meta tag. I know WordPress lets you customize the description for each tag, but I have way too many for that. I need the meta description to be auto-generated from the posts being tagged, rather than not included at all. Does anyone know how to do this?
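One possible approach is a theme-level hook that builds the description from the tagged posts on the fly. This is a minimal sketch for functions.php, not a tested plugin; the word counts are arbitrary, and a dedicated SEO plugin's filters would be more robust:

```php
<?php
// Sketch: auto-generate a meta description for tag archives
// from the content of the first few posts carrying that tag.
add_action( 'wp_head', function () {
    if ( ! is_tag() ) {
        return; // only act on tag archive pages
    }
    $tagged = get_posts( array(
        'tag'            => get_queried_object()->slug,
        'posts_per_page' => 3,
    ) );
    $bits = array();
    foreach ( $tagged as $p ) {
        // wp_trim_words() strips tags and truncates to N words
        $bits[] = wp_trim_words( $p->post_content, 15, '' );
    }
    $description = wp_trim_words( implode( ' ', $bits ), 30 );
    if ( $description ) {
        echo '<meta name="description" content="' . esc_attr( $description ) . '" />' . "\n";
    }
} );
```

Since the description is assembled from the posts at render time, it stays current as posts are added to or removed from the tag, which is exactly the "dynamic" behavior asked for.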
Intermediate & Advanced SEO | | MyNet0 -
Indirect SEO boost from links
I have 2 ecommerce sites, each with a blog. I am increasing my linkbuilding efforts, but I don't want to build too many links directly to my 2 sites over a short period of time. I have decided that I will add a certain number of links to sites/pages that are already linking to my main sites (for example, a blog post on my blog, guest post on another blog, article submission, etc.). How much of a benefit can I expect in terms of rankings? Has anyone tested this out or experimented with something like this? What are the pros and cons? I appreciate thoughtful comments.
Intermediate & Advanced SEO | | inhouseseo0 -
Robots.txt: Link Juice vs. Crawl Budget vs. Content 'Depth'
I run a quality vertical search engine. About 6 months ago we had a problem with our sitemaps, which resulted in most of our pages getting tossed out of Google's index. As part of the response, we put a bunch of robots.txt restrictions in place on our search results to prevent Google from crawling through pagination links and other parameter-based variants of our results (sort order, etc). The idea was to 'preserve crawl budget' in order to speed the rate at which Google could get our millions of pages back into the index by focusing attention/resources on the right pages. The pages are back in the index now (and have been for a while), and the restrictions have stayed in place since that time. But, in doing a little SEOMoz reading this morning, I came to wonder whether that approach may now be harming us...
http://www.seomoz.org/blog/restricting-robot-access-for-improved-seo
http://www.seomoz.org/blog/serious-robotstxt-misuse-high-impact-solutions
Specifically, I'm concerned that a) we're blocking the flow of link juice, and that b) by preventing Google from crawling the full depth of our search results (i.e. pages >1), we may be making our site wrongfully look 'thin'. With respect to b), we've been hit by Panda and have been implementing plenty of changes to improve engagement, eliminate inadvertently low-quality pages, etc, but we have yet to find 'the fix'... Thoughts? Kurus0
Intermediate & Advanced SEO | | kurus
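For readers following along, restrictions of the kind described above typically look something like this (the paths and parameter names here are illustrative, not the poster's actual rules):

```
User-agent: *
# block parameter-based variants of search results (sort order, etc.)
Disallow: /search?*sort=
# block pagination beyond page 1
Disallow: /search?*page=
```

Note that robots.txt blocking stops crawling entirely, so any link juice flowing through those URLs is lost to the blocked pages; a `noindex, follow` meta tag on the variant pages lets the juice keep flowing while keeping them out of the index, which is the trade-off the linked SEOmoz posts discuss.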