Keeping Roger Happy - The Dynamic Dilemma!
-
Roger (the SEOMoz robot) is reporting thousands of duplicate pages, duplicate titles and overly dynamic URLs. These are being caused by our dynamic forum/shopping/testimonial pages.
I appreciate Roger's efforts and him making me aware of the situation, but should I be worrying about this too much? I believe that this shouldn't affect rankings or SEO performance... but then again I want to make Roger happy and see '0' next to all errors and warnings! :)
Many thanks in advance!
Lee
-
Many thanks Pete, will see what we can do and take action. Appreciate the advice
-
I'm seeing a lot of duplicates in your forum pages - I think the issue is that any attempt to click into the forum leads to the login page while the URL stays the same, so many different URLs end up serving identical content. You may want to block those from crawlers somehow (a META NOINDEX tag, for example), since Google can't log into member areas.
They don't seem to be currently in the Google index, but there is potential to dilute your site's ranking ability and for Google to think that your content is "thin". I do think it's a problem you should address.
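For example, a minimal version of the tag being described, added to the <head> of the forum/login page template (the "noindex, follow" value here is an assumption - use plain "noindex" if you'd rather crawlers didn't follow links from those pages either):

```html
<!-- Placed in the <head> of any members-only / login-gated page template -->
<!-- "noindex" keeps the URL out of the index; "follow" still lets crawlers pass through its links -->
<meta name="robots" content="noindex, follow">
```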
-
Appreciate that Alsvik, thought as much.
Still not sure whether I should be worrying about it too much though! Anyone else got any input?
-
Use rel=canonical for duplicate entries that resolve to the same page. You could also, if possible on your server, add noindex, nofollow to all but one active URL for the same page - or use redirects ...
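As a rough illustration of the canonical option (the URL below is just a placeholder), each duplicate variation of a page would declare the single URL you want indexed:

```html
<!-- Added to the <head> of every URL variation that serves this page -->
<link rel="canonical" href="https://www.example.com/forum/some-thread/">
```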
-
I'll buy you a beer when you do, Alsvik! How are you fixing the problem, if you don't mind me asking?
-
I worry too. And therefore I fix pages, sorted by page authority. I reckon I'll reach 0 some day in 2017 ... Yes, you should fix these, but you need to prioritise your errors and warnings. Since Google is my biggest concern, I start by fixing the ones GWT shows me - and then I focus on Mozbot errors and warnings ....
Related Questions
-
Is it good or bad to add noindex to empty pages which will get content dynamically after some days?
We have followers, following, friends, etc. pages for each user who creates an account on our website. When a new user signs up, he may have 0 followers, 0 following and 0 friends, but over time those lists can grow. We have separate pages for followers, following and friends, and Google is allowed to index them. When a user doesn't have any followers/following/friends, those pages look empty and we get duplicate content and "description too short" issues. So is it better to add noindex to those pages temporarily and remove the noindex tag once there are at least two or more people on those pages? What are the side effects of adding noindex when there is no data on those pages, and what are the benefits?
Intermediate & Advanced SEO | swapnil120
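A minimal sketch of the conditional approach described in the question above - the tag is only output by the server while the list is effectively empty and is dropped once real content exists (the threshold of two comes from the question; how you template it depends on your stack):

```html
<!-- Output server-side only while the page has fewer than 2 followers/following/friends -->
<!-- "follow" keeps the page's internal links crawlable even while it stays out of the index -->
<meta name="robots" content="noindex, follow">
```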
Why does our main keyword keep dropping?
Hey guys, we've seen an alarming drop in our main keyword for our website. Our biggest driver of traffic has always been the search term 'gifts for men', which we held the top spot for at one point and have always been in the top 4 for. Recently (in the last 3-4 months) we dropped to 6, and as of last night we dropped down to 9th. We still rank number 2 for 'gift ideas for men'. Both search terms point to this page: Gifts For Men. Nothing onsite or technical has changed, and there is consistently new content in the form of products being added almost daily. We were hit with a manual action back in October of last year, and I'm concerned that the toxic links (which we didn't create, mind you) we disavowed may have been unnaturally boosting this page, and that we're now dropping significantly because they're gone. Any ideas on how we can curb this concerning trend? Thanks a lot
Intermediate & Advanced SEO | TheGreatestGoat0
Replace dynamic parameter URLs with static landing page URLs - faceted navigation
Hi there, got a quick question regarding faceted navigation. If a specific filter (facet) seems to be quite popular with visitors, does it make sense to replace a dynamic URL, e.g. http://www.domain.com/pants.html?a_type=239, with a static, more SEO-friendly URL, e.g. http://www.domain.com/pants/levis-pants.html, by creating a proper landing page for it? I know that it is nearly impossible to replace every variation of these parameter URLs with static ones, but does it generally make sense to do this for the most popular facets chosen by visitors? Or does this cause any issues? Any help is much appreciated. Thanks a lot in advance
Intermediate & Advanced SEO | ennovators0
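If a static landing page is created for a popular facet, a common companion step is a 301 from the old parameter URL to the new page so existing equity and bookmarks carry over. A rough sketch using the example URLs from the question, assuming an Apache server with .htaccess access (adjust for your setup):

```apache
# Sketch: 301 the popular facet's parameter URL to its new static landing page
RewriteEngine On
RewriteCond %{QUERY_STRING} (^|&)a_type=239($|&)
# The trailing "?" drops the old query string from the redirect target
RewriteRule ^pants\.html$ /pants/levis-pants.html? [R=301,L]
```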
I'm updating content that is out of date. What is the best way to handle it if I want to keep the old content as well?
So here is the situation. I'm working on a site that offers "Best Of" Top 10 list-type content. They have a list that ranks very well but is out of date. They'd like to create a new list for 2014 but keep the old list up as well. Ideally the new list would replace the old list in search results. Here's what I'm thinking, but let me know if you think there's a better way to handle this: put a "View New List" banner on the old page; make sure all internal links point to the new page; add a rel=canonical tag on the old list pointing to the new list. Does this seem like a reasonable way to handle this?
Intermediate & Advanced SEO | jim_shook0
Keep older blog content indexed or no?
Our really old blog content still sees traffic, but engagement metrics aren't the best (little time on site), and as a result, traffic has gradually started to decrease. Should we de-index it?
Intermediate & Advanced SEO | nicole.healthline0
How do I create a strategy to get rid of dupe content pages but still keep the SEO juice?
We have about 30,000 pages that are variations of "<product-type> prices/<type-of-thing>/<city>-<state>". These pages are bringing us lots of free conversions, because when somebody searches for this exact phrase for their city/state they are pretty low-funnel. The problem we are running into is that the pages are showing up as duplicate content. One solution we were discussing is to 301-redirect or canonical all the city-state pages back to just the "<type of thing>" level, and then create really solid unique content for the few hundred pages we would have at that point. My concern is this: I still want to rank for the city-state, because as I look through our best-converting search terms they nearly always include the city-state, so the search is some variation of "<product-type> <type of thing> <city> <state>". One thing we thought about doing is dynamically changing the meta data & headers to add the city-state info there. Are there other potential solutions to this?
Intermediate & Advanced SEO | editabletext0
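To illustrate the "dynamically changing the meta data & headers" idea from the question, here is a purely hypothetical template pattern - the placeholder names and wording are made up and would be filled in server-side for each city-state page:

```html
<!-- Hypothetical per-page template; {product-type}, {type-of-thing}, {city}, {state} are filled in server-side -->
<title>{product-type} Prices for {type-of-thing} in {city}, {state}</title>
<meta name="description" content="Compare {product-type} prices for {type-of-thing} in {city}, {state}.">
<h1>{product-type} Prices: {type-of-thing} in {city}, {state}</h1>
```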
Sites with dynamic content - GWT redirects and deletions
We have a site with extremely dynamic content. Every day they publish around 15 news flashes, each of which is set up as a distinct page with around 500 words. The file structure is bluewidget.com/news/long-news-article-name, with no timestamp in the URL. After a year, that's a lot of news flashes. The database was getting inefficient (it's managed by a ColdFusion CMS), so we started automatically physically deleting news flashes from the database, which sped things up. The problem is that Google Webmaster Tools is detecting the freshly deleted pages and reporting large numbers of 404 pages. There are so many 404s that it's hard to see the non-news 404s, and I understand that having that many missing pages would be a negative quality indicator to Google. We were toying with setting up redirects, but the volume of redirects would be so large that it would slow the site down again, since a large htaccess file would have to be loaded for each page. Because there isn't a datestamp in the URL, we couldn't create a mask in the htaccess file automatically redirecting all bluewidget.com/news/yymm* to bluewidget.com/news. These long-tail pages do send traffic, but for speed we only want to keep the last month of news flashes at most. What would you do to avoid Google thinking it's a poorly maintained site?
Intermediate & Advanced SEO | ozgeekmum0
Best way to stop pages from being indexed while keeping PageRank
On a discussion forum, for example, what would be the best way to stop pages such as the posting page (where a user posts a topic or message) from being indexed without diluting PageRank? If we added them to the Disallow rules in robots.txt, would PageRank still flow through the links to those blocked pages, or would it stay concentrated on the linking page? Your ideas and suggestions will be greatly appreciated.
Intermediate & Advanced SEO | Peter2640
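For what it's worth, the usual framing of this trade-off: a robots.txt Disallow stops crawling entirely, so PageRank flowing into those URLs can't be passed back out through their links, while a meta robots "noindex, follow" tag keeps the page out of the index but still lets equity flow through it. A minimal sketch of the robots.txt side (the /post-message/ path is a made-up example):

```
# robots.txt - blocks crawling of the posting pages entirely;
# Google never fetches them, so equity flowing in can't be passed on through their links
User-agent: *
Disallow: /post-message/
```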