Keeping Roger Happy - The Dynamic Dilemma!
-
Roger (the SEOMoz robot) is reporting thousands of duplicate pages, duplicate titles, and overly dynamic URLs. These are being caused by our dynamic forum/shopping/testimonial pages.
I appreciate Roger's efforts in making me aware of the situation, but should I be worrying about this too much? I believe this shouldn't affect rankings or SEO performance... but then again, I want to make Roger happy and see '0' next to all errors and warnings! :)
Many thanks in advance!
Lee
-
Many thanks Pete, will see what we can do and take action. Appreciate the advice
-
I'm seeing a lot of duplicates in your forum pages - I think the issue is that any attempts to click into the forum go to the login page, but the URL stays the same. You may want to block those from crawlers somehow (META NOINDEX, for example), since Google can't log into member areas.
They don't seem to be currently in the Google index, but there is potential to dilute your site's ranking ability and for Google to think that your content is "thin". I do think it's a problem you should address.
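For reference, if you can edit the forum/login page templates, a meta robots tag is the usual way to block them - a minimal sketch (the "noindex, follow" value keeps the page out of the index while still letting crawlers follow its links):

```html
<!-- In the <head> of forum/login pages that should stay out of the index -->
<meta name="robots" content="noindex, follow">
```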
-
Appreciate that Alsvik, thought as much.
Still not sure whether I should be worrying about it too much though! Anyone else got any input?
-
Use rel=canonical on duplicate URLs that point to the same page. If your server allows it, you could also add noindex, nofollow to all but one active URL for each page - or use 301 redirects ...
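To illustrate: the canonical tag goes in the <head> of each duplicate URL and points at the preferred version - a sketch with a made-up URL:

```html
<!-- On every duplicate/parameterized URL, point to the preferred page -->
<link rel="canonical" href="https://www.example.com/forum/thread-slug/">
```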
-
I'll buy you a beer when you do Alsvik! How are you fixing the problem if you don't mind me asking?
-
I worry too. And therefore I fix pages, sorted by page authority. At this rate I expect to reach 0 some day in 2017 ... Yes, you should fix these, but you need to prioritise your errors and warnings. Since Google is my biggest concern, I start by fixing the ones GWT shows me - and then I focus on Mozbot errors and warnings ....
Related Questions
-
Blocking Dynamic Search Result Pages From Google
Hi Mozzerds, I have a quick question that probably won't have just one solution. Most of the pages that Moz crawled for duplicate content were dynamic search result pages on my site. Could this be a simple fix of just blocking these pages from Google altogether? Or would Moz just flag these pages as critical crawl errors instead of content errors? Ultimately, I contemplated whether or not I wanted to rank for these pages, but I don't think it's worth it considering I have multiple product pages that rank well. In my case, the best option is probably to leave out these search pages, since they have more of a negative impact on my site, resulting in more content errors than I would like. So would blocking these pages from the search engines and Moz be a good idea? Maybe a second opinion would help: what do you think I should do? Is there another way to go about this, and would blocking these pages reduce the number of content errors on my site? I appreciate any feedback! Thanks! Andrew
Intermediate & Advanced SEO | drewstorys
-
Why is Google no longer Indexing and Ranking my state pages with Dynamic Content?
Hi, We have some state-specific pages that display dynamic content based on the state that is selected here. For example, this page displays New York based content. But for some reason Google is no longer ranking these pages. Instead it's defaulting to the page where you select the state here. But last year the individual state dynamic pages were ranking. The only change we made was moving these pages from http to https. But now Google isn't seeing these individual dynamically generated state-based pages. When I do a site: URL search it doesn't find any of these state pages. Any thoughts on why this is happening and how to fix it? Thanks in advance for any insight. Eddy By the way, when I check these pages in Google Search Console's Fetch as Google, Google is able to see them fine, and they're not being blocked by any robots.txt.
Intermediate & Advanced SEO | eddys_kap
-
Creating a site search engine while keeping SEO factors in mind
I run and own my own travel photography business. (www.mickeyshannon.com) I've been looking into building a search archive of photos that don't necessarily need to be in the main galleries, as a lot of older photos are starting to really clutter up the site and take away the emphasis from the better work. However, I still want to keep these older photos around. My plan is to simplify my galleries and pull out 50-75% of the lesser/older photos. All of these photos will still be reachable via a custom-built simple search engine that I'm building to house these older photos. The photos will be searchable based on keywords that I attach to each photo as I add them to my website. The question I have is whether this will hurt me by creating duplicate content. Some of the keywords used in the search archive would be similar or identical to the main gallery names. However, I'm also really trying to push my newer and better images to the front. I've read some articles that talk about noindexing search keyword results, but that would make it really difficult for search engines to even find the older photos, as searching for their keywords would be the only way to find them. Any thoughts on a way to work this out that benefits, or at least doesn't hurt, me SEO-wise?
Intermediate & Advanced SEO | msphotography
-
Issues with Google-Bot crawl vs. Roger-Bot
Greetings from a first-time poster and SEO noob... I hope that this question makes sense... I have a small e-commerce site. I have had Roger-bot crawl the site and I have fixed all errors and warnings that Volusion will allow me to fix. Then I checked Webmaster Tools' HTML Improvements section, and Google-bot sees different duplicate title tag issues that Roger-bot did not. 1) A few weeks back I changed the title tag for a product, and GWT says that I have duplicate title tags, but there is only one live page for the product. GWT lists the duplicate title tags, but when I click on each they all lead to the same live page. I'm confused - what pages are these other title tags referring to? Does Google have more than one page for that product indexed because I changed the title tag when the page had a different URL? Does this question make sense? 2) Is this issue a problem? 3) What can I do to fix it? Any help would be greatly appreciated. Jeff
Intermediate & Advanced SEO | IOSC
-
What is better for Google: keeping old, unvisited content deep in the website, or removing it?
We have quite a lot of old content which is no longer visited. Should we remove it and accept a lot of 410 errors, which will be reported in GWT? Or should we keep it and forget about it?
Intermediate & Advanced SEO | bele
-
How To Create Dynamic WordPress Tags
Does anyone know how to make WordPress "tag" pages automatically generate a description based on the posts included in the tag? I have a lot of tags, and most of them rank well for long-tail keywords. However, I have noticed that although they have a dynamically generated title meta tag, they do not generate a description meta tag. I know WordPress lets you customize the description for each tag, but I have way too many for that. I need the meta description to be auto-generated from the posts that are being tagged, rather than not including one at all. Does anyone know how to do this?
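Nobody posted code for this, but one hedged sketch of the idea - hypothetical function name, intended for a theme's functions.php, assuming the standard WordPress hooks - would be to build the description from the tag's post titles on the fly:

```php
<?php
// Hypothetical sketch, not a drop-in solution: print a meta description
// on tag archives, built from the titles of the first few tagged posts.
add_action( 'wp_head', 'my_dynamic_tag_description' );

function my_dynamic_tag_description() {
    if ( ! is_tag() ) {
        return;
    }
    $term   = get_queried_object();
    $posts  = get_posts( array( 'tag' => $term->slug, 'numberposts' => 3 ) );
    $titles = wp_list_pluck( $posts, 'post_title' );
    $desc   = sprintf( 'Posts tagged "%s": %s', $term->name, implode( ', ', $titles ) );
    // Trim to a description-friendly length and escape for the attribute.
    echo '<meta name="description" content="' . esc_attr( wp_trim_words( $desc, 30 ) ) . '">' . "\n";
}
```

You'd want to test how the generated text reads on your highest-traffic tags before rolling it out everywhere.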
Intermediate & Advanced SEO | MyNet
-
Block all search results (dynamic) in robots.txt?
I know that Google does not want to index "search result" pages for a lot of reasons (dup content, dynamic URLs, blah blah). I recently optimized the entire IA of my sites to have search-friendly URLs, which includes search result pages. So, my search result pages changed from: /search?12345&productblue=true&id789 to /product/search/blue_widgets/womens/large As a result, Google started indexing these pages thinking they were static (no opposition from me :)), but I started getting WMT messages saying they are finding a "high number of URLs being indexed" on these sites. Should I just block them altogether, or let it work itself out?
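If you do decide to block them, a robots.txt rule over the new static-looking search path would be one option - a sketch where the path is a guess based on the example URLs above:

```
# Hypothetical robots.txt - adjust the path to match your actual search URLs
User-agent: *
Disallow: /product/search/
```

Note that robots.txt only blocks crawling; URLs already in the index may linger until they are noindexed or removed.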
Intermediate & Advanced SEO | rhutchings