Keeping Roger Happy - The Dynamic Dilemma!
-
Roger (the SEOmoz robot) is reporting thousands of duplicate pages, duplicate titles and overly dynamic URLs. These are being caused by our dynamic forum/shopping/testimonial pages.
I appreciate Roger's efforts in making me aware of the situation, but should I be worrying about this too much? I believe this shouldn't affect rankings or SEO performance... but then again I want to make Roger happy and see '0' next to all errors and warnings! 🙂
Many thanks in advance!
Lee
-
Many thanks Pete, will see what we can do and take action. Appreciate the advice
-
I'm seeing a lot of duplicates in your forum pages - I think the issue is that any attempts to click into the forum go to the login page, but the URL stays the same. You may want to block those from crawlers somehow (META NOINDEX, for example), since Google can't log into member areas.
They don't seem to be currently in the Google index, but there is potential to dilute your site's ranking ability and for Google to think that your content is "thin". I do think it's a problem you should address.
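A minimal sketch of the META NOINDEX approach mentioned above, assuming you can edit the <head> of the login-gated forum templates:

```html
<!-- Placed in the <head> of the login/forum page templates: crawlers
     drop these URLs from the index but still follow links on the page -->
<meta name="robots" content="noindex, follow">
```

Because the header is in the template, every URL that resolves to the login page carries it automatically, however many duplicate URLs the forum generates.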
-
Appreciate that Alsvik, thought as much.
Still not sure whether I should be worrying about it too much though! Anyone else got any input?
-
Use rel=canonical for duplicate entries pointing at the same pages. You could also, if possible on your server, add noindex, nofollow to all but one active URL for the same page - or use redirects ...
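For the rel=canonical option, a sketch of the tag - the URL here is hypothetical, standing in for whichever version of the page you want indexed:

```html
<!-- The same tag goes on every duplicate variant (e.g. ?sort=, ?session=
     versions), all pointing at the single preferred URL -->
<link rel="canonical" href="http://www.example.com/forum/topic-name/">
```

Google then consolidates the duplicates' signals onto that one URL instead of splitting them across variants.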
-
I'll buy you a beer when you do Alsvik! How are you fixing the problem if you don't mind me asking?
-
I worry too. And therefore I fix pages, sorted by page authority. I calculate on reaching 0 some day in 2017 ... Yes, you should fix these, but you need to prioritise your errors and warnings. Since Google is my biggest concern, I start by fixing the ones GWT shows me - and then I focus on Mozbot errors and warnings ....
Related Questions
-
Over-optimizing Internal Linking: Is this real and, if so, what's the happy medium?
I have heard a lot about having a solid internal linking structure so that Google can easily discover pages and understand your page hierarchies and correlations and equity can be passed. Often, it's mentioned that it's good to have optimized anchor text, but not too optimized. You hear a lot of warnings about how over-optimization can be perceived as spammy: https://neilpatel.com/blog/avoid-over-optimizing/ But you also see posts and news like this saying that the internal link over-optimization warnings are unfounded or outdated:
https://www.seroundtable.com/google-no-internal-linking-overoptimization-penalty-27092.html So what's the tea? Is internal linking over-optimization a myth? If it's true, what's the tipping point? Does it have to be super invasive and keyword-stuffy to negatively impact rankings? Or does simple light optimization of internal links on every page trigger this?
Intermediate & Advanced SEO | SearchStan
-
Is it good practice to keep all our pages at the second level?
While defining the site structure we thought of having all pages at the second level only, i.e. domain.com/services, domain.com/city, domain.com/services-in-city. Please let us know the pros and cons of this architecture.
Intermediate & Advanced SEO | fabogo_marketing
-
Should I keep looking at the IP address location of my backlink prospects?
Hello everyone, I'm doing backlink research and we are considering how to define a strategy. My website is located in Spain and I am focusing on ranking on Google.es for an audience coming from Spain. Until now I have looked carefully at the IP address of each backlink prospect, to check whether the website was really hosted in Spain. I know Matt Cutts said very long ago that the hosting location of a website was important. But nowadays, with cloud servers, CDNs, AnyCast and similar technologies, it's no longer possible to accurately identify the location of a website - especially if the website's CDN uses AnyCast, so that one IP address can be used by different machines in different locations. Do you think I should keep looking at the IP address location of my backlink prospects?
Intermediate & Advanced SEO | C.A
-
What is the value of Google crawling dynamic URLs with no SEO?
Hi all, I am working on a travel site for a client where there are thousands of product listing pages that are created dynamically. These pages are not SEO-optimised and are just lists of products with no content other than the product details; there are no meta title or description tags on the listing pages. You then click "Find out more" to go to the full product details, which likewise has no content other than the details and no meta tags, and there is no way to SEO these dynamic pages. To help increase my Google rankings for the rest of the site, which is search-optimised, would it be better to block Google from indexing these pages? Are these pages hurting my ability to improve rankings, given that the SEO of the content pages has been done to a good level with good unique titles, descriptions and useful content? Thanks in advance, John
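If the dynamic templates really can't be edited, one option is to send noindex as an HTTP header from the server instead of in the markup. A sketch assuming Apache with mod_headers enabled, and assuming the listing pages live under a /products/ path - both of those are assumptions about the setup, not facts from the question:

```apache
# Server/vhost config (LocationMatch is not available in .htaccess):
# send a noindex header for every URL under /products/, so Google
# drops the listings from the index without any template changes
<LocationMatch "^/products/">
    Header set X-Robots-Tag "noindex, follow"
</LocationMatch>
```

Unlike a robots.txt Disallow, this lets Google keep crawling through the pages (and following links to the product detail pages) while removing them from the index.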
Intermediate & Advanced SEO | ingageseo
-
301 redirect w/ dynamic pages to static
I am trying to redirect old dynamically created pages to a new static one (a single page). However, when I implement the redirects, the new URL still carries part of the old dynamic URL. For instance, dynamic.php?var=example1, dynamic.php?var=example2 and dynamic.php?var=example3 should all redirect to static.html. However, they are redirecting to static.html?var=example1, static.html?var=example2 and static.html?var=example3. The page is resolving fine, but I don't want Google to misinterpret the new static page as numerous pages with duplicate content. I tried this in PHP on the dynamic.php page, but the problem persisted:
header('HTTP/1.1 301 Moved Permanently');
header('Location: http://www.mysite.com/static.html');
I tried doing it in my .htaccess file as follows, but the problem persisted:
redirect 301 /info/tool_stimulus.php?var=example1 http://www.mysite.com/static.html
redirect 301 /dynamic.php?var=example2 http://www.mysite.com/static.html
Can anyone solve this in PHP or w/ htaccess? Help!!! 🙂
Intermediate & Advanced SEO | TheDude
-
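For what it's worth, mod_alias's Redirect directive appends the original query string to the target URL, which is why static.html?var=... keeps appearing; it also can't match on the query string at all. A minimal .htaccess sketch using mod_rewrite instead - the paths are copied from the question above, everything else is an assumption about the setup:

```apache
RewriteEngine On

# Only fire on requests that carried the old dynamic query string
RewriteCond %{QUERY_STRING} ^var=

# The trailing "?" on the target discards the original query string,
# so /dynamic.php?var=example1 lands on /static.html with nothing appended
RewriteRule ^dynamic\.php$ /static.html? [R=301,L]
```

On Apache 2.4+ the [QSD] flag does the same job as the trailing "?". In the PHP variant, it also matters to follow the two header() calls with exit; so the rest of the script doesn't keep executing after the redirect is sent.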
Overly-Dynamic URLs & Changing URL Structure w Web Redesign
I have a client that has multiple apartment complexes in different states and metro areas. They get good traffic and pretty good conversions, but the site needs a lot of updating, including the architecture, to implement SEO standards. Right now they rank for "<brand_name> apartments" everywhere, but not for "<city_name> apartments". The current architecture produces URLs like http://www.<client_apartments>.com/index.php?mainLevelCurrent=communities&communityID=28&secLevelCurrent=overview and http://www.<client_apartments>.com/index.php?mainLevelCurrent=communities&communityID=28&secLevelCurrent=floorplans&floorPlanID=121. I know it is said to never change the URL structure, but what about this site? I see this URL structure being bad for SEO, bad for users, and it basically forces us to keep the current architecture. They don't have many links built to their community pages, so will creating a new URL structure and doing 301 redirects to the new URLs drastically drop rankings? Is this something we should bite the bullet on now for the sake of future rankings, traffic, and a better architecture?
Intermediate & Advanced SEO | JaredDetroit
-
Sites with dynamic content - GWT redirects and deletions
We have a site that has extremely dynamic content. Every day they publish around 15 news flashes, each of which is set up as a distinct page with around 500 words. The file structure is bluewidget.com/news/long-news-article-name, with no timestamp in the URL. After a year, that's a lot of news flashes. The database was getting inefficient (it's managed by a ColdFusion CMS), so we started automatically deleting news flashes from the database, which sped things up. The problem is that Google Webmaster Tools is detecting the freshly deleted pages and reporting large numbers of 404s. There are so many 404s that it's hard to see the non-news 404s, and I understand that having that many missing pages would be a negative quality indicator to Google. We were toying with setting up redirects, but the volume of redirects would be so large that it would slow the site down again to load a large htaccess file for each page. Because there isn't a datestamp in the URL, we couldn't create a mask in the htaccess file automatically redirecting all bluewidget.com/news/yymm* to bluewidget.com/news. These long-tail pages do send traffic, but for speed we only want to keep the last month of news flashes at most. What would you do to avoid Google thinking it's a poorly maintained site?
Intermediate & Advanced SEO | ozgeekmum
-
Posting new PDF. How to Keep Google Happy?
Posting a new PDF. Is there anything beyond maintaining the PDF metadata, filename and URL that I need to do to keep its current top-ten search ranking on Google for the specific generic noun it currently ranks for? Many thanks ahead of time for your help.
Intermediate & Advanced SEO | nabarro