Keeping Roger Happy - The Dynamic Dilemma!
-
Roger (the SEOmoz robot) is reporting thousands of duplicate pages, duplicate titles and overly dynamic URLs. These are being caused by our dynamic forum/shopping/testimonial pages.
I appreciate Roger's efforts in making me aware of the situation, but should I be worrying about this too much? I believe this shouldn't affect rankings or SEO performance... but then again, I want to make Roger happy and see '0' next to all errors and warnings! :)
Many thanks in advance!
Lee
-
Many thanks Pete, will see what we can do and take action. Appreciate the advice
-
I'm seeing a lot of duplicates in your forum pages - I think the issue is that any attempts to click into the forum go to the login page, but the URL stays the same. You may want to block those from crawlers somehow (META NOINDEX, for example), since Google can't log into member areas.
They don't seem to be currently in the Google index, but there is potential to dilute your site's ranking ability and for Google to think that your content is "thin". I do think it's a problem you should address.
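If the forum URLs follow a predictable pattern, one way to apply that noindex without editing every template is an X-Robots-Tag response header. This is only a rough sketch - it assumes Apache with mod_headers, access to the main/vhost config, and that the member area lives under /forum/ (swap in whatever your login-gated paths actually are):
# Send the same signal as a META NOINDEX tag, but as an HTTP header,
# for everything under the (assumed) /forum/ path.
<LocationMatch "^/forum/">
    Header set X-Robots-Tag "noindex"
</LocationMatch>
A plain meta robots noindex tag in the forum page template does the same job if that is easier to edit.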
-
Appreciate that Alsvik, thought as much.
Still not sure whether I should be worrying about it too much though! Anyone else got any input?
-
Use rel=canonical for duplicate URLs that lead to the same page. Alternatively, if your server allows it, you could add noindex (and nofollow) to all but one active URL for the same page - or use 301 redirects...
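For the redirect route, something along these lines collapses the duplicates at the server level. This is just a sketch assuming Apache with mod_rewrite and a hypothetical "sessionid" parameter as the cause of the duplicate URLs - substitute whatever parameter your forum/shopping/testimonial pages actually append:
RewriteEngine On
# If the only query string is a session id, 301 to the clean URL.
# The trailing "?" in the target drops the query string from the redirect.
RewriteCond %{QUERY_STRING} ^sessionid=[^&]+$ [NC]
RewriteRule ^/?(.*)$ /$1? [R=301,L]
Rel=canonical is the gentler option if the parameter versions still need to work for visitors; a redirect is the stronger signal when they don't.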
-
I'll buy you a beer when you do, Alsvik! How are you fixing the problem, if you don't mind me asking?
-
I worry too, and therefore I fix pages, sorted by page authority. I expect to reach 0 some day in 2017... Yes, you should fix these, but you need to prioritise your errors and warnings. Since Google is my biggest concern, I start by fixing the ones GWT shows me - and then I focus on the Mozbot errors and warnings...
-
Related Questions
-
We 410'ed URLs to decrease URLs submitted and increase crawl rate, but dynamically generated sub-URLs from pagination are showing as 404s. Should we 410 these sub-URLs?
Hi everyone! We recently 410'ed some URLs to decrease the URLs submitted and hopefully increase our crawl rate. We had some dynamically generated sub-URLs for pagination that are showing as 404s in Google. These sub-URLs were canonicalised to the main URLs and not included in our sitemap. For example: we assumed that if we 410'ed example.com/url, then the dynamically generated example.com/url/page1 would also 410, but instead it 404'ed. Does it make sense to go through and 410 these dynamically generated sub-URLs, or is it not worth it? Thanks in advance for your help! Jeff
Intermediate & Advanced SEO | jeffchen
-
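If Jeff did decide to 410 the paginated sub-URLs as well, one rule can catch both the parent and its pagination via mod_rewrite's G (Gone) flag. A minimal sketch, assuming Apache and reusing the placeholder path from the question:
RewriteEngine On
# Return 410 Gone for the removed page and its paginated sub-URLs,
# e.g. /url, /url/page1, /url/page2 ("/url" is just the question's placeholder).
RewriteRule ^/?url(/page[0-9]+)?/?$ - [G,L]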
Creating a site search engine while keeping SEO factors in mind
I run and own my own travel photography business (www.mickeyshannon.com). I've been looking into building a search archive of photos that don't necessarily need to be in the main galleries, as a lot of older photos are starting to really clutter things up and take the emphasis away from the better work. However, I still want to keep these older photos around. My plan is to simplify my galleries and pull out 50-75% of the lesser/older photos. All of these photos will still be reachable by a custom-built, simple search engine that I'm building to house them. The photos will be searchable based on keywords that I attach to each photo as I add it to my website. The question I have is whether this will harm me for having duplicate content? Some of the keywords used in the search archive would be similar or identical to the main gallery names. However, I'm also really trying to push my newer and better images out to the front. I've read some articles that talk about noindexing search keyword results, but that would make it really difficult for search engines to even find the older photos, as searching for their keywords would be the only way to reach them. Any thoughts on a way to work this out that benefits, or at least doesn't hurt me, SEO-wise?
Intermediate & Advanced SEO | msphotography
-
Changing URL structure and keeping the social media likes/shares
Hi guys, we're thinking of changing the URL structure of the tutorials (we call it the knowledgebase) section on our website. We want to make the URLs shorter so they sit closer to the TLD. So, for convenience, we'll call them the old page (www.domain.com/profiles/profile_id/kb/article_title) and the new page (www.domain.com/kb/article_title). What I'm looking to do is change the URL structure but keep the likes/shares we got from Facebook. I thought of two ways to do it and would love to hear which one the community thinks is better.
1. Use rel=canonical. I thought we might rel=canonical to the new page and add a "noindex" tag to the old page. That way, users will still be able to reach the old page, but the juice will still pass to the new page; the old pages will disappear from the Google SERPs and the new pages will start to appear. I understand it will be a pretty long process, but that's the only way the likes will stay.
2. Play with the og:url property. Do the 301 redirect to the new page, but change the og:url property inside that page to the old page URL. It's a bit more tricky but might work.
What do you think? Which way is better, or maybe there is a better way I'm not familiar with yet? Thanks so much for your help! Shaqd
Intermediate & Advanced SEO | ShaqD
-
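For the 301 redirect in option 2, the rewrite itself would be straightforward. A minimal sketch, assuming Apache with mod_rewrite and the URL patterns described above (the og:url part still has to be handled in the page template itself):
RewriteEngine On
# Map the old, deeper knowledgebase URLs onto the new short ones:
#   /profiles/<profile_id>/kb/<article_title>  ->  /kb/<article_title>
RewriteRule ^/?profiles/[^/]+/kb/(.+)$ /kb/$1 [R=301,L]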
Interlinking vs. 'orphaning' mobile page versions in a dynamic serving scenario
Hi there, I'd love to get the Moz community's take on this. We are working on setting up dynamic serving for mobile versions of our pages. During the process of planning the mobile version of a page, we identified a type of navigational links that, while useful enough for desktop visitors, we feel would not be as useful to mobile visitors. We would like to remove these from our mobile version of the page as part of offering a more streamlined mobile page. So we feel that we're making a fine decision with user experience in mind. On any single page, the number of links removed in the mobile version would be relatively few. The question is: is there any danger in “orphaning” the mobile versions of certain pages because links don’t exist pointing to those pages on our mobile pages? Is this a legitimate concern, or is it enough that none of the desktop versions of pages are orphaned? We were not sure whether it’s even possible, in Googlebot’s eyes, to orphan a mobile version of a page if we use dynamic serving and if there are no orphaned desktop versions of our pages. (We also plan to link to "full site" in the footer.) Thank you in advance for your help,
Eric
Intermediate & Advanced SEO | Eric_R
-
Why does my site keep dropping and dropping when it comes to impressions?
It all started very well, but now it is all just going down and down, even though I try to follow the proper guidelines. Could anyone give me some advice if I PM the link?
Intermediate & Advanced SEO | y3dc
-
Rank keeps decreasing - is my site penalized?
Hello, I have run into a bit of a predicament. All of my search terms keep dropping on a monthly basis, even though I am adding quality guest posts every month. Even if I get a handful of articles on semi-popular sites, my rankings still drop. I am wondering: is my site penalized? My metrics also far exceed my rankings, using both the Moz metrics and PageRank. I have a PR of 5 and a domain rank of 50+, and I am still getting outranked on every term by people with lower metrics (PR 2 and DR of 30). In the past I have done mostly article syndication through sites like EzineArticles and iSnare, but that was about 5 years ago. I have also done a couple of the "pay $50 for 100 directory submissions" deals once, but that was also about 5 years ago. Has anyone experienced anything like this? Anyone have any advice? As you can probably tell, I am getting really frustrated. P.S. - This is happening for all pages on my website, not just particular pages. Is it possible to get a site-wide penalty, and if so, what can be done about it?
Intermediate & Advanced SEO | Mjstout
-
Dynamic 301s causing duplicate content
Hi, wonder if anyone can help? We have just changed our site, which was hosted on IIS, and the page URLs were like this (example.co.uk/Default.aspx?pagename=About-Us). The new page URL is example.co.uk/About-Us/ and is using Apache. The 301s our developer told us to use were in this format:
RewriteCond %{REQUEST_URI} ^/Default.aspx$
RewriteCond %{QUERY_STRING} ^pagename=About-Us$
RewriteRule ^(.*)$ http://www.domain.co.uk/About-Us/ [R=301,L]
This seemed to work from a 301 point of view; however, it also seemed to allow both of the URLs below to serve the same page!
example.co.uk/About-Us/?pagename=About-Us
example.co.uk/About-Us/
Webmaster Tools has now picked up on this and is seeing it as duplicate content. Can anyone help with why it would be doing this, please? I'm not totally clued up and our host/developer can't understand it either. Many thanks.
Intermediate & Advanced SEO | GoGroup51
-
Best way to stop pages being indexed while keeping PageRank
On a discussion forum, for example, what would be the best way to stop pages such as the posting page (where a user posts a topic or message) from being indexed, while not diluting PageRank? If we added them to the Disallow list in robots.txt, would PageRank still flow through the links to those blocked pages, or would it stay concentrated on the linking page? Your ideas and suggestions will be greatly appreciated.
Intermediate & Advanced SEO | Peter264