Solving link and duplicate content errors created by a WordPress blog and its tags?
-
SEOmoz tells me my site's blog (a WordPress site) has two big problems: a few pages with too many links, and duplicate content. The problem is that these pages seem legitimate as they are, but obviously I still need to fix the errors, so...
Duplicate content error: this error is a result of being able to browse the blog by tags. Each blog post has multiple tags, so the url.com/blog/tag pages occasionally show the same articles. Does anyone know of a way to avoid being penalized for this? Should I exclude these pages from being crawled/sitemapped?
Too many links error: SEOmoz tells me my main blog pages have too many links (both url.com/blog/ and url.com/blog-2/) - these pages show excerpts of the 6 most recent blog posts. I feel like this should not be an error... does anyone know of a solution that will keep the site from being penalized for these pages?
Thanks!
-
-
Just to follow up on the "too many links" issue Lacy...
That error isn't just related to the number of posts on the page - it counts the total number of links in all locations of the page, added together.
So it's counting the links in your primary navigation, in your sidebar, in your footer - everything.
So if you have a sidebar widget listing all your tags, categories, or even a blogroll, all those links add up. The same goes for a footer full of links.
The reason this is a problem is that, with so many links, the homepage's ability to pass its authority along to the other important pages is diluted by all the less-essential links.
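To see how quickly those locations add up, here's a rough Python sketch (the markup and URLs are made up for illustration) that tallies anchor tags by page region, the way a crawler-based tool would:

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Tally <a href> links by page region (nav, sidebar, footer, content)."""
    def __init__(self):
        super().__init__()
        self.region = "content"
        self.counts = {}

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("nav", "aside", "footer"):
            # Map the structural tag to a human-readable region name
            self.region = {"nav": "nav", "aside": "sidebar", "footer": "footer"}[tag]
        elif tag == "a" and "href" in attrs:
            self.counts[self.region] = self.counts.get(self.region, 0) + 1

    def handle_endtag(self, tag):
        if tag in ("nav", "aside", "footer"):
            self.region = "content"  # back to main content after the region closes

# Hypothetical markup standing in for a blog homepage
html = """
<nav><a href="/">Home</a><a href="/blog/">Blog</a></nav>
<main><a href="/post-1/">Post 1</a><a href="/post-2/">Post 2</a></main>
<aside><a href="/tag/seo/">seo</a><a href="/tag/wordpress/">wordpress</a></aside>
<footer><a href="/about/">About</a></footer>
"""
parser = LinkCounter()
parser.feed(html)
print(parser.counts)                 # → {'nav': 2, 'content': 2, 'sidebar': 2, 'footer': 1}
print(sum(parser.counts.values()))   # → 7, the single total a tool reports
```

On a real blog homepage you'd fetch the HTML and feed it in the same way; it's common for the sidebar and footer regions to account for more links than the post excerpts themselves.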
Make sense?
Paul
-
I've had a client's website with only 20 tags on a 100-page site get punished for duplicate content, so noindex things sooner rather than later.
-
As David says, you can fix your duplicate content issues easily with Yoast's WordPress SEO plugin.
-
You can easily solve the duplicate content issue by installing an SEO plugin such as Yoast's and noindexing the tag pages (you can also noindex categories, archives, etc. if you need to).
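For reference, once tag archives are set to noindex, each tag page should end up with a robots meta tag like this in its head (the exact output varies by plugin and version):

```html
<!-- On url.com/blog/tag/example/ after noindexing tag archives.
     "follow" lets the page keep passing link equity even though
     the page itself drops out of the index. -->
<meta name="robots" content="noindex, follow">
```

This is generally better than blocking the tag pages in robots.txt - a crawl-blocked page can never be refetched, so search engines would never see the noindex directive on it.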
I wouldn't have thought 6 excerpts would trigger a too-many-links error, but if you're confident there aren't too many links, just ignore it.