Solving link and duplicate content errors created by a WordPress blog and tags?
-
SEOmoz tells me my site's blog (a WordPress site) has two big problems: a few pages with too many links, and duplicate content. The trouble is that these pages seem legitimate the way they are, but obviously I need to fix the problem, so...
Duplicate content error: this error results from the blog being browsable by tags. Each blog post has multiple tags, so the url.com/blog/tag pages occasionally show the same articles. Does anyone know a way to avoid being penalized for this? Should I exclude these pages from being crawled and sitemapped?
Too many links error: SEOmoz tells me my main blog pages (both url.com/blog/ and url.com/blog-2/) have too many links. These pages show excerpts of the six most recent blog posts. I feel like this shouldn't be an error... does anyone know of a solution that will keep the site from being penalized for these pages?
Thanks!
-
Just to follow up on the "too many links" issue, Lacy...
That error isn't just about the number of posts on the page; it counts the total number of links in every location on the page combined.
So it's counting up the links in your primary navigation, in your sidebar, in your footer, everything.
If you have a sidebar widget listing all your tags, categories, or even a blogroll, all those links add up. Same with a footer full of links.
The reason this is a problem: with so many links, the homepage's ability to pass its authority along to your other important pages gets diluted by all the less-essential links.
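If you want a quick sanity check on the total, here's a rough PHP sketch that counts every anchor tag on a rendered page. The URL is just a placeholder, and it assumes the page is publicly reachable and allow_url_fopen is enabled:

```php
<?php
// Rough sketch: count every <a href> on a page to see how the nav,
// sidebar, footer, and body links add up. The URL is a placeholder.
$html = file_get_contents('http://url.com/blog/');

$doc = new DOMDocument();
libxml_use_internal_errors(true);  // tolerate imperfect real-world markup
$doc->loadHTML($html);

$total = 0;
foreach ($doc->getElementsByTagName('a') as $a) {
    if ($a->hasAttribute('href')) {
        $total++;
    }
}
echo "Total links on the page: {$total}\n";
```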
Make sense?
Paul
-
I've had a client's website with only 20 tags on a 100-page site get punished for duplicate content, so try to 'nofollow' things sooner rather than later.
-
As David says, you can fix your duplicate content issues easily with Yoast's WordPress SEO plugin. Here's a link:
-
You can easily solve the duplicate content issue by installing an SEO plugin such as Yoast's and noindexing the tag pages (you can also noindex categories, archives, etc. if you need to).
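If you'd rather see what that setting is doing under the hood, here's a minimal sketch of the same idea in plain WordPress. This is a generic hook for a theme's functions.php, not Yoast's actual code:

```php
<?php
// Minimal sketch (generic WordPress, not the Yoast plugin itself):
// emit a noindex,follow robots meta tag on tag archive pages so they
// drop out of the index while the links on them still pass equity.
add_action('wp_head', function () {
    if (is_tag()) {
        echo '<meta name="robots" content="noindex,follow">' . "\n";
    }
});
```

The plugin setting does essentially the same thing, and noindex,follow is generally preferable to blocking the tag pages in robots.txt, because Google can still follow the links on them through to your posts.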
I wouldn't have thought six excerpts would flag a "too many links" error, but if you're confident there aren't too many links, just ignore it.