Duplicate Titles caused by blog
-
Hey, I've done some research and I understand canonical tags and rel=prev/next, but I wanted to get someone's opinion on whether we need them, since the articles are somewhat independent of each other in content (there's a focus on both banks and accountants).
We have over 68 pages of blog material, from
http://www.sageworks.com/blog/default.aspx?page=7
through
http://www.sageworks.com/blog/default.aspx?page=68
Thanks in advance for your help!
-
Unfortunately, we're using BlogEngine.NET and it's not the easiest to navigate. There is a place to enter custom code in the header, but it doesn't specify whether that applies to the archive pages. From what I can tell, the individual stories on each archive page carry nofollow attributes, but the archive page itself does not. Sorry, I'm relatively new to this.
-
Hi Josh, I'm not sure what CMS you are using, but there should be some way to update the template for your archive pages, which will update all of them at once. Just add this code in the <head> section of the template:
<meta name="robots" content="noindex,follow" />
-
Hey Takeshi,
Thanks for the response! How would you go about noindexing the archives? Or should I just canonicalize all the previous blog pages to the first one?
-
Pagination is a somewhat tricky subject. The best solution is usually to include pagination markup (rel=next/prev) along with the appropriate canonical tags to avoid duplicate content. This brief tutorial from Google explains what to do:
http://googlewebmastercentral.blogspot.co.uk/2011/09/pagination-with-relnext-and-relprev.html
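Applied to the URLs above, the <head> of an intermediate archive page (page 7, say) would carry markup along these lines. This is a sketch only; the exact query-string format is assumed from the URLs in the question, so adjust it to whatever your CMS actually emits:

```html
<!-- Hypothetical <head> markup for http://www.sageworks.com/blog/default.aspx?page=7 -->
<link rel="prev" href="http://www.sageworks.com/blog/default.aspx?page=6" />
<link rel="next" href="http://www.sageworks.com/blog/default.aspx?page=8" />
```

The first page of the series omits rel="prev" and the last page omits rel="next", so Google can see where the sequence begins and ends.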
If you really want 68 pages of archives indexed (not recommended post-Panda, but it's up to you), at least have your CMS generate unique titles for each page ("Blog Archive Page 1 | Sageworks", "Blog Archive Page 2 | Sageworks", etc.).
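In other words, each archive page's <title> element would differ by page number, something like this (the title wording itself is just an illustration):

```html
<!-- Page 2 of the archive -->
<title>Blog Archive Page 2 | Sageworks</title>
```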
-
Well, it's true that the articles are independent in content, but these pages are simply the index/teasers for those articles. I'm not sure you need your archive lists indexed so much as the articles themselves, so why not use canonical tags to solve this problem and point them all to page 1 (or whichever page holds the most recent posts)?
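A minimal sketch of that approach, assuming page 1 lives at the bare blog URL (swap in whichever URL actually serves the newest posts):

```html
<!-- Placed in the <head> of archive pages 2 through 68 -->
<link rel="canonical" href="http://www.sageworks.com/blog/default.aspx" />
```

Bear in mind this tells Google to treat the later archive pages as copies of page 1, so only page 1 would remain in the index.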
Hope that helps/makes sense.