Minimising duplicate content
-
From a duplicate-content perspective, is it best to give each blog post a single tag, so that Google doesn't treat the same post appearing under different tag pages as duplicate content? Or doesn't it matter? For example:
http://www.ukholidayplaces.co.uk/blog/?tag=/stay+in+Margate
http://www.ukholidayplaces.co.uk/blog/?tag=/Margate+on+a+budget
Both URLs return the same post.
thanks
-
Hi!
Little late to the party here - thanks Geoff for helping out!!
While creating excerpts for the tag pages would certainly help, I'd suggest doing a crawl of your own site with something like Screaming Frog SEO Spider.
I just did a crawl, and see a bunch of issues needing attention:
- Just about all of your meta descriptions are exactly the same
- Your H1s are all the same
- A bunch of duplicate titles (for example, all the author archive subpages are given the same title)
- I don't see any meta robots or canonical tags in use at all; these would help control which pages get indexed and where link value is consolidated.
- You have tons of meta keywords, mostly duplicates, and the meta keywords tag shouldn't be used at all anymore.
You've got some additional issues to work out besides just the tags thing.
Check Google Webmaster Tools to confirm this as well; it will surface most of what you need to fix.
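As a rough illustration of what a crawler flags, here's a minimal sketch of spotting duplicate meta descriptions across pages using only the Python standard library. The URLs and HTML snippets are hypothetical stand-ins for fetched pages, not your actual site:

```python
# Sketch: group pages by meta description to find duplicates.
# Sample pages below are illustrative, not real crawl data.
from html.parser import HTMLParser
from collections import defaultdict

class MetaAudit(HTMLParser):
    """Collects the meta description from one page's HTML."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "description":
            self.description = a.get("content")

def find_duplicate_descriptions(pages):
    """pages: {url: html}. Returns descriptions shared by more than one URL."""
    by_description = defaultdict(list)
    for url, html in pages.items():
        parser = MetaAudit()
        parser.feed(html)
        by_description[parser.description].append(url)
    return {d: urls for d, urls in by_description.items() if len(urls) > 1}

pages = {
    "/blog/?tag=stay-in-margate": '<meta name="description" content="UK holidays">',
    "/blog/?tag=margate-on-a-budget": '<meta name="description" content="UK holidays">',
    "/blog/about": '<meta name="description" content="About the blog">',
}
print(find_duplicate_descriptions(pages))
```

A real crawler like Screaming Frog does this (and much more) for you, but the principle is the same: any description shared across URLs is a duplication candidate.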
-Dan
-
You're welcome Jonathan.
Have a look at how other successful organisations implement this on their blogs. Take Mashable, for example: their topics pages are essentially listings of what blog articles are tagged with, and they appear to cut their snippets off at about 170 characters.
Also, make sure you're using the canonical link element on blog article pages, to tell search engines which URL is the original and where you want the weight placed.
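As a sketch, the canonical link element sits in the `<head>` of any page that reproduces the post and points at the post's one true URL (the URL here is illustrative, not your actual permalink structure):

```html
<!-- In the <head> of a tag/archive page that reproduces the post -->
<link rel="canonical" href="http://www.ukholidayplaces.co.uk/blog/stay-in-margate/" />

<!-- Optionally, keep thin tag/archive pages out of the index entirely -->
<meta name="robots" content="noindex, follow" />
```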
-
Thanks Geoff,
I wasn't sure after the recent updates.
Copyscape finds loads of matches, but Google didn't....
-
No, assigning multiple tags to pages on your website is good practice (providing they're relevant, of course).
What you should consider is displaying only excerpts on tag and search-result pages, so that they aren't flagged as duplicate content. You don't need to display the entire post on a tag page; a small snippet with a 'Read More' (or similar) link ensures the full original only ever lives at one location: its specific URI.
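A minimal sketch of generating such an excerpt, truncating at a word boundary near a target length (the 170-character limit echoes what Mashable appears to use; the function name and marker text are our own choices):

```python
def excerpt(text, limit=170, more="… Read More"):
    """Truncate a post body at a word boundary near `limit` characters,
    so tag/archive pages never reproduce the full article."""
    if len(text) <= limit:
        return text
    cut = text.rfind(" ", 0, limit)  # last space before the limit
    if cut == -1:                    # no space found: hard cut
        cut = limit
    return text[:cut] + more

post = "Margate is a classic seaside town on the Kent coast. " * 10
print(excerpt(post))
```

In practice the "… Read More" marker would be rendered as a link back to the post's canonical URL, which is exactly what keeps the full content at a single location.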