Minimising duplicate content
-
From a duplicate content perspective, is it best to create all blog posts with a single tag so Google doesn't think the same post being returned via a different tag page is duplicate content? Or doesn't it matter? For example, the URLs below return the same blog post:
http://www.ukholidayplaces.co.uk/blog/?tag=/stay+in+Margate
http://www.ukholidayplaces.co.uk/blog/?tag=/Margate+on+a+budget
Thanks
-
Hi!
Little late to the party here - thanks Geoff for helping out!!
While creating excerpts for the tag pages would certainly be great, I'd also suggest doing a crawl of your own site with something like Screaming Frog SEO Spider.
I just did a crawl, and see a bunch of issues needing attention:
- Just about all of your meta descriptions are exactly the same
- Your H1s are all the same
- A bunch of duplicate titles (because, for example, all the author archive subpages are being given the same title)
- I don't see any meta robots or canonical tags in use at all, which would help you control which pages you want indexed or counted for value (there's a rough example below)
- You have tons of meta keywords, mostly duplicates, and the meta keywords tag should not be used anymore
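As a rough sketch of how several of these points could be handled (the title, description and URL below are placeholders I've made up, not taken from your site), the head of a tag or author archive page might look something like this:

<!-- Hypothetical <head> for a tag archive page such as /blog/?tag=/stay+in+Margate -->
<head>
  <!-- Give each archive page its own title and meta description instead of one shared across the site -->
  <title>Posts tagged "Stay in Margate" | UK Holiday Places Blog</title>
  <meta name="description" content="A short, unique summary of the posts listed under this tag.">
  <!-- Keep the thin archive page out of the index, but let crawlers follow its links -->
  <meta name="robots" content="noindex, follow">
  <!-- No meta keywords tag at all - search engines ignore it -->
</head>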
You've got some additional issues to work out besides just the tags thing.
Check Google Webmaster Tools to confirm this as well - it will show you everything you need to fix!
-Dan
-
You're welcome, Jonathan.
Feel free to look at how a lot of other successful organisations implement this on their blogs. Take Mashable, for example: their topics pages are essentially what blog articles are tagged with, and it looks like they cut their snippets off at about 170 characters.
Also, ensure that you're using the canonical link element on blog article pages to let search engines know that those are the originals and where you want the weight placed.
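For instance, the canonical link element is just a single line in the head of each article page (the post URL below is made up for illustration; it would be each post's real permalink):

<!-- In the <head> of the blog post itself -->
<link rel="canonical" href="http://www.ukholidayplaces.co.uk/blog/margate-on-a-budget/">

That way, no matter how many tag pages link to or excerpt the post, the full article is only ever credited to that one URL.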
-
Thanks Geoff,
I wasn't sure after the recent updates.
Copyscape finds loads of matches, but Google didn't...
-
No, assigning multiple tags to posts on your website is good practice (providing they are relevant, of course).
What you should think about doing is displaying only excerpts on tag / search result pages so that they don't get flagged as duplicate content. You don't need to display the entire post(s) on a tag page; a small snippet with a 'Read More' or similar link will ensure the full original only ever lives at one location: its specific URI.
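For example, one entry on a tag page might be marked up roughly like this (the markup, excerpt and permalink are a sketch, assuming a URL structure the blog may not actually use):

<!-- One entry on a tag archive page: short excerpt only, full post lives at a single URL -->
<article>
  <h2><a href="http://www.ukholidayplaces.co.uk/blog/margate-on-a-budget/">Margate on a Budget</a></h2>
  <!-- Excerpt trimmed to roughly 150-170 characters -->
  <p>Margate has plenty of affordable places to stay, from seafront guesthouses to self-catering flats, if you know where to look...</p>
  <a href="http://www.ukholidayplaces.co.uk/blog/margate-on-a-budget/">Read More</a>
</article>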
Related Questions
-
Will Google Judge Duplicate Content on Responsive Pages to be Keyword Spamming?
I have a website for my small business, and hope to improve the search results position for 5 landing pages. I recently modified my website to make it responsive (mobile friendly). I was not able to use Bootstrap; the layout of the pages is a bit unusual and doesn't lend itself to the options Bootstrap provides. Each landing page has 3 main divs - one for desktop, one for tablet, one for phone. The text content displayed in each div is the same. Only one of the 3 divs is visible; the user's screen width determines which div is visible. When I wrote the HTML for the page, I didn't want each div to have identical text. I worried that when Google indexed the page it would see the same text 3 times and would conclude that keyword spamming was occurring. So I put the text in just one div, and when the page loads jQuery copies the text from the first div to the other two divs. But now I've learned that when Google indexes a page it looks at both the page that is served AND the page that is rendered. And in my case the page that is rendered - after it loads and the jQuery code is executed - contains duplicate text content in three divs. So perhaps my approach - having the served page contain just one div with text content - fails to help, because Google examines the rendered page, which has duplicate text content in three divs. Here is the layout of one landing page, as served by the server:
1000 words of text goes here.
No text. jQuery will copy the text from div id="desktop" into here.
No text. jQuery will copy the text from div id="desktop" into here.
My question is: will Google conclude that keyword spamming is occurring because of the duplicate content the rendered page contains, or will it realize that only one of the divs is visible at a time, and the duplicate content is there only to achieve a responsive design? Thank you!
Web Design | | CurtisB0 -
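For reference, a minimal sketch of the served-then-copied setup described in the question above - one div carries the text in the served HTML and jQuery duplicates it into the other two after load (the IDs, class names and the jQuery dependency are assumptions for illustration):

<!-- Served HTML: only the desktop div contains the text -->
<div id="desktop" class="show-on-desktop">1000 words of text goes here.</div>
<div id="tablet" class="show-on-tablet"><!-- empty until filled by the script --></div>
<div id="phone" class="show-on-phone"><!-- empty until filled by the script --></div>

<script>
  // After load, copy the desktop text into the other two divs, so the rendered
  // DOM (which Google also looks at) contains the same text three times.
  $(function () {
    var text = $('#desktop').html();
    $('#tablet').html(text);
    $('#phone').html(text);
  });
</script>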
Any alternative techniques to display tabbed content without using Javascript / JSON and be SEO Friendly?
John Mueller's input in the EGWMH hangout suggests that Google MAY ignore expandable content served by JavaScript. Are there any alternative techniques to display tabbed content without using JavaScript / JSON that are still SEO friendly? I do, however, view these as good for website interactivity and UX, and I see many examples of websites performing well and ranking highly whilst using these techniques. Are there any Google-friendly ways to serve content on a page so that search bots can recognise and choose to crawl / consume the content as legitimate fodder?
Web Design | | Fergclaw0 -
Parallax, SEO, and Duplicate Content
We are working on a project that uses parallax to provide a great experience to the end user, and we are also trying to create a best-case scenario for SEO. We have multiple keywords we are trying to optimize and multiple pages with the parallax function built into them. Basically, each member of the primary navigation is its own page, with all subpages built below it using the parallax function. Our navigation currently uses the hashbang method to provide custom URLs for each subpage, and the user is appropriately directed to the right section based on that hashbang.
www.example.com/About < This is its own page
www.example.com/about/#/history < This is a subpage that you scroll to on the About page
We are trying to decide what the best method will be for optimizing each subpage, but my current concern is that, because each subpage is really a part of the primary page, all those URLs will be seen as duplicate content. Currently the site can also serve each subpage as its own page, without the parallax function. Should I include those as part of the sitemap? There's no way to navigate to them unless I include them in the sitemap, but I don't want Google to think I'm disingenuous in providing links that don't exist solely for the purpose of SEO; truthfully, all of the content exists and is available to the user. I know that a lot of people are asking these questions, and there really are no right answers yet, but I'm curious about everyone else's experience so far.
Web Design | | PaulRonin2 -
SEO tricks for a one page site with commented html content
Hi, I am building a website that is very similar to madebysofa.com: it is a one-page site with the entire content loaded (though commented out in the HTML), and clicking on sections modifies the DOM to make the specific section visible. It is very interesting from a UX point of view, but as far as I know, since most of my content is always commented out and hidden from crawlers this way, I will lose points regarding SEO. Is there any workaround you can recommend, or do you think sites like madebysofa.com are doomed to lose SEO points by nature? Best regards,
Web Design | | Ashkan10 -
Hi Everybody. I have a large site that is made up of the main site plus a large support site. The support site has a lot of overlapping content and similar titles. Would it be beneficial to separate the two? Thank you. All answers appreciated.
Web Design | | arithon0 -
Duplicate Page Content
Currently experiencing duplicate pages for all hotel pages. What would be the recommendation to fix the pop-up pages that use JavaScript?
http://www.solmelia.com/hoteles/espana/tenerife/redlevel-at-gran-melia-palacio-de-isora/en/visor.html?pest=fotos
http://www.solmelia.com/hoteles/espana/tenerife/redlevel-at-gran-melia-palacio-de-isora/en/visor.html?pest=localizacion
http://www.solmelia.com/hoteles/espana/tenerife/redlevel-at-gran-melia-palacio-de-isora/en/visor.html?pest=panorama
http://www.solmelia.com/hoteles/espana/tenerife/redlevel-at-gran-melia-palacio-de-isora/en/visor.html?pest=tourVisual
Web Design | | Melia0 -
The SEOMoz crawl report shows duplicate content and a duplicate title for these two URLs: http://freightmonster.com/ and http://freightmonster.com/index.html. How do I fix this?
What page is attached to http://freightmonster.com/ if it is not index.html? Should I do a redirect from the index page to something more descriptive?
Web Design | | FreightBoy1 -
Using "#" anchors to display different content
If I have a page that has an area that acts like a widget with three different tabs, those tabs provide 3 different types of information relevant to the page's subject matter. By default, when someone goes to the page one of the tabs is showing, but you have to click on the others to see the info on them. Is it OK to use domain.com/topic#TAB1, domain.com/topic#TAB2, domain.com/topic#TAB3 to create shortcut links so that people can land on the page with that predetermined tab showing? I'm wondering what search engines might think. Essentially all the content of all three tabs is there for people to see, but they'd have to click to see the other tabs. I don't consider the content to be hidden, but I'd like to hear people's thoughts.
Web Design | | Business.com0
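For illustration, here is a minimal sketch of the kind of tabbed setup described in the question above: all three tabs' content is present in the served HTML, and a small script reads the #fragment on load so that links like domain.com/topic#TAB2 land with that tab showing (the IDs and markup are invented, not taken from any actual site):

<!-- All tab content is in the served HTML; the first panel shows by default -->
<div id="TAB1" class="tab-panel">First tab's content...</div>
<div id="TAB2" class="tab-panel" hidden>Second tab's content...</div>
<div id="TAB3" class="tab-panel" hidden>Third tab's content...</div>

<script>
  // If the URL ends in #TAB2 or #TAB3, reveal that panel instead of the default one.
  var target = document.getElementById(location.hash.slice(1));
  if (target && target.classList.contains('tab-panel')) {
    document.querySelectorAll('.tab-panel').forEach(function (panel) {
      panel.hidden = (panel !== target);
    });
  }
</script>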