Importance of Unique Content Location in Source Code
-
How much does Google value the placement of unique content in the source code versus where it is visually displayed? In my case, the unique content displays high on the page for the user, but in the source code it sits below duplicate-style content that appears across many other domains (think e-commerce category thumbnails on the left side of the screen, with the unique content filling 80% of the right side).
I have the impression these pages are at a disadvantage because the unique, quality content sits lower in the source code. Any thoughts on this?
-
Unfortunately, this issue has been debated for the past few years and there's no clear-cut answer. You might like to take a look at this staff-endorsed answer from 2012, which indicated that Google will look at how much unique content is above the fold. If your content is buried deep in the source code and duplicate copy comes first, then this _could_ be an issue.
If you can place your unique content higher up within the source code, that certainly wouldn't hurt, but I suspect there are usually more pressing things to optimise.
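If you want a rough sense of how deep your unique copy actually sits, a quick sanity check is to measure where a page-specific phrase first appears in the raw HTML. This is only a diagnostic sketch, not anything Google or Moz provides; the sample markup and phrase below are made up for illustration:

```python
# Sketch: estimate how far down the raw HTML a unique snippet first appears.
# 0.0 means it starts at the very top of the source, values near 1.0 mean
# it sits below most of the (possibly duplicated) markup.

def source_position_ratio(html: str, unique_snippet: str) -> float:
    """Return the snippet's byte offset as a fraction of document length."""
    index = html.find(unique_snippet)
    if index == -1:
        raise ValueError("snippet not found in source")
    return index / len(html)

# Hypothetical page: duplicate category markup first, unique copy near the end.
page = (
    "<html><body>"
    + "<nav>" + "category thumb " * 50 + "</nav>"
    + "<main>Unique product analysis lives here.</main>"
    + "</body></html>"
)

ratio = source_position_ratio(page, "Unique product analysis")
print(f"Unique content starts {ratio:.0%} of the way into the source")
```

Run against your real pages (e.g. fetched with `urllib.request`), a consistently high ratio would confirm the pattern you describe; whether moving the content up (for instance by putting it first in the DOM and handling the left-hand column purely with CSS) moves the needle is, as noted, unproven.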
Sorry there's no '100% yes!' on this, but in my opinion this has always been one of the vaguest areas of SEO, especially as Google has become more adept at crawling in the past few years (JavaScript support, etc.).