Technical Automated Content - Indexing & Value
-
One of my clients provides Financial Analysis tools which generate automated content on a daily basis for a set of financial derivatives. Essentially, they try to estimate through technical means whether a particular share price is going up or down during the day, as well as its support and resistance levels.
These tools are fairly popular with visitors; however, I'm not sure about the 'quality' of the content from a Google perspective. They keep an archive of these tools which tallies up to nearly 100 thousand pages, and what bothers me particularly is that the content between each of these varies only slightly.
Textually there are maybe 10-20 different phrases which describe the move for the day; otherwise the page structure is similar, except for the values which are expected to be reached each day. They believe it could be useful for users to access back-dated information, to see what happened in the past. The main issue, however, is that there are currently no backlinks at all to any of these pages, and I assume Google could deem them 'shallow': they provide little content, which becomes irrelevant as time passes. I'm also not sure whether this could cause a duplicate content issue, although they already add a date in the title tags and in the content to differentiate them.
I am not sure how I should handle these pages. Is it possible to have Google prioritize the 'daily' published one? Say I published one today: if I searched for "Derivative Analysis" I would want to see the one dated today, rather than the 'list view' or any older analysis.
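One common way to signal which page is the current one is to list each day's analysis in an XML sitemap with an accurate `<lastmod>` date. This is a minimal sketch only; the URL scheme and date below are hypothetical, not taken from the site in question:

```python
from datetime import date
from xml.sax.saxutils import escape


def sitemap_entry(url: str, lastmod: date) -> str:
    """Build a single <url> element for an XML sitemap."""
    return (
        "<url>"
        f"<loc>{escape(url)}</loc>"
        f"<lastmod>{lastmod.isoformat()}</lastmod>"
        "</url>"
    )


# Hypothetical URL scheme for the daily analysis pages.
today = date(2014, 6, 2)
entry = sitemap_entry(
    f"https://example.com/derivative-analysis/{today.isoformat()}", today
)
print(entry)
```

Regenerating the sitemap daily, with only today's entry carrying a fresh `<lastmod>`, gives crawlers a clear hint about which of the near-identical pages is current.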
-
I would appreciate some more feedback. I'm looking to group some of these pages; from the 100k we're bringing it down to around 33k.
As regards comments, I'm not sure that's very feasible: from the research we did, not many people go into back-dated entries, so it's highly doubtful we'd receive many, if any, comments.
-
Right, I guess that's true, as we still rank for other terms. However, there are concerns that this could affect the Domain Rank (I don't think that's the case). We've decided to drop at least a third of these 'automated pages' by loading them via AJAX; this way there should be a bit less in the Google index.
-
If certain areas of the website have duplicate content, Google will only ignore those pages which contain the duplication; the effect will never be on the complete website!
-
I don't exactly need all the content to be deemed unique; what I'm more interested in is making sure that this content does not penalize the rest of the website. It's fine if it's ignored by Google once it's more than a week or two old. What we don't want is old results coming up when today's value is far more interesting.
I'd be happy if Google would prioritize the 'daily' posts more in relation to 'freshness'.
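If the archive really is expendable after a week or two, one option is to emit a `noindex` robots meta tag on any analysis page older than a cutoff, so only fresh pages compete in the index while users can still browse the archive. This is a sketch under my own assumptions: the 14-day cutoff and the function name are hypothetical, chosen to match the "week or two" mentioned above:

```python
from datetime import date, timedelta

# Hypothetical cutoff based on "a week or two".
STALE_AFTER = timedelta(days=14)


def robots_meta(published: date, today: date) -> str:
    """Return the robots meta tag an archive page should carry."""
    if today - published > STALE_AFTER:
        # Old analysis: keep it accessible to users, but out of the index.
        return '<meta name="robots" content="noindex, follow">'
    # Fresh analysis: index normally.
    return '<meta name="robots" content="index, follow">'


print(robots_meta(date(2014, 5, 1), date(2014, 6, 2)))  # back-dated page
print(robots_meta(date(2014, 6, 2), date(2014, 6, 2)))  # today's page
```

Keeping `follow` in both cases means link equity still flows through the archive pages even once they drop out of the index.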
-
In my personal opinion, slightly varied content can count as duplicate content, mainly because the major percentage of the content on the different pages is the same...
Given how you explain the content is generated, I don't think there is a way you can change the pages so that each becomes unique from the others, and adding unique content to each page is not a very good idea when there are around 100 thousand pages, as you said earlier!
If I were in your place, I would add a comment section below the content, so that users who are interested can share their experience: how this data helped them, and what exactly happened in the market. This user-generated content will help the upcoming pages to be unique.
This idea will help, to an extent, to give new life to old pages, but making all pages unique is next to impossible in my eyes!
Obviously, these are just my suggestions, but I would love to hear what others would do if they went through a similar situation!