What is the critical size at which a content farm ends up on Google's radar?
-
We're looking at building a content farm as an igniter for another site, so there will be some duplicate content. Is this a good or a bad strategy in terms of SEO?
-
If you have original and valuable content then put it on your main site.
-
Is it that bad? Would it be good if the content were original and valuable?
-
Bad.
Why build right in the cross-hairs of Google?
I would rather do almost anything else.
Related Questions
-
Does Google consider direct traffic on pages with rel canonical tags?
Hi community, let's say there is a duplicate page (A) pointing to the original page (B) using a rel canonical tag. PageRank will be passed from page A to B, as the content is very similar and Google hopefully honours the tag. I wonder how Google treats direct traffic on the duplicate page A. We know that direct traffic is also an important ranking factor (correct me if I'm wrong). If direct traffic is high on the duplicate page A, how does Google consider it? Will any score be given to the original page B? Thanks
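For reference, the canonical declaration being discussed is a single link element in the head of duplicate page A. Below is a minimal, hypothetical sketch using only Python's standard library, showing how a crawler-style parser would read it (the example.com URL is a placeholder):

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collect the href of any <link rel="canonical"> tag in a page."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")

# Hypothetical duplicate page A declaring page B as the original:
page_a = """
<html><head>
<link rel="canonical" href="https://www.example.com/page-b">
</head><body>duplicate content...</body></html>
"""

finder = CanonicalFinder()
finder.feed(page_a)
print(finder.canonical)  # https://www.example.com/page-b
```

Google treats the tag as a strong hint rather than a directive, so the closer the content on A and B, the more likely it is to be honoured.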
Algorithm Updates | vtmoz -
Do back-links go to waste when the anchor text or surrounding content doesn't match the page content?
Hi community, I have seen a number of back-links where the content in the link doesn't match the page content. For example, page A links to page B, but the content isn't really relevant beyond the brand name: say a page about "vertigo tiles" links to a page about "vertigo paints", where "vertigo" is the brand name. Will these kinds of back-links be completely wasted? I have also found some broken links, which I'm planning to redirect to existing pages just to reclaim the back-links, even though the content relevancy is low beyond the brand name. Are these back-links beneficial or not? Thanks
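On reclaiming the broken back-links: the usual mechanism is a 301 redirect from each dead URL to the closest relevant live page. A minimal sketch, assuming an Apache .htaccess file; the paths are hypothetical placeholders:

```apache
# Hypothetical .htaccess rules: permanently redirect dead URLs that still
# have inbound links to the nearest relevant live page.
Redirect 301 /vertigo-tiles-old-review /vertigo-tiles
Redirect 301 /discontinued-range       /products
```

The more topically related the redirect target is to the dead page, the more likely the link equity is to count; redirecting everything to the homepage tends to be treated as a soft 404.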
Algorithm Updates | vtmoz -
Google Keyword Tool
I was quite happy with the Google Keyword Tool for basic and accurate keyword searches. Can anyone suggest a new tool that gives accurate country-specific search volume on Google? I am not interested in info for AdWords, and I find the Keyword Planner tool way off in traffic results compared to the Keyword Tool. Is the Keyword Tool completely gone?
Algorithm Updates | summer300 -
New Google Update In The Past Two Days???
Was there a new Google update in the past couple of days? Traffic on my test site has gone from ~1,000 per day to over 4,000 per day for no particular reason. Most of the traffic is still coming from Google and is not the result of any major new links. My keyword rankings also appear to be the same...
Algorithm Updates | Humanovation -
Weekly traffic from Google: How do you explain this?
Hello, I have a question for you. Please have a look at the attached image from my website analytics, which shows the unique-visits trend over the last 2 months. What is interesting is that every Monday Google brings me more traffic than any other day of the week, whereas on Saturdays it gives me the lowest traffic. And it looks like a pretty regular weekly pattern. Why is Google doing that? What does it mean? Why such a clear and steady pattern? I am eager to know your thoughts about this! CGuULrN.jpg
Algorithm Updates | fablau -
Google Panda - large domain benefits
Hi, a bit of a general question, but has anyone noticed an improvement in rankings for large domains, i.e. well-known, large sites such as Tesco and Amazon? From what I've seen, the latest Panda update seems to favour the larger sites as opposed to smaller, niche sites. Just wondered if anyone else has noticed this too? Thanks
Algorithm Updates | Digirank -
Need some Real Insight into our SEO Issue and Content Generation
We have our site www.practo.com and our blog at blog.practo.com. In a month's time our main site will be www.ray.practo.com. The issue: I will then need to direct all existing traffic from www.practo.com to www.ray.practo.com. Keeping SEO in mind, and since I will be generating new content via our WordPress instance, what is the best way to do this so that Google has no difficulty finding the content? 1. Would it be good if I put the WordPress instance in a directory, as ray.practo.com/blog/article-url? 2. Or would it be better as www.practo.com/ray/blog/article-url? I am using WordPress to roll out all our new SEO-focused content on the various keywords and topics for which we want traffic; the primary reason is that we needed a content-generation CMS so that we don't have to deal with HTML pages and publish every content page via a developer. Is what I am planning correct from an SEO standpoint? Any suggestions are welcome. I seriously need to know: is writing SEO-focused content on a WordPress instance, with those URLs, a good idea, or is plain HTML the only good option? Either way we need a CMS so that content writers can work independently. Please guide accordingly. Thanks
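On the traffic-migration point: the standard mechanism for sending all existing traffic (and link equity) from one host to another is a site-wide 301 redirect that preserves paths one-to-one. A minimal sketch, assuming Apache with mod_rewrite is available; adapt the host names if the setup differs:

```apache
# Hypothetical .htaccess on www.practo.com: 301-redirect every request
# to the same path on the new host, preserving URLs one-to-one.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.practo\.com$ [NC]
RewriteRule ^(.*)$ https://www.ray.practo.com/$1 [R=301,L]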
Algorithm Updates | shanky1 -
Removing a secure subdomain from the Google index
We've noticed over the last few months that Google is not honoring our main website's robots.txt file. We have added rules to disallow secure pages such as:
Disallow: /login.cgis
Disallow: /logout.cgis
Disallow: /password.cgis
Disallow: /customer/*
We have noticed that Google is crawling these secure pages and then duplicating our complete ecommerce website across our secure subdomain in the Google index (duplicate content): https://secure.domain.com/etc. Our webmaster recently implemented a specific robots.txt file for the secure subdomain to disallow everything:
User-agent: *
Disallow: /
However, these duplicated secure pages remain in the index. My question is: should I request that Google remove these secure URLs through Google Webmaster Tools? If so, is there any potential risk to my main ecommerce website? We have 8,700 pages currently indexed in Google and would not want to risk any ill effects to our website. How would I submit this request in the URL Removal tool specifically? Would inputting https://secure.domain.com/ cover all of the URLs? We do not want any secure pages in the index; all secure pages are served on the secure.domain example. Please private message me for specific details if you'd like to see an example. Thank you
Algorithm Updates | marketing_zoovy.com
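Two details worth separating here. First, robots.txt is per-host: the rules on the main www site never apply to the secure subdomain, so the subdomain needs its own file (as your webmaster implemented). Second, disallowing crawling does not by itself remove URLs that are already indexed; that is what the removal request (or a noindex directive) is for. Python's standard-library robotparser can sanity-check what a given robots.txt actually blocks; a minimal sketch, with secure.example.com standing in for the real subdomain:

```python
from urllib import robotparser

# Hypothetical robots.txt served at https://secure.example.com/robots.txt.
# "Disallow: /" blocks crawling of every path on that one host only;
# the main domain's robots.txt governs www separately.
rules = """\
User-agent: *
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# With "Disallow: /" in place, nothing on the subdomain may be fetched:
print(rp.can_fetch("*", "https://secure.example.com/product/123"))  # False
print(rp.can_fetch("*", "https://secure.example.com/"))             # False
```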