Content, for the sake of the search engines
-
We all know the importance of quality content for SEO: writing for the user rather than for the search engines. Copywriting for SEO used to mean treading the line between readability and keyword density, which is obviously no longer the case.
So, my question is this: for a website that doesn't require a great deal of content to be successful and to fulfil the needs of its users, should we still be creating relevant content for the sake of SEO?
For example, should I be creating content which is crawlable but may not actually be needed or accessed by the user, in order to help improve rankings?
Food for thought
-
Assuming I'm not cloaking any content, how would the search engines know it's for them rather than for users? Essentially I'd be adding relevant content which, as far as users are concerned, is superfluous.
I guess my point is, should I create content for users who are never going to read it, for the purposes of SEO?
Thanks
-
Generally, the search engines don't want to see content that exists just for them and not for users, nor do they want to be shown one version of the content while users see a different version (a practice known as cloaking).
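For illustration, here is a minimal sketch of what user-agent cloaking looks like in practice, written as a hypothetical Python/Flask handler (the crawler tokens and page copy are placeholders, not real detection logic). This is the anti-pattern the engines penalise, not something to deploy:

```python
# Anti-pattern sketch only: serving crawlers different content than users.
# The crawler tokens and page copy below are hypothetical placeholders.
from flask import Flask, request

app = Flask(__name__)

CRAWLER_TOKENS = ("googlebot", "bingbot")  # assumed identifiers, not exhaustive

@app.route("/")
def home():
    ua = request.headers.get("User-Agent", "").lower()
    if any(token in ua for token in CRAWLER_TOKENS):
        # A keyword-stuffed version shown only to crawlers -- this is cloaking.
        return "<p>best widgets cheap widgets buy widgets widgets online</p>"
    # The version real visitors see.
    return "<p>Welcome! Browse our range of widgets.</p>"
```

Adding genuinely relevant copy, by contrast, involves no such branch: everyone gets the same page, which is exactly why it's safe.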
-
I have done on-page optimization for my website. I want to target my main keyword "healthy breakfast". I'm writing 5+ articles weekly, most of them 500+ words. Could you please tell me whether my on-page optimization is complete or needs more work, and also suggest smart optimization techniques to get a good rank?
So you can check my on-page work and suggest optimization tips, I'm sharing my site link:
http://ahealthybreakfastfood.com
Hoping for good answers.
-
Less than 500 words on average on each page; fairly well optimised in terms of internal links, keyword density, etc., conforming to most best practices.
However, almost all of the content is static, so I'm concerned that we're not getting much in the way of fresh content; hence my question about creating content just for the search engines.
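As a rough yardstick, here is a small sketch of how you might audit the word count and single-keyword density of a page (Python; assumes the requests and beautifulsoup4 packages are installed, and the URL and keyword are placeholders):

```python
# A rough audit of word count and keyword density for a single page.
# Assumes `requests` and `beautifulsoup4` are installed; the URL and
# keyword below are placeholders, and only one-word keywords are handled.
import re
import requests
from bs4 import BeautifulSoup

def keyword_density(url, keyword):
    """Return (total_words, density_percent) for `keyword` at `url`."""
    html = requests.get(url, timeout=10).text
    text = BeautifulSoup(html, "html.parser").get_text(" ")
    words = re.findall(r"[a-z0-9']+", text.lower())
    hits = words.count(keyword.lower())
    density = 100.0 * hits / len(words) if words else 0.0
    return len(words), density

total, density = keyword_density("https://example.com/", "widgets")
print("%d words, keyword density %.2f%%" % (total, density))
```

Run against each template page, this gives a quick baseline for spotting pages that are thin relative to the rest of the site.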
-
Is the on-page SEO as tight as it can be in terms of targeting? Approximately how many words are currently on each page?
-
We should always be creating new, relevant content for our sites. Obviously don't overdo it, and don't write for the search engines alone... but if you have pages lacking much content that could better serve your users with some copy added, then by all means go ahead and write something up. Look for underdeveloped pages that could be perfect for attracting a long-tail term you haven't put much love into, or expand a niche page with something insightful or interesting where you may previously have taken the page for granted and assumed no one needed an explanation.
-
Totally see your point, and I agree. However, what if I'm looking to be proactive and improve my rankings?
Competition is quite high, and the site in question receives decent volumes of traffic, but not necessarily for some of the search terms I want to target.
Thanks!
-
If the content is not needed for rankings due to low competition, and is not of benefit to the user, then I would not create additional content just for the engines unless you see yourself slipping in the rankings.