Can Page Content & Description Have Same Content?
-
I'm studying my crawl report and there are several warnings regarding missing meta descriptions.
My website is built in WordPress and part of the site is a blog.
Several of these missing-description warnings relate to blog posts. Could I copy the first few lines of each post into its meta description, or would that be considered duplicate content?
Also, there are a few warnings that relate to blog index pages, e.g. http://www.iainmoran.com/2013/02/ - I don't know if I can even add a description to these, as I think they are dynamically created?
While on the subject of duplicate content: if I had a sidebar with the same information on several pages (the content coming from a WP widget), would that be considered duplicate content, and would Google penalise me for it?
I'd really appreciate some thoughts on this, please.
Thanks,
Iain.
-
Thanks, Tom - I'll have a go at that and make an actual robots.txt file and upload it.
It is odd, though: when I was creating my WP pages there were Yoast options for each page, and several of them I set to noindex. Looking at the virtual robots.txt, however, that isn't reflected. My file just has:
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Thanks again for all your help,
Iain.
-
http://wordpress.org/support/topic/robotstxt-file-4
That's about the only thing I can find on it. Hope you can glean some use out of it. Seems rather complicated for such an easy task.
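If you do end up making a physical file, it's worth knowing that WordPress only serves the virtual robots.txt when no real file exists in the site root, so a hand-made one simply takes over with no conflict. A minimal sketch, keeping the WordPress defaults - the date-archive line is just an assumed example of blocking those index pages, so adjust the pattern to your own URLs:

```
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/

# Assumed example: block date-based archive indexes such as /2013/02/
Disallow: /2013/
```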
-
Cheers Tom,
Yeah, it is rather strange. There doesn't appear to be any other plugin that could be causing this. Yoast is certainly the only one relating to SEO.
Iain.
-
I think this is where I run out of useful things to add. That seems very odd to me.
Do you have any other plugins active that might be producing a robots.txt file?
-
Thanks Tom,
When I click Edit Files in Yoast it says:
"If you had a robots.txt file and it was editable, you could edit it from here."
And yet, I do have one (albeit, it appears, a virtual one), as it can be viewed here:
http://www.iainmoran.com/robots.txt
If I try to view the site files on the server, via FTP or cPanel, there is no robots.txt file there!
I appear to be using the latest version of Yoast.
Thanks,
Iain.
-
Hey Iain,
There is a way to edit the file with Yoast. It should have a section called Edit Files when you click on the "SEO" part on the left-hand side of your WordPress dashboard. Once in there, you should see robots.txt at the top. If you don't see it, you might need to upgrade to the newest version of Yoast.
Thanks,
Tom
-
Thanks so much for your reply, Tom - very useful indeed!
I'm using Yoast SEO for WordPress, which apparently creates a virtual robots.txt, and I can't see any way to edit it as such. Unlike the posts themselves, which I can set to "noindex", I can't do the same for the dynamic pages.
I could make my own robots.txt and upload it to my server, but I'm concerned that it would confuse matters and conflict with the one created/managed by Yoast?
Thanks again,
Iain.
-
Hey Iain,
I would write custom meta descriptions if possible. Meta descriptions are generally used to "sell" the content. They don't have any effect on ranking, and if you don't feel like adding custom copy, Google will just display the first content from the page automatically. It is not considered duplicate content.
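For reference, the description ends up as a plain meta tag in the page head - Yoast writes it out from whatever you put in the per-post description field (the copy below is only placeholder text):

```html
<head>
  <!-- Aim for roughly 150-160 characters of unique copy per page -->
  <meta name="description" content="Placeholder: a short, unique summary written to earn the click.">
</head>
```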
I would also probably remove those blog index pages from your XML sitemap and noindex them, if you can, with meta robots or robots.txt. Those will produce duplicate content, and you really want to drive people and bots to the posts themselves, not the index pages.
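To illustrate the meta robots option (the tag below is the standard robots meta tag; which template file it belongs in depends on your theme - date.php is a common assumption for date archives):

```html
<!-- On the date-archive template: drop the page from the index but keep its links crawlable -->
<meta name="robots" content="noindex, follow">
```

One caveat: if you instead block the archives in robots.txt (e.g. `Disallow: /2013/`), Google can't crawl those pages and so will never see a noindex tag on them, which makes the meta robots route generally the safer of the two.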
I also wouldn't worry about the sidebar. As long as you are providing a decent amount of unique content on each page, you will be fine.
Hope that helps.
Site looks good!
Tom