Can Page Content & Description Have Same Content?
-
I'm studying my crawl report and there are several warnings regarding missing meta descriptions.
My website is built in WordPress and part of the site is a blog.
Several of these missing-description warnings relate to blog posts, and I was wondering: can I copy the first few lines of each post into its meta description, or would that be considered duplicate content?
Also, there are a few warnings that relate to blog index pages, e.g. http://www.iainmoran.com/2013/02/ - I don't know if I can even add a description to these, as I think they are dynamically created.
While on the subject of duplicate content: if I had a sidebar with the same information on several of the pages (the content coming from a WP widget), would this still be considered duplicate content, and would Google penalise me for it?
Would really appreciate some thoughts on this, please.
Thanks,
Iain.
-
Thanks, Tom - I'll have a go at that and make an actual robots.txt file and upload it.
It is odd though: when I was creating my WP pages there were Yoast options for each page - several of which I set to noindex - yet looking at the virtual robots.txt, this isn't reflected there. My file just has:
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
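(As an aside, I gather Yoast's per-page noindex option is meant to come out as a meta tag in the head of each page rather than as a rule in robots.txt - something like <meta name="robots" content="noindex,follow" /> - so perhaps robots.txt isn't where I should be looking for those settings.)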
Thanks again for all your help,
Iain.
-
http://wordpress.org/support/topic/robotstxt-file-4
That's about the only thing I can find on it. Hope you can glean some use out of it. Seems rather complicated for such an easy task.
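From what I understand, WordPress builds that virtual robots.txt on the fly (which is why no physical file shows up over FTP), and plugins like Yoast just hook into it. If you ever want to add your own rules without uploading a real file, a filter in the theme's functions.php along these lines should do it - a rough sketch only, with a placeholder path:

add_filter( 'robots_txt', function ( $output, $public ) {
    // Append extra rules to the virtual robots.txt that WordPress generates.
    // '/example-path/' is just a placeholder - swap in whatever you actually want to block.
    $output .= "Disallow: /example-path/\n";
    return $output;
}, 10, 2 );

As far as I can tell, a physical robots.txt uploaded to the site root simply takes over from the virtual one, so the two shouldn't conflict.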
-
Cheers Tom,
Yeah, it is rather strange. There doesn't appear to be any other plugin that could be causing this - Yoast is certainly the only one relating to SEO.
Iain.
-
I think this is where I run out of useful things to add. That seems very odd to me.
Do you have any other plugins active that might be producing a robots.txt file?
-
Thanks Tom,
When I click Edit Files in Yoast it says:
"If you had a robots.txt file and it was editable, you could edit it from here."And yet, I do have one (albeit it appears a virtual one) as it can be viewed here:
http://www.iainmoran.com/robots.txtIf I try to view the site files on the server, via FTP or CPanel, there is no robots.txt file there!
I appear to be using the latest version of Yoast.
Thanks,
Iain.
-
Hey Iain,
There is a way to edit the file with Yoast. It should have a section called Edit Files when you click on the "SEO" part on the left-hand side of your WordPress dashboard. Once in there you should see robots.txt at the top. If you don't see it, you might need to upgrade to the newest version of Yoast.
Thanks,
Tom
-
Thanks so much for your reply, Tom - very useful indeed!
I'm using Yoast SEO for WordPress, which apparently creates a virtual robots.txt, and I can't see any way to edit it as such. I can set the posts themselves to "noindex", but I can't do that for the dynamic pages.
I could make my own robots.txt and upload it to my server, but I'm concerned that it would confuse matters and conflict with the one created/managed by Yoast.
Thanks again,
Iain.
-
Hey Iain,
I would do custom meta descriptions if possible. Meta descriptions are generally used to "sell" the content. They don't have any effect on ranking, and if you don't feel like adding custom copy to them, Google will just pull the first content from the page automatically. It is not considered duplicate content.
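If you do write them, the description just ends up as a meta tag in the head of each post - Yoast gives you a field for it per post/page. Purely as an illustration (the wording here is made up), the output looks something like:

<meta name="description" content="A one- or two-sentence summary written to earn the click, rather than a straight copy of the post's opening lines." />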
I would also probably get rid of those blog index pages from your XML sitemap and noindex them if you can, with meta robots or robots.txt. Those will produce duplicate content, and you really want to drive people and bots to the posts themselves, not the index pages of the posts.
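For the date archives specifically, either route is straightforward. In robots.txt it would just be extra Disallow lines - the path below is only an example, and this only makes sense if your post permalinks don't also start with the year:

User-agent: *
Disallow: /2013/

Or, if you'd rather use meta robots, a small snippet in the theme's functions.php along these lines should cover every date archive (a sketch, not something I've tested on your setup):

add_action( 'wp_head', function () {
    // On date-based archive pages, ask search engines not to index the page
    // but still follow the links through to the individual posts.
    if ( is_date() ) {
        echo '<meta name="robots" content="noindex,follow" />' . "\n";
    }
} );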
I also wouldn't worry about the sidebar. As long as you are providing a decent amount of unique content on each page, you will be fine.
Hope that helps.
Site looks good!
Tom