Can Page Content & Description Have Same Content?
-
I'm studying my crawl report and there are several warnings regarding missing meta descriptions.
My website is built in WordPress and part of the site is a blog.
Several of these missing-description warnings relate to blog posts. Could I copy the first few lines of each post into its meta description, or would that be considered duplicate content?
Also, there are a few warnings that relate to blog index pages, e.g. http://www.iainmoran.com/2013/02/ - I don't know whether I can even add descriptions to these, as I think they are dynamically generated.
While on the subject of duplicate content: if several of my pages shared the same sidebar information, with the content coming from a WP widget, would that also be considered duplicate content, and would Google penalise me for it?
I would really appreciate some thoughts on this, please.
Thanks,
Iain.
-
Thanks, Tom - I'll have a go at that and make an actual robots.txt file and upload it.
It is odd, though: when I was creating my WP pages there were Yoast options for each page, several of which I set to noindex, yet looking at the virtual robots.txt, this isn't the case. My file just has:
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/

Thanks again for all your help,
Iain.
-
http://wordpress.org/support/topic/robotstxt-file-4
That's about the only thing I can find on it. Hope you can glean some use out of it. It seems rather complicated for such an easy task.
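One thing worth noting: as far as I know, Yoast's per-page noindex option writes a meta robots tag into each page's head rather than anything into robots.txt, which would explain why your virtual file never changes. WordPress builds that virtual file in do_robots() and passes the output through a robots_txt filter, so a few lines in a theme's functions.php can append rules to it. A rough sketch, assuming you wanted to block a date archive (the Disallow path is just an example, not something Yoast adds):

<?php
// Rough sketch: append a rule to WordPress's virtual robots.txt.
// Uses the core robots_txt filter that do_robots() applies to its output.
add_filter( 'robots_txt', function ( $output, $public ) {
    // $public reflects the "allow search engines" privacy setting;
    // only add crawl rules when the site is meant to be indexed.
    if ( $public ) {
        $output .= "Disallow: /2013/02/\n"; // example: a date-archive path
    }
    return $output;
}, 10, 2 );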
-
Cheers Tom,
Yeah, it is rather strange. There doesn't appear to be another plugin that could be causing this; Yoast is certainly the only one relating to SEO.
Iain.
-
I think this is where I run out of useful things to add. That seems very odd to me.
Do you have any other plugins active that might be producing a robots.txt file?
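If nothing obvious turns up in the plugin list, one thing you could try is dumping whatever is hooked onto WordPress's robots_txt filter. A rough debug sketch, assuming you're comfortable dropping a few lines into functions.php temporarily (load any page, note the output, then remove it):

<?php
// Rough debug sketch: print every callback attached to the robots_txt
// filter, to see which plugin is shaping the virtual file.
add_action( 'wp_footer', function () {
    global $wp_filter;
    if ( isset( $wp_filter['robots_txt'] ) ) {
        echo '<pre>';
        print_r( $wp_filter['robots_txt'] );
        echo '</pre>';
    }
} );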
-
Thanks Tom,
When I click Edit Files in Yoast it says:
"If you had a robots.txt file and it was editable, you could edit it from here."And yet, I do have one (albeit it appears a virtual one) as it can be viewed here:
http://www.iainmoran.com/robots.txtIf I try to view the site files on the server, via FTP or CPanel, there is no robots.txt file there!
I appear to be using the latest version of Yoast.
Thanks,
Iain.
-
Hey Iain,
There is a way to edit the file with Yoast. There should be a section called Edit Files when you click on the "SEO" item on the left-hand side of your WordPress dashboard. Once in there, you should see robots.txt at the top. If you don't see it, you might need to upgrade to the newest version of Yoast.
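Also, as far as I know the file is only "virtual" because no physical one exists: the web server serves a real robots.txt directly, before WordPress routing ever runs, so uploading your own should simply override Yoast's rather than conflict with it. A minimal physical file along the lines of what you already have might look like this (the /2013/ line is just an illustration for blocking date archives, not a requirement):

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /2013/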
Thanks,
Tom
-
Thanks so much for your reply, Tom - very useful indeed!
I'm using Yoast SEO for WordPress, which apparently creates a virtual robots.txt, and I can't see any way to edit it as such. Unlike the posts themselves, which I can set to "noindex", the dynamic pages I cannot.
I could make my own robots.txt and upload it to my server, but I'm concerned that it would confuse matters and conflict with the one created/managed by Yoast.
Thanks again,
Iain.
-
Hey Iain,
I would write custom meta descriptions if possible. Meta descriptions are generally used to "sell" the content. They don't have any direct effect on ranking, and if you don't feel like writing custom ones, Google will just display the first content from the page automatically. It is not considered duplicate content.
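If you do write them, each post simply gets its own tag in the head - something like this (the description text is purely a made-up example):

<meta name="description" content="What I learned untangling a virtual robots.txt on my WordPress blog.">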
I would also probably remove those blog index pages from your XML sitemap and noindex them if you can, via meta robots or robots.txt. They will produce duplicate content, and you really want to drive people and bots to the posts themselves, not to the index pages.
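The meta robots route would put a tag like this in the head of each archive page (I believe Yoast's archive settings can emit it for you, so you shouldn't need to hand-edit templates):

<meta name="robots" content="noindex, follow">

The "follow" part keeps the pages out of the index while still letting crawlers follow the links through to your posts.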
I also wouldn't worry about the sidebar. As long as you provide a decent amount of unique content on each page, you will be fine.
Hope that helps.
Site looks good!
Tom