Can Page Content & Description Have Same Content?
-
I'm studying my crawl report and there are several warnings regarding missing meta descriptions.
My website is built in WordPress and part of the site is a blog.
Several of these missing-description warnings relate to blog posts, and I was wondering whether I could copy the first few lines of content from each post into its meta description, or whether that would be considered duplicate content?
Also, there are a few warnings that relate to blog index pages, e.g. http://www.iainmoran.com/2013/02/ - I don't know if I can even add a description to these, as I think they are dynamically created.
While on the subject of duplicate content: if I had a sidebar with the same information on several of the pages (the content coming from a WP widget), would this still be considered duplicate content, and would Google penalise me for it?
I would really appreciate some thoughts on this, please.
Thanks,
Iain.
-
Thanks, Tom - I'll have a go at that and make an actual robots.txt file and upload it.
It is odd, though: when I was creating my WP pages there were Yoast options for each page, and several of them I set to noindex. Yet looking at the virtual robots.txt, that isn't reflected. My file just has:
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Thanks again for all your help,
Iain.
-
http://wordpress.org/support/topic/robotstxt-file-4
That's about the only thing I can find on it. Hope you can glean some use out of it. Seems rather complicated for such an easy task.
-
Cheers Tom,
Yeah, it is rather strange. There doesn't appear to be any other plugin that could be causing this. Yoast is certainly the only one relating to SEO.
Iain.
-
I think this is where I run out of useful things to add. That seems very odd to me.
Do you have any other plugins active that might be producing a robots.txt file?
-
Thanks Tom,
When I click Edit Files in Yoast it says:
"If you had a robots.txt file and it was editable, you could edit it from here."
And yet, I do have one (albeit apparently a virtual one), as it can be viewed here:
http://www.iainmoran.com/robots.txt
If I try to view the site files on the server, via FTP or cPanel, there is no robots.txt file there!
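(For context, this is exactly what a "virtual" robots.txt is: when no physical file exists on disk, WordPress routes the request for /robots.txt through itself and builds the response with do_robots(), applying the robots_txt filter along the way. That filter is how plugins such as Yoast inject rules without ever writing a file. A minimal sketch of the mechanism, with a purely hypothetical extra rule for illustration:)

```php
<?php
// Sketch of how a WordPress plugin appends rules to the *virtual* robots.txt.
// This only runs when no physical robots.txt exists; WordPress then serves
// the request itself via do_robots() and applies the 'robots_txt' filter.
add_filter( 'robots_txt', function ( $output, $public ) {
    // $public reflects the "Search engine visibility" setting (1 = indexable).
    if ( $public ) {
        // Hypothetical rule, for illustration only.
        $output .= "Disallow: /2013/02/\n";
    }
    return $output;
}, 10, 2 );
```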
I appear to be using the latest version of Yoast.
Thanks,
Iain.
-
Hey Iain,
There is a way to edit the file with Yoast. It should have a section called Edit Files when you click on the "SEO" item on the left-hand side of your WordPress dashboard. Once in there, you should see robots.txt at the top. If you don't see it, you might need to upgrade to the newest version of Yoast.
Thanks,
Tom
-
Thanks so much for your reply, Tom - very useful indeed!
I'm using Yoast SEO for WordPress, which apparently creates a virtual robots.txt, and I can't see any way to edit it as such. Unlike the posts themselves, which I can set to "noindex", I can't do that for the dynamic pages.
I could make my own robots.txt and upload it to my server, but I'm concerned that it would confuse matters and conflict with the one created/managed by Yoast.
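(For what it's worth, a physical robots.txt normally takes precedence over the virtual one, because the web server returns the real file before WordPress is ever invoked, so there is no conflict to worry about. A minimal file that keeps the existing rules and also blocks the date archives might look like the sketch below. The /2013/ path is illustrative only, and note that robots.txt blocks crawling, not indexing:)

```text
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
# Illustrative: stop crawlers fetching date-based archive pages
Disallow: /2013/
```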
Thanks again,
Iain.
-
Hey Iain,
I would do custom meta descriptions if possible. Meta descriptions are generally used to "sell" the content. They don't have any effect on ranking, and if you don't feel like adding custom content to them, Google will just display the first content from the page automatically. It is not considered duplicate content.
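(If it helps, a custom meta description is just a tag in the page's head; Yoast writes it for you from its per-post field, but hand-coded it looks like this. The copy here is invented purely as an example:)

```html
<head>
  <!-- Illustrative hand-written meta description for one blog post -->
  <meta name="description"
        content="A short, specific summary of this post that entices the searcher to click through.">
</head>
```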
I would also probably remove those blog index pages from your XML sitemap and noindex them if you can, with meta robots or robots.txt. Those will produce duplicate content, and you really want to drive people and bots to the posts themselves, not the index pages of the posts.
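(As a sketch, the meta robots route means getting a tag like the one below into the head of each archive page. In Yoast this is normally done with the noindex toggle for date archives rather than by hand-writing markup:)

```html
<!-- Keeps the archive page out of the index but still lets crawlers follow its links -->
<meta name="robots" content="noindex, follow">
```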
I also wouldn't worry about the sidebar. As long as you are providing a decent amount of unique content on each page, you will be fine.
Hope that helps.
Site looks good!
Tom