WordPress.com content feeding into a site's subdomain: who gets SEO credit?
-
I have a client who created a WordPress.com (not WordPress.org) blog and feeds its posts into a subdomain, blog.client-site.com. My understanding was that, in terms of SEO, WordPress.com would still get the credit for these posts rather than the client, but I'm seeing conflicting information.
All of the posts have permalinks on the client's site, such as blog.client-site.com/name-of-post, and when I run a Google site: search query, all of those individual posts appear in the search listings under the client's domain.
I've also run a marketing.grader.com report, and it shows the same results.
Looking at the page source, however, I see markup that leads me to believe the content is being credited to, and fed in from, WordPress.com ('client-name' altered for privacy):
<a href="http://client-name.files.wordpress.com/2012/08/could_you_survive_a_computer_disaster.jpeg"><img class="alignleft size-thumbnail wp-image-2050" title="Could_you_survive_a_computer_disaster" src="http://client-name.files.wordpress.com/2012/08/could_you_survive_a_computer_disaster.jpeg?w=150&h=143"></a>
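For what it's worth, which domain "gets credit" is usually decided by the page's rel=canonical tag rather than by where the images are hosted. As a quick sanity check, here's a rough stdlib-only Python sketch (the URLs below are placeholders, not the client's real markup) that pulls the canonical URL and the image hostnames out of a page's source:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class SourceAudit(HTMLParser):
    """Collect the rel=canonical URL and the hostnames images load from."""
    def __init__(self):
        super().__init__()
        self.canonical = None
        self.image_hosts = set()

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")
        elif tag == "img" and attrs.get("src"):
            host = urlparse(attrs["src"]).netloc
            if host:
                self.image_hosts.add(host)

# Markup similar to the snippet above (placeholder URLs):
page = """
<link rel="canonical" href="http://blog.client-site.com/name-of-post" />
<a href="http://client-name.files.wordpress.com/2012/08/pic.jpeg">
  <img class="alignleft" src="http://client-name.files.wordpress.com/2012/08/pic.jpeg?w=150" />
</a>
"""
audit = SourceAudit()
audit.feed(page)
print(audit.canonical)    # the URL search engines are told is authoritative
print(audit.image_hosts)  # where the media is actually served from
```

Images served from client-name.files.wordpress.com only mean the media library lives on WordPress.com's file CDN; the canonical URL, and the domain your site: query turns up, is what indicates which property the page is credited to.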
I'm looking to provide a recommendation to the client on whether they are OK to continue with this current setup, or whether we should port the blog posts over to a subfolder on their primary domain (www.client-site.com/blog) running self-hosted WordPress (WordPress.org), for proper SEO.
Any advice? Thank you!
-
My understanding is this:
If it's a duplicate, as in a copy-and-paste article, then Google will eventually de-index the duplicate and keep the original article.
In your client's case, though, the posts carry their source links, so Google doesn't label them as duplicate content but treats them as syndicated content.
Look at news sources, for example: the same article syndicated across multiple sites gets indexed and stays indexed. (This is the case for your client's site.)
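The duplicate-vs-syndicated distinction rests on near-duplicate detection: search engines compare overlapping word sequences ("shingles") between documents. A rough, hypothetical sketch of the idea; real search-engine deduplication is far more involved:

```python
def shingles(text, k=3):
    """Set of k-word sequences appearing in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a, b, k=3):
    """Jaccard similarity of the two documents' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "could you survive a computer disaster if it struck today"
syndicated = "could you survive a computer disaster if it struck today"
rewrite = "our guide to surviving a computer disaster when it strikes"

print(similarity(original, syndicated))  # 1.0 -> flagged as a duplicate
print(similarity(original, rewrite))     # low score -> treated as distinct
```

A verbatim syndicated copy scores as an exact duplicate, which is why only source links and syndication signals keep it from being treated as one; a genuine rewrite scores low and counts as distinct content.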
What I would tell your client is that fresh, unique content on their site is key for SEO. Syndicated articles provide no benefit in terms of unique, fresh content, so the exercise is pointless unless it's purely for user experience.
Give them an example: it's the same as giving away articles to other websites and then reusing them on their own site as "second-hand" articles. Just because it's WordPress doesn't mean it's any different from any other website out there.
Good luck!
Greg