WordPress.com content feeding into a site's subdomain: who gets the SEO credit?
-
I have a client who created a WordPress.com (not WordPress.org) blog and feeds its posts into a subdomain, blog.client-site.com. My understanding was that, in terms of SEO, WordPress.com would still get the credit for these posts rather than the client, but I'm seeing conflicting information.
All of the posts have permalinks on the client's site, such as blog.client-site.com/name-of-post, and when I run a Google site: search query, all of those individual posts appear in the Google search listings for the client's domain.
I've also run a marketing.grader.com report, which shows the same results.
Looking at the page source, however, I see the following markup, which leads me to believe the content is being credited to, and fed in from, WordPress.com ('client name' altered for privacy):
href="http://client-name.files.wordpress.com/2012/08/could_you_survive_a_computer_disaster.jpeg">class="alignleft size-thumbnail wp-image-2050" title="Could_you_survive_a_computer_disaster" src="http://client-name.files.wordpress.com/2012/08/could_you_survive_a_computer_disaster.jpeg?w=150&h=143"
I'm looking to give the client a recommendation on whether they are okay to continue with the current setup, or whether we should move the blog posts to a subfolder on their primary domain, www.client-site.com/blog, and use self-hosted WordPress.org functionality for proper SEO.
Any advice? Thank you!
-
My understanding is this:
If it's a duplicate, as in a copy-and-paste article, then Google will eventually de-index the duplicate and keep the original article.
In your client's case, though, they are providing the source links, so Google doesn't label it as duplicate content but treats it as syndicated content.
Look at news sources, for example: the same article syndicated on multiple sites is indexed and stays indexed. (This is the case for your client's site.)
What I would tell your client is that fresh, unique content on their site is key for SEO. Syndicating articles doesn't provide any SEO benefit in terms of unique and fresh content, so the exercise is pointless unless it's purely for user experience.
Give them an example: say it's the same as giving articles away to other websites and then reusing them on their own site as "second-hand" articles. Just because it's WordPress doesn't mean it's any different from any other website out there.
Good luck!
Greg
Related Questions
-
Community Discussion: Are You Optimizing Your Brand's Content for Featured Snippets?
My latest post on the Moz Blog, Featured Snippets: A Dead-Simple Tactic for Making, explores how to keep Featured Snippets once you have them. I'm curious how many brands are actively working to get into the answer box, and for those who are, what have the results been?
Intermediate & Advanced SEO | | ronell-smith2 -
Some sites' links look different in Google search
For example: Games.com › Flash games › Decoration games. How can we get our URLs to display like this? (A rough sketch of the usual breadcrumb markup follows this question.)
Intermediate & Advanced SEO | | lutfigunduz0 -
Mystery 404s
I have a large number of 404s that all have a similar structure: www.kempruge.com/example/kemprugelaw. "kemprugelaw" keeps getting stuck on the end of URLs. While I created www.kempruge.com/example/, I never created the www.kempruge.com/example/kemprugelaw page or edited permalinks to have kemprugelaw at the end of the URL. Any idea how this happens, and what I can do to make it stop? Thanks, Ruben
Intermediate & Advanced SEO | | KempRugeLawGroup0 -
Impact of simplifying a website and removing 80% of its content
We're thinking of simplifying our website, which has grown to a very large size, by removing all the content that hardly ever gets visited. The plan is to remove this content and make changes over time in small chunks so that we can monitor the impact on SEO. My gut feeling is that this is okay as long as we redirect the old pages and make sure the pages we remove aren't getting any traffic. From my research online, it seems that more content is not necessarily a good thing if that content is ineffective, and that simplifying a site can improve conversions and usability. Could I get people's thoughts on this, please? Are there any risks we should look out for, or any alternatives to this approach? At the moment I'm struggling to combine the needs of SEO with making the website more effective.
Intermediate & Advanced SEO | | RG_SEO0 -
SEO for an exponentially growing site?
Hey Mozers! I was having a quick chat with a friend the other day about doing SEO for a site that grows in page count at an exponential rate, and I was wondering how you would go about optimizing it. The example we used was a site that lets users upload videos and then has people vote on two videos against each other. So, if there are 100 uploaded videos and each of them is paired up with the other 99 to create a unique voting/battle page with its own unique URL, the site can get very large, VERY quickly; if just one more video is uploaded, another 100 unique battle pages are created. How exactly would you go about optimizing the site? My biggest area of confusion is generating sitemaps. I'm aware of best practices with large sitemaps (i.e. having a sitemap of sitemaps and not going over 50k entries per sitemap), but how would you go about creating the sitemaps for this website if it's growing at an exponential rate, if at all? (A rough sketch of the sitemap-index approach follows this question.) If you have any other questions, feel free to ask and I'll clarify. Thanks! 😃 TL;DR: How would you optimize a site that grows at an exponential rate?
Intermediate & Advanced SEO | | JordanChoo0 -
How to get the 'show map of' tag/link in Google search results
I have 2 clients that have apparently random examples of the 'show map of' link in Google search results. The maps/addresses are accurate and are for airports. Both clients are aggregators that service the airports, e.g. 'lax airport shuttle' (not an actual example), BUT they DO NOT have Google Place listings for these pages, either manual OR auto-populated by Google, and DO NOT have the map or address info on the pages that are returned in the search results with the map link. Does anyone know how this is the case? It's great that this happens for them, but I'd like to know how/why so I can replicate it across all their appropriate pages. My understanding was that for this to happen you HAD to have Google Place pages for the appropriate pages (which they can't do, as they are aggregators). Thanks in advance, Andy
Intermediate & Advanced SEO | | AndyMacLean0 -
Will blocking Google and other search engines from indexing images hurt SEO?
Hi, We have a bit of a problem where, on a website we are managing, there are thousands of "dynamically" resized images. These are stressing the server, as any one page can contain up to 100 dynamically resized images. Google alone is indexing 50,000 pages a day, so multiply that by the number of images and it is a huge drag on the server. I was wondering if it may be an idea to block robots (in robots.txt) from indexing all the images in the image folder, to reduce the server load until we have a proper fix in place. We don't get any real value from having our website images in "Google Images", so I am wondering if this could be a safe way of reducing server load. Are there any other potential SEO issues this could cause? Thanks
Intermediate & Advanced SEO | | James770 -
SEO on a mature site - diminishing returns?
I have a site that has been indexed in Google since 2002. Back then, I secured all of the highly recommended links of the time, like DMOZ and the Yahoo Directory, and got just a couple of very high PR links from highly relevant sites. That was enough to get us top listings on our best "niche" keywords and many long-tail searches. Once we got to that point, we got lazy and have just relied on our original links and any natural links that came our way. We also have a very detailed AdWords campaign in which we bid on almost any keyword that has ever resulted in an organic conversion. A few months ago, I decided to kick our SEO efforts up a notch and hired a company to run an aggressive link-building campaign and target some very high search volume terms that we had previously given up on. The campaign has been very successful in getting high rankings for several targeted terms. However, I am seeing zero impact on our site traffic or sales. I am beginning to wonder if Google's algorithms are so efficient that all of this extra SEO work is to no avail. Is there a point of diminishing returns where it is not productive to optimize a site's organic listings any further? Between our AdWords campaign, our already pretty good organic results, and Google's ability to divine a searcher's intent and lead them to the most relevant results, how do you decide when there is little benefit to further optimization? It is an important question for me because I have been considering putting a lot of work into adding content to our ecommerce site, and I would hate to do all that work for nothing.
Intermediate & Advanced SEO | | mhkatz0