Questions created by RCNOnlineMarketing
Philosophy & Deep Thoughts On Tag/Category URLs
Hello, SEO Gurus! First off, my many thanks to this community for all of your past help and perspective. This is by far the most valuable SEO community on the web, precisely because of all of you being here. Thanks!

I recently kicked off a robust niche biotech news publishing site for a client, and in the first 6 weeks we've generated 15K+ views and 9,300 visits. The site is built on the WordPress platform. I'm well aware that a common best practice is to noindex tag and category pages, as I've heard SEOs say they can lead to duplicate content issues. We're using tags and categories heavily, and to date we've had just 282 visits land on tag and category pages. So that's 2.89% of our traffic; the vast majority has landed on the homepage or article pages (we are using author markup).

Here's my question, though, and it's more philosophical: do these pages really cause a duplicate content issue? Isn't Google able to determine that a given page is a tag page, and thus not worthy of a duplicate content penalty? If not, why not?

To me, tag/category pages are sometimes better pages to have ranked than article pages, since, for news especially, they potentially give searchers a better search result (particularly for short-tail keywords). For example, if I write articles all the time about the Mayo Clinic, I'd rather have my evergreen "Mayo Clinic" tag page rank on page one for the keyword "mayo clinic" than one specific article that very quickly drops out of the news cycle. Know what I mean?

So, to summarize: 1. Are indexed tag/category pages really a duplicate content problem, and if so, why the heck? 2. Is there a strategy for ranking tag/category pages for news publishing sites ahead of article pages?

Thanks as always for your time and attention. Kind Regards, Mike
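For context, if the consensus is that these archives should be noindexed, here's roughly what I'd drop into the theme's functions.php -- a minimal sketch, assuming a standard theme that calls wp_head() and no SEO plugin already managing robots meta tags:

```php
/**
 * Minimal sketch: noindex tag and category archives on a WordPress site.
 * Assumes a standard theme that calls wp_head() in its header template.
 */
add_action( 'wp_head', function () {
	// Only touch tag and category archives; posts, pages, and the
	// homepage stay indexable.
	if ( is_tag() || is_category() ) {
		// "follow" keeps crawlers passing link equity through to the
		// articles listed on the archive page.
		echo '<meta name="robots" content="noindex,follow">' . "\n";
	}
} );
```

That's exactly why I'm asking before deploying it: if tag pages really can outrank short-lived articles for short-tail terms, noindexing them throws that opportunity away.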
Intermediate & Advanced SEO | RCNOnlineMarketing
"Starting Over" With A New Domain & 301 Redirect
Hello, SEO Gurus. A client of mine appears to have been hit with a non-manual, algorithmic penalty. The penalty appears to be Penguin-like, and the client never received any message (not that that proves it wasn't manual). Prior to my working with her, she engaged in all kinds of SEO fornication: spammy links on link farms, shoddy article marketing, blog comment spam -- you name it. There are simply too many of these links -- tens of thousands -- to have them all removed. I've done some disavowal, but again, so much of the link profile is spam.

She is about to launch a new site, so I am tempted to encourage her to buy a new domain and start over. She competes in a niche B2B sector, so it is not terribly competitive, and with solid content and link earning I think she'd be okay.

Here's my question: if we were to 301 the old website to the new one, would the PageRank that flows through outweigh any penalty associated with the old site? (The old domain only has a PR of 2.) And does anyone like my idea of starting over rather than trying to "recover"?

I thank you all in advance for your time and attention. I don't take it for granted.
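For what it's worth, if we do go the 301 route, this is roughly the redirect I'd put in place on the old install -- a minimal sketch, assuming the old domain keeps running WordPress during the handover (both domain names below are placeholders):

```php
/**
 * Minimal sketch: site-wide 301 from the old domain to the new one,
 * preserving the requested path so old deep links land on the
 * equivalent new URL. Both domain names are placeholders.
 */
add_action( 'template_redirect', function () {
	$host = isset( $_SERVER['HTTP_HOST'] ) ? $_SERVER['HTTP_HOST'] : '';
	if ( false !== stripos( $host, 'olddomain.com' ) ) {
		$path = isset( $_SERVER['REQUEST_URI'] ) ? $_SERVER['REQUEST_URI'] : '/';
		wp_redirect( 'https://www.newdomain.com' . $path, 301 );
		exit;
	}
} );
```

The mechanics are the easy part; my real question is whether the penalty follows the redirect along with the PageRank.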
Intermediate & Advanced SEO | RCNOnlineMarketing
Filling Up Content For A New News Publishing Site
Hello, SEO Gurus. I have a client I've been working with for a few months now, and part of our service offering is to publish and promote fresh, daily content on his site's blog. The strategy has been a huge success so far, and he is very happy with the content.

Now he is getting ready to launch a second site -- a news publishing site for his industry niche -- and we will once again be providing the content daily: 10 to 15 articles a day. It's a big operation for us. The client, however, is concerned that the new site will appear "thin" on content in the early going, and asked whether we could populate it with the articles we wrote for the other site's blog.

My gut reaction is that this would be an exceedingly bad idea. While we are the ones who authored the original content (and we've used author tags and publishing markup), the best bet is to simply start fresh. Besides, since we'll be pumping out tons of content on a daily basis, it won't take long to fill the content coffers.

That said, I wanted to run this past you all and see if anyone has alternative ideas on how to reuse the old content without it counting as duplicate content. I was thinking that designating all of the old articles noindex, nofollow could be an option?

Many thanks in advance for your time and attention. Sincerely, Mike
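If it helps frame the noindex option: assuming each piped-in article gets flagged with a custom field when it's copied over, a minimal sketch of the robots handling on the new site might look like this (the "_republished_from_blog" meta key is hypothetical):

```php
/**
 * Minimal sketch: noindex,nofollow the republished articles on the
 * new site so only the fresh daily content gets indexed. Assumes each
 * copied post carries a custom field; the meta key is hypothetical.
 */
add_action( 'wp_head', function () {
	if ( is_single() && get_post_meta( get_the_ID(), '_republished_from_blog', true ) ) {
		echo '<meta name="robots" content="noindex,nofollow">' . "\n";
	}
} );
```

That would let the old articles pad out the site for human readers without asking Google to index them twice.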
Content Development | RCNOnlineMarketing
Does Google News Inclusion Affect Organic Rankings?
Hello SEO Gurus, Here's a question I've been unable to find an answer to: if you manage to get a publishing website or blog included in the Google News aggregator, can that negatively affect your organic search visibility?

I've never read anything that explicitly says so, but I have both read about and experienced how e-commerce sites often have difficulty ranking high in both organic and shopping results; it seems Google balances visibility between the two. Has anyone seen a website or blog that managed to rank high for the same high-value keyword in both organic search and news search? Thanks in advance! Mike
Technical SEO | RCNOnlineMarketing
Is Noindex Enough To Solve My Duplicate Content Issue?
Hello SEO Gurus! I have a client who runs 7 web properties: 6 are satellite websites, and the 7th is his company's main website. For a long while, my company has, among other things, blogged on a hosted blog at www.hismainwebsite.com/blog, and when we were optimizing for one of the satellite websites, we would simply link to it from the article.

Now, however, the client has set up separate blogs on every one of the satellite websites as well, and he has a nifty plug-in on the main website's blog that pipes the articles we write through to the corresponding satellite blog.

My concern is duplicate content. In a sense this is autoblogging -- the only thing that keeps it from being heinous is that the client is autoblogging himself. He thinks it will be a great feature, giving users of his satellite websites fresh content to read -- and I agree that the combination of publishing and e-commerce is a thing of the future -- but I really want to avoid a duplicate content issue and a possible SEO/SERP hit.

I am thinking that noindexing each of the satellite websites' blog pages might suffice, but I'd like to hear from all of you whether even that may not be a foolproof solution.

Thanks in advance! Kind Regards, Mike
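For what it's worth, the alternative I keep weighing against noindex is a cross-domain rel="canonical", so each satellite copy points back at the original article on the main blog. A minimal sketch, assuming the plug-in stores the source URL in a custom field when it pipes an article over (the "_original_article_url" meta key is hypothetical):

```php
/**
 * Minimal sketch: cross-domain rel="canonical" on each satellite copy,
 * pointing back at the original article on the main site. Assumes the
 * syndication plug-in stores the source URL in a custom field; the
 * meta key is hypothetical.
 */

// WordPress prints its own self-referencing canonical on single posts;
// as a simplification, drop it here and print our own instead.
remove_action( 'wp_head', 'rel_canonical' );

add_action( 'wp_head', function () {
	if ( ! is_single() ) {
		return;
	}
	$source = get_post_meta( get_the_ID(), '_original_article_url', true );
	if ( $source ) {
		echo '<link rel="canonical" href="' . esc_url( $source ) . '">' . "\n";
	}
} );
```

Canonical would consolidate the duplicate signals onto the main site while leaving the satellite pages readable, whereas noindex just hides them. That trade-off is really what I'm asking about.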
Technical SEO | RCNOnlineMarketing
Exponentially Increasing Duplicate Content On Blogs
Most of the clients I pick up are either new to SEO best practices or have worked with sketchy SEO providers in the past who did little more than build spammy links. Most have deployed little if any on-site SEO, and early on I spend a lot of time fixing canonical and duplicate content issues via 301 redirects.

Using SEOmoz, however, I see a lot of duplicate content issues with the blogs that live on the sites I work on, and with every new article we publish, more duplicate content builds up. Duplicate content on blogs seems to grow exponentially, because every article provisionally appears on the blog homepage, its own permalink page, a category page, maybe a tag page, and an author page.

I have a two-part question: 1. Is duplicate content like this a problem for a blog -- and for the website the blog lives on -- or are search engines able to work out that this isn't really duplicate content? 2. If it is a problem, how would you go about solving it?

Thanks in advance!
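For the "how would you solve it" part, the pattern I've been leaning toward is to noindex the secondary archives while leaving single posts and (in this sketch) category archives indexable -- a minimal functions.php sketch, again assuming a standard theme that calls wp_head() and no SEO plugin already handling robots tags:

```php
/**
 * Minimal sketch: thin out blog-archive duplication by noindexing the
 * secondary archive types, while posts, pages, and (in this example)
 * category archives stay indexable.
 */
add_action( 'wp_head', function () {
	// Tag, author, and date archives, plus page 2+ of any archive,
	// all re-list the same posts; mark them noindex,follow.
	if ( is_tag() || is_author() || is_date() || is_paged() ) {
		echo '<meta name="robots" content="noindex,follow">' . "\n";
	}
} );
```

Pairing that with excerpts in the archive templates (the_excerpt() instead of the_content()) also keeps the full article text from being repeated on every listing page. But I'd love to hear whether people think any of this is even necessary in the first place.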
On-Page Optimization | RCNOnlineMarketing
Can Linking Between Your Own Sites Excessively Be a Penguin No-No?
I have a bunch of travel-related sites that for a long time dominated google.com.au without any intensive SEO whatsoever. Aside from solid on-page content and meta tags, I did no link building. However, all of my sites are heavily interlinked, and I believe they are linked with followed (dofollow) links carrying lots of keyword-rich anchor text. Here are a few of them:

www.beautifulpacific.com
www.beautifulfiji.com
www.beautifulcooklands.com

My idea in interlinking them was to create a kind of branded "Beautiful" nexus of sites. However, when Penguin hit -- which I believe was on April 27th -- search traffic crashed, and has crashed over and over again since. I've read that Penguin penalizes over-optimization vis-à-vis anchor text links. I don't have a lot of inbound links like these from elsewhere, but they are everywhere among my own sites. Is it possible that all of my text links have hurt me with Penguin?

Thanks to everyone in advance for your time and attention. I really appreciate it. -Mike
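If the interlinking does turn out to be the culprit, the remediation I'm considering is to nofollow the cross-site links rather than remove them. A minimal sketch of how that could work per site, assuming the sites run WordPress (the regex is deliberately naive and assumes plain <a href="..."> markup):

```php
/**
 * Minimal sketch: add rel="nofollow" to in-content links pointing at
 * the sister "Beautiful" domains, so editorial cross-links stop
 * passing keyword-anchored link equity between the sites.
 * Deliberately naive regex; assumes plain <a href="..."> markup.
 */
add_filter( 'the_content', function ( $content ) {
	$sister_domains = array(
		'beautifulpacific.com',
		'beautifulfiji.com',
		'beautifulcooklands.com',
	);
	foreach ( $sister_domains as $domain ) {
		// Skip anchors that already carry a rel attribute.
		$pattern = '#<a\s+(?![^>]*rel=)([^>]*href="[^"]*'
			. preg_quote( $domain, '#' ) . '[^"]*")#i';
		$content = preg_replace( $pattern, '<a rel="nofollow" $1', $content );
	}
	return $content;
} );
```

Whether that would be enough to undo the damage -- or whether the interlinking was ever the problem -- is exactly what I'm hoping you all can weigh in on.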
Intermediate & Advanced SEO | RCNOnlineMarketing