What are your views on recent statements regarding "advertorial" content?
-
Hi,
Recently, there's been a lot said and written about how Google is going to come down hard on 'advertorial' content.
Many B2B publishers provide exposure to their clients by creating and publishing content about them, based on information or content obtained from the clients (for example, press releases) or compiled by the publisher. From the target audience's perspective, this is useful information that the publication is bringing to its readers. Also, let's say the publishers don't link directly to client websites.
In such a case, how do you think Google is likely to look at publisher websites in the context of the recent statements about 'advertorial'-type content?
I look forward to the views of the Moz community.
Thanks,
Manoj
-
Thanks, Matt.
Yes, this issue with advertorial content is two-fold:
a) Links and the flow of PageRank --- which I suppose is more straightforward and a clear no-no. However, what's the likely impact on services such as PRWeb, PRLeap, etc.?
b) Content that may have been generated and published as a result of a commercial arrangement between two parties, but without any links or flow of PageRank. This needs to be considered in the context of a supposedly better understanding of, and reliance on, entity associations for ranking purposes.
-
Hi Manoj,
You have to be really careful here. Matt Cutts came out and said:
“If a link in a paid post would affect search engines, that link should not pass PageRank (e.g. by using the nofollow attribute).”
If you're paying for an article to be placed, make sure that you nofollow the link; otherwise, as in the big Interflora case, you will get penalized.
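For example, here's a quick sketch for auditing a page for paid links that still pass PageRank (the URL is a hypothetical placeholder, and it assumes the requests and beautifulsoup4 packages are installed):

```python
# Minimal audit sketch: list outbound links on a page that lack rel="nofollow".
# The URL is a hypothetical example; assumes `requests` and `beautifulsoup4`.
import requests
from bs4 import BeautifulSoup

url = "https://example.com/sponsored-post"  # hypothetical advertorial page
html = requests.get(url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

for a in soup.find_all("a", href=True):
    rel = a.get("rel") or []  # BeautifulSoup returns rel as a list of tokens
    if "nofollow" not in rel:
        print(f"Passes PageRank: {a['href']}")
```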
Matt
Related Questions
-
What would be the best course of action to nullify the negative effects of our website's content being duplicated (negative SEO)?
Hello, everyone. About 3 months ago I joined a company that manufactures transportation and packaging items. Once I started digging into the website, I noticed that a lot of their content was "plagiarized". I use quotes because it really was not: they seem to have been hit with a negative SEO campaign last year in which their content was taken and posted across at least 15 different websites. Literally every page on their website had the same problem, and some content was even company-specific (going as far as using the company's very unique name).

In all my years of working in SEO and marketing I have never seen something at this scale. Sure, there are always spammy links here and there, but this seems very deliberate. In fact, some of the duplicate content was posted on legitimate websites that may have been hacked or compromised (some examples include charity websites).

I am wondering if there is anything I can do besides contacting the webmasters of these websites and nicely asking for removal of the content? Or does duplicate content not hold as much weight as it used to, especially since our content was posted years before the duplicates started popping up? Thanks,
White Hat / Black Hat SEO | Hasanovic
-
Help finding website content scrapers
Hi, I need a tool to help me review sites that are plagiarising or directly copying content from my site. The tools I'm aware of, such as Copyscape, appear to work with individual URLs rather than a root domain. That's great if you have a particular post or page you want to check, but in this case some sites are scraping thousands of product pages, so I need to submit the root domain rather than an individual URL.

In some cases, other sites are being listed in SERPs above, or even instead of, our site for product search terms. So far I have stumbled across this rather than proactively researched offending sites. I want to enter my root domain and have the tool review all my internal site pages before reporting other domains where an individual page contains a certain amount of duplicated copy: working the same way Moz crawls the site for internal duplicate pages, but giving me a list of external duplicate content by domain and URL. I could then contact the offending sites to request they remove the content, and send it to Google as evidence if they don't.

Any help would be gratefully appreciated. Terry
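The closest I can picture is a page-versus-page check, looped over every URL in my sitemap. Something like this rough sketch (the URLs are hypothetical, and it assumes the requests and beautifulsoup4 packages), which scores two pages by word-shingle overlap:

```python
# Rough sketch: estimate how much of one page's text is duplicated on another
# using Jaccard similarity over 5-word shingles. URLs are hypothetical examples.
import requests
from bs4 import BeautifulSoup

def shingles(url, n=5):
    html = requests.get(url, timeout=10).text
    words = BeautifulSoup(html, "html.parser").get_text(" ").split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

ours = shingles("https://example.com/product/widget")     # your page
theirs = shingles("https://scraper-site.example/widget")  # suspected copy

overlap = len(ours & theirs) / max(len(ours | theirs), 1)
print(f"Estimated duplication: {overlap:.0%}")
```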
White Hat / Black Hat SEO | MFCommunications
-
A semi-client of mine recently reached out because another site scraped his whole site and his traffic dropped significantly.
Someone who reaches out to me on occasion for help, but is mostly an SEO DIYer, recently had his site copied completely, and his traffic dropped significantly immediately afterwards. I have never had a client experience this in the past. Does anyone have suggestions or expertise on this? See his question and what he has done below. Jeremy

"This site scraped my credit site. Its appearance coincides with a dramatic sitewide decrease in Google traffic. I submitted a takedown request by paying this company $200. No results yet. My hosting company also placed blocks on the copied site's HTML, which pings my server for CSS and picture files. My Google Webmaster Tools account shows inbound links coming from the copycat. Is there something more I should be doing?"

Copy site: http://masqueros.com/
Real site: https://www.savvyoncredit.com/
White Hat / Black Hat SEO | jeremyskillings
-
Avoiding the "sorry, we have no imagery here" Google Maps error
Hi there, we recently did a redesign of a big site and added Google Maps embeds to almost every page. Since we are in real estate, our listings, detail pages, and search results all have a map embedded. While looking at GWT I found that the top keywords on our site (which is in Spanish) are the following: "have", "here", "imagery", "sorry".

After a quick search I found out this is a Google Maps bug: when Googlebot accesses the pages, the embed throws an error with this text repeated several times. If you do a search for "sorry we have no imagery here" you will see lots of sites with this issue.

My question is: is this affecting our overall SEO, since bots are actually crawling and indexing this text (hence it being reported by GWT)? Should I cloak this from robots? Has anyone noticed this or been able to fix it? Thanks in advance!
White Hat / Black Hat SEO | makote
-
"NOINDEX,FOLLOW" same as "NOINDEX, FOLLOW" ?
Notice the space between them. I am trying to debug my application, and sometimes it puts in a space. Will this small difference matter to the bots?
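My working assumption is that bots tokenize the content attribute on commas and strip whitespace, so both forms should come out identical. An illustrative sketch of that normalization (my guess at typical parser behaviour, not Googlebot's actual code):

```python
# Illustrative sketch of how a robots meta directive is typically tokenized:
# split on commas, strip whitespace, lowercase. Both spellings normalize to
# the same set, so the stray space should not matter to well-behaved bots.
def parse_robots_content(content: str) -> set[str]:
    return {token.strip().lower() for token in content.split(",") if token.strip()}

assert parse_robots_content("NOINDEX,FOLLOW") == parse_robots_content("NOINDEX, FOLLOW")
print(parse_robots_content("NOINDEX, FOLLOW"))  # {'noindex', 'follow'}
```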
White Hat / Black Hat SEO | bjs2010
-
XML feeds in regard to duplicate content
Hi everyone, I hope you can help. I run a property portal in Spain and am looking for an answer to an issue we are having. We are in the process of uploading an XML feed to our site which contains 10,000+ properties relating to our niche. Although this is great for our customers, I am aware this content is going to be duplicated on other sites, as our clients advertise across a range of portals.

My question is: are there any measures I can take to safeguard our site from penalisation by Google? Manually writing 10,000+ descriptions for properties is sadly out of the question. I really hope somebody can help. Thanks, Steve
White Hat / Black Hat SEO | buysellrentspain
-
If I were to change a geographic keyword such as "foreclosures in Dallas" to "foreclosures in Los Angeles" on 20 related blogs, what would happen?
In other words, I'm wondering: if someone built up an internet presence for their company through multiple websites over the years and then decided to move to another part of the United States, would it work to change all the keywords to the new location? Would that work toward getting them ranked in the new area, or would you have to create entirely new websites? Thanks, guys.
White Hat / Black Hat SEO | whorneff310
-
My attempt to reduce duplicate content got me slapped with a doorway page penalty. Halp!
On Friday, 4/29, we noticed that we suddenly lost all rankings for all of our keywords, including searches like "bbq guys". This indicated to us that we were being penalized for something. We immediately went through the list of things that had changed, and the most obvious was that we were migrating domains. On Thursday, we turned off one of our older sites, http://www.thegrillstoreandmore.com/, and 301 redirected each page on it to the same page on bbqguys.com. Our intent was to eliminate duplicate content issues. When we realized that something bad was happening, we immediately turned off the redirects and put thegrillstoreandmore.com back online. This did not unpenalize bbqguys.

We've been looking for things for two days and had not been able to find what we did wrong, at least not until tonight. I just logged back in to Webmaster Tools to do some more digging, and I saw that I had a new message: "Google Webmaster Tools notice of detected doorway pages on http://www.bbqguys.com/". It is my understanding that doorway pages are pages jammed with keywords and links and devoid of any real content. We don't do those pages. The message does link me to Google's definition of doorway pages, but it does not give me a list of pages on my site that it does not like. If I could see even one or two pages, I could probably figure out what I am doing wrong. I find this most shocking since we go out of our way not to do anything spammy or sneaky. Since we try hard not to do anything that is even grey hat, I have no idea what could have triggered this message and the penalty.

Does anyone know how to figure out which pages specifically are causing the problem, so I can change them or take them down? We are slowly canonicalizing URLs and changing the way different parts of the sites build links to make them all the same, and I am aware that these things need work. We were in the process of discontinuing some sites and 301 redirecting pages to a more centralized location to try to stop duplicate content. The day after we instituted the 301 redirects, the site we were redirecting all of the traffic to (the main site) got blacklisted. Because of this, we immediately took down the 301 redirects.

Since the Webmaster Tools notifications are different (i.e., too many URLs is a notice-level message and doorway pages is a separate alert-level message), and the too-many-URLs notice has been triggering for a while now, I am guessing that the doorway pages problem has nothing to do with URL structure. According to the help files, doorway pages are a content problem with a specific page. The architecture suggestions are helpful and reassure us that we should be working on them, but they don't help me solve my immediate problem.

I would really be thankful for any help identifying the pages that Google thinks are "doorway pages", since this is what I am being immediately and severely penalized for. I want to stop doing whatever it is I am doing wrong; I just don't know what it is! It feels like we got penalized for trying to do what we think Google wants. If we could figure out what a "doorway page" is, and how our 301 redirects triggered Googlebot into saying we have them, we could more appropriately reduce duplicate content. As it stands now, we are not sure what we did wrong. We know we have duplicate content issues, but we also thought we were following webmaster guidelines on how to reduce the problem, and we got nailed almost immediately when we instituted the 301 redirects. Thanks for any help identifying the problem!
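If it helps anyone diagnose this, here is roughly how we could sanity-check the redirects themselves, i.e. that each old URL returns a single 301 to its exact counterpart (the paths below are hypothetical examples; assumes Python's requests package):

```python
# Minimal sketch: verify each old URL issues exactly one 301 to its intended
# counterpart. Domains are from the question above; the paths are hypothetical.
import requests

pairs = [
    ("http://www.thegrillstoreandmore.com/grills", "https://www.bbqguys.com/grills"),
    ("http://www.thegrillstoreandmore.com/smokers", "https://www.bbqguys.com/smokers"),
]

for old, expected in pairs:
    r = requests.get(old, allow_redirects=False, timeout=10)
    target = r.headers.get("Location", "")
    status = "OK" if r.status_code == 301 and target == expected else "CHECK"
    print(f"{status}: {old} -> {r.status_code} {target}")
```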
White Hat / Black Hat SEO | CoreyTisdale