I have 100+ Landing Pages I use for PPC... Does Google see this as a blog farm?
-
I am currently using about 50-100 domains as geotargeted landing pages for my PPC campaigns. All of these pages have essentially the same content, are (I believe) hosted on a single IP address, and all link back to my main URL. I am not using these pages for SEO at all, as I know they will never achieve any significant SEO value. They are simply designed to generate a higher conversion rate for my PPC campaigns, because they are state and city domains. My question is: does Google see this as a blog/link farm, and if so, what should I do about it? I don't want to lose any potential rankings they may be giving my site, if any at all, but if they are hurting my main URL's SEO performance, then I want to know what to do about it.
Any advice would be much appreciated!
-
Hi jfishe1988, could I ask: do you have analytics on these pages? In your initial question you suggest that the pages don't have any organic value, but in responding to Alick you say that lots of them get organic rankings.
As a couple of broader points:
- Google has given conflicting advice about how low-quality links are treated. Gary Illyes has said that low-quality links are simply ignored (which makes sense as a way to neutralize negative-SEO tactics), so this may not be something you need to worry about if you aren't relying on these 100-odd domains to channel link equity to the site you want to rank. However, John Mueller has said that the disavow file still has value - as I say, conflicting advice.
- If the only purpose of these landing pages is hyper-targeted PPC and you don't want any link equity coming from them in case it poisons the main site, you could consider adding a nofollow directive to those pages. That should mean Google ignores all links from those pages to your main site. It would insulate your main site from any link-network-based penalties, but it would also remove any benefit the main site might currently be getting from any backlinks those domains have.
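For reference, a nofollow can be applied per link or page-wide; a minimal sketch (the domain below is a placeholder):

```html
<!-- Per-link: asks Google not to pass equity through this specific link -->
<a href="https://www.example-main-site.com/" rel="nofollow">Main site</a>

<!-- Page-wide: applies nofollow to every link on the landing page -->
<meta name="robots" content="nofollow">
```

Either form should stop a landing page from passing link equity to the main URL, without noindexing the page itself.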
-
This doesn't really answer my question. I have PPC landing pages that all link to my main URL. I guess my question is: should I disavow these links back to my home page? The landing pages are all set up on a different IP address than my main URL. Lots of them get organic rankings. I don't want to noindex them. I'm just wondering if it would be best to nofollow the links back to my home domains.
Please help.
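For anyone weighing the disavow route: the disavow file Google Search Console accepts is just a plain-text list, one domain or URL per line, with # for comments. A minimal sketch with placeholder domains:

```text
# Disavow all links from these geo landing-page domains
domain:dallas-example-landers.com
domain:austin-example-landers.com
# Or disavow an individual URL
http://example-lander.com/some-page.html
```

Note that a disavow only discounts links pointing at the site whose profile you upload it to; it doesn't noindex or otherwise touch the landing pages themselves.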
-
Hi,
Check PPCBossman's reply in the post above. I follow the same approach for my PPC landing pages.
Hope this helps.
Thanks
Related Questions
-
High ranking nationally but not locally on Google
A website I am working on ranks very well for all tracked keywords at a national level, but not locally on Google. I find it weird that the site is on the first page if you search from many other states/towns/locations, but not locally. I looked in Google Search Console and couldn't find any clue as to why this is happening. We figured we would clear out the .htaccess for any redirect issues and hope that fixes it. Suggestions, please? I've never seen Google do this. It is strange.
Googlebot crawling an AJAX website doesn't always use _escaped_fragment_
Hi, I started to investigate the Googlebot crawl log of our website, and it appears that there is no 1:1 correlation between a URL crawled with _escaped_fragment_ and without it.
My expectation is that each time Google crawls a URL, a minute or so later it is supposed to crawl the same URL using _escaped_fragment_. For example:
Googlebot crawl log for https://my_web_site/some_slug
Results: Googlebot crawled this URL 17 times in July: http://i.imgur.com/sA141O0.jpg. Googlebot crawled this URL an additional 3 times using _escaped_fragment_: http://i.imgur.com/sOQjyPU.jpg. Do you have any idea if this behavior is normal? Thanks, Yohay
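For context, under Google's AJAX crawling scheme (since deprecated), the mapping works roughly like this - the URLs below are placeholders based on the example above:

```text
# A page with no #! in its URL opts in via a meta tag in its <head>:
<meta name="fragment" content="!">
# Googlebot should then also fetch the "ugly" version:
https://my_web_site/some_slug?_escaped_fragment_=

# Hashbang URLs map automatically:
https://my_web_site/#!/some_slug  ->  https://my_web_site/?_escaped_fragment_=/some_slug
```

If only some crawls use the _escaped_fragment_ form, that on its own isn't necessarily abnormal; the scheme never promised a fixed 1:1 ratio between the two request types.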
Google URL Shortener: should I use one or multiple?
I have a client with a number of YouTube videos. I'm using Google URL Shortener so the link fits in the YouTube description text (as it's a long URL). Many of these links go to the same page, e.g. .com/services-page. Should I use a single short URL for every video linking to .com/services-page, or should they be unique each time? If unique, would Google possibly think I'm trying to manipulate results? Thanks in advance. I'm just not sure on this one and hope someone knows best practice here. Thanks!
A doorway-page vendor has made my SEO life a nightmare! Advice anyone!?
Hey Everyone, So I am the SEO at a mid-sized nationwide retailer and have been working there for almost a year and a half. This retailer is an SEO nightmare. Imagine the worst possible SEO nightmare, and that is my unfortunate yet challenging everyday reality. In light of the new algorithm update that seems to be on the horizon from Google to further crack down on the use of doorway pages, I am coming to the Moz community for some desperately needed help.
Before I was employed here, the eCommerce director and SEM manager connected with a vendor that told them, basically, that they can do a PPC version of SEO for long-tail keywords. This vendor sold them on the idea that they will never compete with our own organic content and can bring in incremental traffic and revenue thanks to all of their wonderful technology, which is essentially just a scraper. So for the past three years, this vendor has been creating thousands of doorway pages that are hosted on their own server but are masked as our own pages. They have a massive HTML index/directory attached to our website and even upload their own XML sitemaps to our Google Webmaster Tools. So even though they "own" the pages, the pages masquerade as our own organic pages.
What we have today is thousands upon thousands of product and category pages that are essentially built dynamically and regurgitated through their scraper/platform, whatever. ALL of these pages are incredibly thin in content, and it's beyond me how Panda has not exterminated them. ALL of these pages are built entirely for search engines, to the point that you would feel like the year was 1998. All of these pages are incredibly over-optimized with spam that is really equivalent to just stuffing in a ton of meta keywords (like I said - 1998).
Almost ALL of these scraped doorway pages cause an incredible amount of duplicate-content issues, even though the "account rep" swears up and down to the SEM manager (who oversees all paid programs) that they do not. Many of the pages use other shady tactics, such as meta-refresh-style bait and switching. For example: the page title in the SERP shows as "Personalized Watch Boxes". When you click the SERP and land on the doorway page, the title changes to "Personalized Wrist Watches". Not one actual watch box is listed. They are ALL simply the most god-awful pages in terms of UX that you will ever come across, BUT because of the sheer volume of these pages spammed deep within the site, they generate revenue just by playing the odds. Executives LOVE revenue.
Also, one of this vendor's tactics when our budget spend is reduced for this program is to randomly pull a certain number of their pages and return numerous 404 server errors until spend bumps back up. This causes a massive nightmare for me.
I can go on and on, but I think you get where I am going. I have spent a year and a half campaigning to get rid of this black-hat vendor, and I am finally right on the brink of making it happen. The only problem is that it will be almost impossible not to drop in revenue for quite some time when these pages are pulled. Even though I have helped create several organic pages and product categories that will pick up the slack, it will still be a while before the dust settles and stabilizes.
I am going to stop here because I could write a novel about the millions of issues I have with this vendor and what they have done. I know this was a very long and open-ended essay of a problem, and I apologize - I would love to clarify anything I can. My actual questions are:
Has anyone gone through a similar situation, or do you have experience dealing with a vendor that employs this type of black-hat tactic?
Is there any advice at all that you can offer, or experiences you can share, that can help me be as armed as possible when I eventually convince the higher-ups that they need to pull the plug?
How can I limit the bleeding, and can I even remotely rely on Google LSI to serve my organic pages for the related terms of the pages that are now gone?
Thank you guys so much in advance, -Ben
Do I need to use meta noindex for my new website before migration?
I just want to know your thoughts: is it necessary to add a meta noindex,nofollow tag to each page of my new website before migrating the old pages to the new pages under a new domain? Would it be better to just add a block in robots.txt and then remove it once we launch the new website? Thanks!
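For reference, the two options mentioned look like this; the key difference is that a robots.txt block stops crawling (so Google may never see a meta tag on the page), while meta noindex requires the page to be crawlable in order to work:

```text
# Option A - meta tag in the <head> of every page on the new domain:
<meta name="robots" content="noindex, nofollow">

# Option B - robots.txt at the new domain's root (blocks crawling,
# but blocked URLs can still be indexed if they are linked to):
User-agent: *
Disallow: /
```

Whichever you use, remember to remove it at launch; a forgotten Disallow: / or stray noindex is a classic migration mistake.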
Google Places vs. a position-one ranking above the Places listings
Hi Guys, Will creating a new Google Places listing for a business have any effect on their current position-one spot for their major geo-location keyword? E.g. "restaurants perth" - say they rank No. 1 above all the Places listings. If they set up a Places listing, would they lose that position and merge with all the other Places accounts, or would they keep that ranking as well as the Places listing? I have been advised it could be detrimental to set up the Places account. If this is the case, does anyone know any way around this issue, as the business really needs a Places page for Google Maps etc.? Appreciate some guidance. Thanks. BC
Article pages not ranking as well as they should
Hello, our articles here are not ranking as strongly as they should. Could you take a look and tell me why? When I search for the exact article title, we do not come up; we used to. Note that our sitewide footer links to some articles, in case that's the problem, but even articles not in the footer links aren't performing.
Why did Google reject us from Google News?
I submitted our site, http://www.styleblueprint.com, to Google to potentially be a local news source in Nashville. I received the following note back:
"We reviewed your site and are unable to include it in Google News at this time. We have certain guidelines in place regarding the quality of sites which are included in the Google News index. Please feel free to review these guidelines at the following link: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=35769#3"
Clicking the link, it anchors to the section that says:
"These quality guidelines cover the most common forms of deceptive or manipulative behavior, but Google may respond negatively to other misleading practices not listed here (e.g. tricking users by registering misspellings of well-known websites). It's not safe to assume that just because a specific deceptive technique isn't included on this page, Google approves of it. Webmasters who spend their energies upholding the spirit of the basic principles will provide a much better user experience and subsequently enjoy better ranking than those who spend their time looking for loopholes they can exploit."
etc...
Now, we have never intentionally tried to do anything deceptive for our rankings. I am new to SEOmoz and new to SEO optimization in general. I am working through the errors report on our campaign site, but I cannot tell what they are dinging us for. Whatever it is, we will be happy to fix it. All thoughts greatly appreciated. Thanks in advance, Jay