Facebook "lockout"
-
I'm not sure what the correct term is, but I've visited websites that require me to like page 1 of an article in order to view page 2. A little annoying, but fair enough: they wrote the content, and I clearly find it of value since I want page 2.
I run a download website with user-generated content. We used to allow downloads only to members, which resulted in 5,000+ new signups per day and a massive user base.
We now allow guests to download content, but the majority are freeloaders who don't even leave a thank-you for the artist.
I am about to deploy a system for guests that forces them to like, tweet, or +1 the download before it will begin. If they don't, no download.
Are there any SEO considerations here? The page this will be implemented on isn't a crawlable page.
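For context, a "share to unlock" gate like this can be sketched client-side. Everything below is illustrative, not a specific site's implementation: the element ID, the state object, and the event wiring are all assumptions.

```javascript
// Minimal client-side sketch of a "share to unlock" download gate.
// The element ID and network names here are hypothetical.

// Pure gate logic: one completed share action unlocks the download.
function isUnlocked(state) {
  return Boolean(state.liked || state.tweeted || state.plusOned);
}

var shareState = { liked: false, tweeted: false, plusOned: false };

function onShare(network) {
  shareState[network] = true;
  if (isUnlocked(shareState)) {
    // Only now reveal the real download URL -- ideally fetched from the
    // server as a short-lived signed link, so the gate can't be bypassed
    // simply by reading the page source.
    document.getElementById('download-link').style.display = 'block';
  }
}

// Example wiring for the Facebook JS SDK of the era, which fired an
// 'edge.create' event when a visitor clicked a Like button:
//   FB.Event.subscribe('edge.create', function () { onShare('liked'); });
// Twitter's and Google+'s share widgets exposed similar callbacks.
```

Note that anything enforced purely in JavaScript is trivially bypassable; serving the actual file link only after a server-side check is the safer design.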
Cheers.
-
I don't see any glaring SEO implications with this method, especially since the page it's implemented on isn't indexed. The biggest issue I see would be the user experience: you might see a slowdown in downloads from this, but it sounds like that may not be a problem for you.
Hope that helps!
-
Hi Eliathah, I don't use WP, so it will be a custom job.
The page is noindex, but it is crawlable. Just not indexable. I made a mistake in my original post.
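For anyone reading later, the "crawlable but not indexable" distinction comes down to where the directive lives. A minimal sketch (the directive itself is standard; its placement on this particular download page is the assumption):

```html
<!-- In the <head> of the download page: Googlebot can still fetch
     (crawl) this page, but is told not to add it to the index. -->
<meta name="robots" content="noindex">
```

By contrast, blocking the URL in robots.txt would make the page non-crawlable, and Google would then never see the noindex directive at all.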
-
Are you asking for suggestions on plugins? If so, there's a great plugin for that by WPMU: http://premium.wpmudev.org/project/pay-with-a-like/
As far as SEO goes, if the page isn't crawlable anyway, I don't see how it could affect you.
I might be missing the point of the question.