Facebook "lockout"
-
I'm not sure what the correct term is, but I've visited websites that require me to like page 1 of an article in order to view page 2. A little annoying, but fair enough: they wrote the content, and I clearly find it of value since I want page 2.
I run a download website with user-generated content. We used to allow downloads only to members, which resulted in 5,000+ new signups per day and a massive userbase.
We now allow guests to download content; the majority are freeloaders who don't even leave a thank you for the artist.
I am about to deploy a system for guests that forces them to like, tweet, or G+ the download before it will begin. If they don't, no download.
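Roughly, the client-side piece would work like this (an untested sketch using the Facebook JS SDK's edge.create Like event; the element ID and unlock helper are placeholders for my own code, and the real file URL would only be released server-side, since client-side checks can be bypassed):

```typescript
// Minimal declaration for the Facebook JS SDK global (loaded separately).
declare const FB: {
  Event: {
    subscribe(event: string, callback: (href: string) => void): void;
  };
};

// Placeholder helper: swap the locked link for the real download.
// In practice this would call our backend, which verifies the like
// before handing out the file.
function unlockDownload(): void {
  const link = document.getElementById("download-link") as HTMLAnchorElement | null;
  if (!link) return;
  link.classList.remove("locked");
  link.textContent = "Your download is ready";
}

// 'edge.create' fires when a visitor clicks Like on the embedded button.
FB.Event.subscribe("edge.create", (likedUrl: string) => {
  unlockDownload();
});
```

Tweet and G+ buttons expose similar client-side callbacks, so each channel could flip the same server-verified flag.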
Are there any SEO considerations here? The page this will be implemented on isn't a crawlable page.
Cheers.
-
I don't see any glaring SEO implications with this method, especially since the page it's implemented on isn't indexed. The biggest issue I see would be the user experience. You might see a slowdown in downloads from this, but it sounds like that may not be a problem for you.
Hope that helps!
-
Hi Eliathah, I don't use WP, so it will be a custom job.
The page is noindex, but it is crawlable. Just not indexable. I made a mistake in my original post.
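To make the setup concrete, the gate page is served along these lines (a simplified Node-style sketch for illustration; the path and port are placeholders):

```typescript
import { createServer } from "http";

createServer((req, res) => {
  if (req.url === "/download-gate") {
    // Not blocked in robots.txt, so the page stays crawlable,
    // but the X-Robots-Tag header keeps it out of the index.
    res.writeHead(200, {
      "Content-Type": "text/html",
      "X-Robots-Tag": "noindex",
    });
    res.end("<!-- like/tweet/G+ gate markup goes here -->");
    return;
  }
  res.writeHead(404);
  res.end();
}).listen(8080);
```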
-
Are you asking for suggestions on plugins? If so, there's a great plugin for that by WPMU: http://premium.wpmudev.org/project/pay-with-a-like/
As far as SEO goes, if the page isn't crawlable anyway, I don't see how it could affect you.
I might be missing the point of the question.
Related Questions
-
Best way to "Prune" bad content from large sites?
I am in the process of pruning my sites for low-quality/thin content. The issue is that I have multiple sites with 40k+ pages and need a more efficient way of finding the low-quality content than looking at each page individually. Is there an ideal way to find the pages that are worth noindexing that will speed up the process but not potentially harm any valuable pages? The current plan of action is to pull data from analytics; if a URL hasn't brought any traffic in the last 12 months, then it is safe to assume it is a page that is not beneficial to the site. My concern is that some of these pages might have links pointing to them, and I want to make sure we don't lose that link juice. But, assuming we just noindex the pages, we should still have the authority pass along... and in theory, the pages that haven't brought any traffic to the site in a year probably don't have much authority to begin with. Recommendations on the best way to prune content on sites with hundreds of thousands of pages efficiently? Also, is there a benefit to noindexing the pages vs. deleting them? What is the preferred method, and why?
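For concreteness, the traffic-filtering step in the plan above might be scripted like this (a sketch only; it assumes a two-column "url,sessions" CSV export covering the last 12 months, and a real run would need proper CSV parsing plus a backlink check before noindexing anything):

```typescript
import { readFileSync } from "fs";

// Assumed export format: "url,sessions" with a header row.
const rows = readFileSync("landing-pages-12-months.csv", "utf8")
  .trim()
  .split("\n")
  .slice(1); // skip the header

const pruneCandidates = rows
  .map((line) => {
    const [url, sessions] = line.split(",");
    return { url, sessions: Number(sessions) };
  })
  // Zero sessions in 12 months = candidate for noindex, pending a
  // check for inbound links so no link equity is thrown away.
  .filter((row) => row.sessions === 0)
  .map((row) => row.url);

console.log(`${pruneCandidates.length} candidate pages`);
```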
Intermediate & Advanced SEO | atomiconline
-
Using "nofollow" internally can help with crawl budget?
Hello everyone. I was reading this article on semrush.com, published last year, and I'd like to know your thoughts about it: https://www.semrush.com/blog/does-google-crawl-relnofollow-at-all/ Is that really the case? I thought that Google crawls and "follows" links tagged as nofollow even though it doesn't pass any PR to the destination link. If instead Google really doesn't crawl internal links tagged as "nofollow", can that really help with crawl budget?
Intermediate & Advanced SEO | fablau
-
Brackets vs Encoded URLs: The "Same" in Google's eyes, or dup content?
Hello, this is the first time I've asked a question here, but I would really appreciate the advice of the community - thank you, thank you! Scenario: internal linking is pointing to two different versions of a URL, one with brackets [] and the other with the brackets encoded as %5B%5D.
Version 1: http://www.site.com/test?hello[]=all&howdy[]=all&ciao[]=all
Version 2: http://www.site.com/test?hello%5B%5D=all&howdy%5B%5D=all&ciao%5B%5D=all
Question: Will search engines view these as duplicate content? Technically there is a difference in characters, but it's only because one version encodes the brackets and the other does not (see: http://www.w3schools.com/tags/ref_urlencode.asp). We are asking the developer to encode ALL URLs because this seems cleaner, but they are telling us that Google will see zero difference. We aren't sure if this is true, since engines can get hung up on even a single difference in character. We don't want to unnecessarily fracture the internal link structure of the site, so again, any feedback is welcome, thank you. 🙂
Intermediate & Advanced SEO | mirabile
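For what it's worth, the character-level equivalence is easy to verify (a sketch; whether Google actually normalizes the two forms in practice is exactly the open question above):

```typescript
const version1 = "http://www.site.com/test?hello[]=all&howdy[]=all&ciao[]=all";
const version2 =
  "http://www.site.com/test?hello%5B%5D=all&howdy%5B%5D=all&ciao%5B%5D=all";

// %5B%5D is simply the percent-encoded form of []...
console.log(encodeURIComponent("[]")); // "%5B%5D"

// ...so decoding Version 2 yields Version 1 exactly.
console.log(decodeURIComponent(version2) === version1); // true
```
-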
Disabling a website - What to do with "link juice"?
Hi, I built a website for a client a long time ago and, for a number of reasons, I have decided to shut down the website. Non-payment being one of the reasons! My question to all you SEO gurus out there is: what should I do with 301 redirects? The site is an e-commerce website, while my personal website simply advertises my services and portfolio. If I 301 redirect all the traffic from the customer's website, will there be any issue with Google (or any search engine) seeing that my website is receiving traffic for search phrases such as "coffee mugs", i.e. with absolutely no relevance at all to my website content? My worry is my site could be penalised for a flurry of thousands of redirected links. Also, if I redirect everything to my site and the customer decides to pay the bill in due course, I will then remove the redirects - I guess this will have a massive impact on the rankings of the site? Thanks for reading and any advice.
Intermediate & Advanced SEO | yousayjump
-
Link Building for "State" informational pages
I have a webpage for each of the 50 states with specific info relating to relocation, and I was wondering if there are any recommended types of links to work at getting for these pages. I would like to pursue "state"-specific and possibly health-related links for each page to help the SEO rankings. I can see that if I just wanted to get 10 links on each page, that is going to be 500 links I have to build, which is going to be very time-consuming, but I feel it is necessary. Thank you, Poo
Intermediate & Advanced SEO | Boodreaux
-
Avoiding 301 on purpose; Landing homepage linking to another domain with "Click here to go" and 5 sec meta refresh
Hello, Some users, when they search for our site using the "ourbrand" keyword, ignore the first result (we will call it here ourbrand.de - not the real name) and look for ourbrand.com. Even though we also have that domain registered (indeed, it also has high ranking power), we are doing a 301 from the dot-com to the dot-de. What we want to do is index the homepage of the dot-com, that is http://www.ourbrand.com , as a secondary result, while 301ing any other internal URL of the dot-com to the dot-de. Yes, we will lose link juice for the main domain, but at least we will not lose visits from the brand traffic (which is our main traffic). So the question is: would Google index ourbrand.com if we show just a landing page with our logo, a "Click here to go to ourbrand.de" link pointing to http://www.ourbrand.de , and a meta refresh of 6 seconds to that URL? Additionally, a cookie would be set for first-time visitors, so on their next visit they would be automatically redirected. PS: The 6 seconds is to avoid search engines considering it a "301", as they do with short meta refreshes (I'm not sure what the minimum delay is to avoid being treated as a 301). Any other suggestions on how to deal with this problem are welcome.
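For concreteness, here is the behaviour being described, sketched as a Node-style handler (illustration only; the cookie name is a placeholder and the domains are the stand-ins used above):

```typescript
import { createServer } from "http";

createServer((req, res) => {
  // Returning visitors who already saw the landing page get sent on.
  // A 302 keeps the hop temporary; a permanent 301 is what the
  // question is trying to avoid for this URL.
  if ((req.headers.cookie || "").includes("seen_landing=1")) {
    res.writeHead(302, { Location: "http://www.ourbrand.de/" });
    res.end();
    return;
  }

  // First-time visitors: logo, a visible link, and a 6-second refresh.
  res.writeHead(200, {
    "Content-Type": "text/html",
    "Set-Cookie": "seen_landing=1; Max-Age=31536000; Path=/",
  });
  res.end(
    '<meta http-equiv="refresh" content="6;url=http://www.ourbrand.de/">' +
    '<p><a href="http://www.ourbrand.de/">Click here to go to ourbrand.de</a></p>'
  );
}).listen(8080);
```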
Intermediate & Advanced SEO | Zillo
-
"Original Content" Dynamic Hurting SEO? -- Strategies for Differentiating Template Websites for a Nationwide Local Business Segment?
The Problem
I have a stable of clients spread around the U.S. in the maid service/cleaning industry. Each client is a franchisee, but their business is truly 'local', with a local service area, local phone/address, unique business name, and virtually complete control over their web presence (URL, site design, content; apart from a few branding guidelines). Over time I've developed a website template with a high lead conversion rate, and I've rolled this website out to 3 or 4 dozen clients. Each client has exclusivity in their region/metro area. Lately my white-hat backlinking strategies have not been yielding the results they were one year ago, including legitimate directories, customer blogging (as compelling as maid service/cleaning blogs can really be!), and some article writing. This is expected, or at least reflected in articles on SEO trends and directory/article strategies. I am writing this question because I see sites with seemingly much weaker backlink profiles outranking my clients (using the SEOmoz toolbar and Site Explorer stats, and factoring in general quality vs. quantity dynamics).
Questions
Assuming general on-page optimization and linking factors are equal: Might my clients be suffering because they're using my oft-repeated template website (albeit with some unique 'content' variables)? If I choose to differentiate each client's website, how much differentiation makes sense? Specifically: even if primary content (copy, essentially) is differentiated, will Google still interpret the matching code structure as 'the same website'? Are images as important as copy in differentiating content? From a 'machine' or algorithm perspective evaluating unique content, I wonder whether strategies such as saving the images in a different format, altering them slightly in Photoshop, or using unique CSS selectors or slightly different table structures for each site (differentiating the code) will be effective?
Considerations
My understanding of Google's "duplicate content" dynamics is that they mainly apply to de-duping search results at a query-specific level and choosing which result to show from a pool of duplicate results. My clients' search terms most often contain client-specific city and state names. Despite the "original content" mantra, I believe my clients, being local businesses who have opted to use a template website (an economical choice), still represent legitimate and relevant matches for their target users' searches. It is in this spirit I ask these questions, not to 'game' Google with malicious intent. In an ideal world my clients would each have their own unique website developed, but these are Main St business owners balancing solutions with economics, and I'm trying to provide them with scalable solutions.
Thank You!
I am new to this community; thank you for any thoughts, discussion and comments!
Intermediate & Advanced SEO | localizedseo
-
So what exactly does Google consider a "natural" link profile?
As part of my company's ongoing SEO effort we have been analyzing our link profile. A colleague of mine feels that we should be targeting at least 50% branded anchor text. He claims this is what search engines consider "natural" and we should not go past a threshold of 50% optimized anchor text to make sure we avoid any penalties or decrease in rankings. 50% brand term anchor text seems too high to me. I pointed out that most of our competitors who outrank us have a much greater percentage of optimized links. I've also read other industry experts state that somewhere in the range of 30% branded anchor text would be considered natural. What percent of branded vs. optimized anchor text do you feel looks "natural" and what do you base your opinion on?
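For reference, the ratio being debated is easy to measure from a backlink export (a sketch with made-up anchors; the brand term and data are placeholders):

```typescript
// Hypothetical anchor texts pulled from a backlink export.
const anchors = ["Acme Co", "acme", "buy blue widgets", "cheap widgets online", "acmeco.com"];
const brandTerms = ["acme"];

// Count anchors containing any brand term.
const branded = anchors.filter((anchor) =>
  brandTerms.some((term) => anchor.toLowerCase().includes(term))
).length;

// 3 of the 5 anchors contain the brand term here.
console.log(`${((branded / anchors.length) * 100).toFixed(0)}% branded`); // "60% branded"
```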
Intermediate & Advanced SEO | DeannaTallman