Links to Facebook pages
-
I would like to ask if anyone has any knowledge regarding linking to a company's Facebook page. I have built a few links to a client's Facebook page in an effort to have it rank better in the SERPs.
I just learned that, unlike Twitter and LinkedIn, it is apparently not possible to link directly to Facebook pages. At least it is not possible from a search engine's perspective: if you follow a link to any Facebook page while you are not logged in to Facebook, you are redirected to the Facebook home page.
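You can reproduce what a logged-out visitor (or a crawler) sees by requesting the URL with no session cookies and following the redirect chain. Here is a minimal sketch in Python using the third-party requests library; the page URL is a placeholder, not my client's actual page:

```python
import requests

# Fetch the page with no Facebook session cookies, the way a crawler or a
# logged-out visitor would, and follow any redirects along the way.
url = "https://www.facebook.com/example-page"  # placeholder, not the real page
resp = requests.get(url, allow_redirects=True, timeout=10)

print("Requested:", url)
print("Ended at:", resp.url)
print("Redirect chain:", [r.url for r in resp.history])

# If we end up on the Facebook home page (or a login page) rather than the
# page we asked for, the page is not directly reachable while logged out.
if resp.url.rstrip("/") != url.rstrip("/"):
    print("Not directly reachable without logging in.")
```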
I can't think of any way around this obstacle. I'd love a clever solution, such as a URL that includes a basic dummy Facebook login, but I am not aware of anything that achieves this result.
Does anyone have any ideas on this topic?
-
I figured out the root issue. The site's Facebook page had an Age Rating of "alcohol related". It seems that if you have any age setting other than "13+", the page is not publicly visible unless you are logged in to Facebook. Mystery solved.
Thanks to all for sharing your ideas.
-
Thanks, Mike.
Yes, I am using the identical URL, and it works perfectly when I am logged in. The moment I log out, I am taken from the client's Facebook page to the Facebook home page. I am very familiar with the need to have 25 likes to obtain a named page. He definitely has a named page, and it works.
-
Thanks for the feedback, Joshua.
I advise each client to establish a Facebook page if they do not already have one. Once the page is created, I also advise them to immediately gather 25 likes and secure their company name for the Facebook page. The reasons are many and include the reputation management factors you mentioned.
What I am finding interesting is that all Facebook pages seem to be directly linkable without being logged in to Facebook, while my client's page is not.
The link above works fine. I have also tried Facebook links for company pages newer than my client's page, and they work fine. For some reason, my client's URL is redirected to the Facebook home page when the user is not logged in. The issue is apparently confined to my client's Facebook URL. I am trying to determine the cause.
-
Hey Ryan,
Are you making sure to use the custom Facebook URL for your client's page? You'll need at least 25 Likes before you can do that.
Check it out (try clicking without being logged in): http://www.facebook.com/the.bacc
(shameless plug to prove a point!)
-
Ryan,
We have built links directly to clients' Facebook pages while doing reputation management for them, and the pages did indeed climb in the SERPs. I am not sure if this is correlation or causation, since we also began managing those clients' social media around the same time; it could simply be that increased engagement on their Facebook pages helped them rank. Obviously, if you get the custom URL from Facebook after reaching 25 followers, and focus the page on the company name as its branded keyword, that will help to an extent. Hope that helps a bit. Take care.
Related Questions
-
Paginated Pages Page Depth
Hi Everyone, I was wondering how Google counts page depth on paginated pages. DeepCrawl is showing our primary pages as being 6+ levels deep, but without the blog, or with an infinite scroll on the /blog/ page, I believe it would be only 2 or 3 levels deep. Using Moz's blog as an example, is https://moz.com/blog?page=2 treated as being on the same level, in terms of page depth, as https://moz.com/blog? If so, is it the <link rel="prev" href="https://site.com/blog" /> and <link rel="next" href="https://site.com/blog?page=3" /> markup that helps Google recognize this? Or does Google treat page depth the same way DeepCrawl shows it, with the blog posts on page 2 being +1 in page depth compared to the ones on page 1, for example? Thanks, Andy
Intermediate & Advanced SEO | AndyRSB
-
Should I remove all vendor links (link farm concerns)?
I have a web site that has been around for a long time. The industry we serve includes many, many small vendors and, back in the day, we decided to allow those vendors to submit their details, including a link to their own web site, for inclusion on our pages. These vendor listings were presented on location (state) pages as well as more granular pages within our industry (we called them "topics"). I don't think it's important any more, but 100% of the vendors listed were submitted by the vendors themselves, rather than us "hunting down" links for inclusion or automating this in any way. Some of the vendors (I'd guess maybe 10-15%) link back to us, but many of these are mom-and-pop sites and would have extremely low authority.

Today the list of vendors is in the thousands (US only). But the database is old and not maintained in any meaningful way. We have many broken links and I believe, rightly or wrongly, we are considered a link farm by the search engines. The pages on which these vendors are listed use dynamic URLs of the form /vendors/<state>-<topic>. The combination of states and topics means we have hundreds of these pages, and they thus form a significant percentage of our pages. And they are garbage 🙂 So, not good.

We understand that this model is broken. Our plan is to simply remove these pages (with the list of vendors) from our site. That's a simple fix, but I want to be sure we're not doing anything wrong here from an SEO perspective. Is it as simple as that, just removing these pages? How much effort should I put into redirecting (301) these removed URLs? For example, I could spend effort making sure that /vendors/California-<topic> (and likewise for every state) goes to a general topic page, which still has relevance but won't have any vendors listed.

I know there is no distinct answer to this, but what expectation should I have about the impact of removing these pages? Would the removal of a large percentage of garbage pages (leaving much better content) be expected to be a major factor in SEO? Anyway, before I go down this path I thought I'd check here in case I'm missing something. Thoughts?
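As an illustration of the pattern-based 301 mapping described above, here is a minimal sketch in Python; the /vendors/ and /topics/ paths are hypothetical placeholders standing in for the real URL structure:

```python
import re
from typing import Optional

# Hypothetical old URL pattern: /vendors/<state>-<topic>
VENDOR_URL = re.compile(r"^/vendors/(?P<state>[A-Za-z]+)-(?P<topic>[\w-]+)$")

def redirect_target(path: str) -> Optional[str]:
    """Return the 301 target for an old vendor URL, or None if it doesn't match."""
    match = VENDOR_URL.match(path)
    if match is None:
        return None
    # Drop the state and send the visitor to the general topic page.
    return "/topics/" + match.group("topic")

# Every state variant of a topic collapses onto one topic page.
print(redirect_target("/vendors/California-plumbing"))  # /topics/plumbing
print(redirect_target("/vendors/Texas-plumbing"))       # /topics/plumbing
print(redirect_target("/contact"))                      # None (no redirect)
```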
Intermediate & Advanced SEO | MarkWill
-
We are switching our CMS local pages from a subdomain approach to a subfolder approach. What's the best way to handle this? Should we redirect every local subdomain page to its new subfolder page?
We are looking to move to a subfolder approach within our website versus our current subdomain approach. How should we go about handling this properly, so as not to lose everything we've worked on up to this point with the subdomain approach? Do we need to redirect every subdomain URL to its new subfolder page? Our current local pages subdomain set-up: stores.websitename.com. How we plan to structure our new local subfolder set-up: websitename.com/stores/state/city/storelocation. Any and all help is appreciated.
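In general the answer to the last question is yes: a one-to-one 301 from each subdomain URL to its subfolder equivalent preserves the most value. As an illustration only, here is a minimal sketch in Python of that URL mapping, assuming the path carries over unchanged under /stores/ (the host names come from the question; everything else is a placeholder):

```python
from urllib.parse import urlparse, urlunparse

OLD_HOST = "stores.websitename.com"
NEW_HOST = "websitename.com"

def new_url(old_url: str) -> str:
    """Map a subdomain URL to its subfolder equivalent, keeping the path:
    https://stores.websitename.com/state/city/storelocation
    becomes https://websitename.com/stores/state/city/storelocation
    """
    parts = urlparse(old_url)
    assert parts.netloc == OLD_HOST, "not a local-pages URL"
    return urlunparse(parts._replace(netloc=NEW_HOST, path="/stores" + parts.path))

print(new_url("https://stores.websitename.com/ca/san-diego/store-1"))
```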
Intermediate & Advanced SEO | SEO.CIC
-
Optimize Pages for Keywords Prior to Building Links?
Greetings Moz Community:

According to a site audit by a reputable SEO firm last November, my commercial real estate web site has a toxic link profile, which is very weak (about 58% of links qualified as toxic). The SEO firm suggests that we immediately start pruning the link profile, requesting removal of the toxic links and eventually filing a link disavow file with Google for links that webmasters will not agree to remove. While removing toxic links, the SEO firm proposes to simultaneously solicit very high quality links, aiming for 7-12 high quality links per month.

My question is the following: is it putting the cart before the horse to work on link building without optimizing pages (with Yoast) for specific keywords? I would think that Google considers how each page is optimized for specific terms: which terms are used within the link structure, as well as terms within the meta tags. My site is partially optimized, but optimization has never been done thoroughly. Should the pages of the site be optimized for the top 25-30 terms before link building begins, or can that be done at a later stage?

Note that my link profile is pretty atrocious. My site at the moment is receiving about 1,000 unique visitors a week from organic search; however, 70% of the traffic is from terms that are not relevant. The firm that did my audit claims that removal of the toxic links while building some new links is imperative, and that optimization for keywords can wait somewhat.

Any thoughts? Thanks for your assistance. Alan
Intermediate & Advanced SEO | Kingalan1
-
2 pages lost PageRank and are not showing any backlinks in Google
Hi, we have a business/service-related website. Two of our main pages dropped in PageRank from 3 to 0 and are not showing any backlinks in Google. What could be the possible reason? Please guide me.
Intermediate & Advanced SEO | Tech_Ahead
-
How to avoid too many "On Page Links"?
Hi everyone,

I don't seem to be able to keep big G off my back, even though I do not engage in any black hat or excessive optimization practices. Due to another unpleasant, heavy SERP "fluctuation", I am in investigation mode yet again and want to take a closer look at one of the warnings within the SEOmoz dashboard: "Too many on-page links". Looking at my statistics, this is clearly the case. I wonder how you can even avoid that at times. I have a lot of information on my homepage that links out to subpages. I get the feeling that even the links within the roll-over (or dropdown) menus are counted; of course, in that case you end up with a crazy number of on-page links. What about blog-like news entries on your homepage that link to other pages as well? And not to forget the links that result from the tags underneath a post.

What am I trying to get at? Well, do you feel that a bad website template may cause this issue, i.e. are the links from roll-over menus counted as links on the homepage even though they are not directly visible? I am not sure how to cut down on the issue, as the sidebar modules are present on every page and thus raise the link count wherever you are on the site. On another note, I've seen plenty of homepages with excessive information and links going out; would they be suffering from the search engines' hammer too? How do you manage the too-many-on-page-links issue? Many thanks for your input!
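For what it's worth, crawlers and audit tools generally count every anchor tag in the HTML source, whether or not CSS hides it until hover, which is why roll-over menus and sidebar modules inflate the count on every page. A minimal sketch in Python (standard library only; the URL is a placeholder) of counting anchors the way a parser sees them:

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class LinkCounter(HTMLParser):
    """Count <a href> tags in the raw HTML, the way a crawler does.
    Links hidden by CSS (dropdowns, roll-over menus) are still present
    in the source, so they are counted as well."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

html = urlopen("https://example.com/").read().decode("utf-8", errors="replace")
counter = LinkCounter()
counter.feed(html)
print("On-page links:", counter.count)
```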
Intermediate & Advanced SEO | Hermski
-
Why are new pages not being indexed while old pages (now in robots.txt) remain in the index?
I currently have a site that was recently restructured, causing much of its content to be reposted, creating new URLs for each page. To avoid duplicates, all of the existing pages were added to the robots.txt file. That said, it has now been over a week, and I know Google has recrawled the site, yet when I search for term X it is still the old page that is ranking, with the new one nowhere to be seen. I'm assuming it's a cached version, but why are so many of the old pages still appearing in the index? Furthermore, all "tags" pages (it's a Q&A site, like this one) were also added to the robots.txt a few months ago, yet I think they are all still appearing in the index. Anyone got any ideas about why this is happening, and how I can get my new pages indexed?
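One common explanation: robots.txt blocks crawling, not indexing. Once the old URLs are disallowed, Google can no longer fetch them, so it never sees the changes (or a noindex tag), and the stale entries linger in the index. A minimal sketch, using Python's standard-library robots.txt parser, of confirming which URLs are blocked (the domain and paths are placeholders):

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://example.com/robots.txt")
rp.read()

# A disallowed URL cannot be recrawled, so Google keeps whatever version it
# already had: blocking in robots.txt is not the same as deindexing.
for path in ("https://example.com/old-page", "https://example.com/new-page"):
    print(path, "fetchable by Googlebot:", rp.can_fetch("Googlebot", path))
```

If the goal is removal from the index, the usual route is to let the old pages be crawled and serve a noindex meta tag or a 301 to the new URL, rather than blocking them in robots.txt.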
Intermediate & Advanced SEO | corp0803
-
There's a website I'm working with that has a .php extension. All the pages do. What's the best practice to remove the .php extension across all pages?
Client wishes to drop the .php extension on all their pages (they've got around 2k pages). I assured them that wasn't necessary. However, in the event that I do end up doing this, what's the best-practice (and easiest) way to do it? This is also a WordPress site. Thanks.
Intermediate & Advanced SEO | digisavvy