No Follow for Social Media Buttons?
-
Our website's social media buttons for Facebook, Twitter, LinkedIn and Google+ are located in the footer of all pages. These links are set to "nofollow".
Running a SEMrush audit shows these "nofollow" links coming up as an "issue". Is it best practice to set these links to social media sites as "follow" rather than "nofollow"? I am somewhat concerned about losing link juice, but perhaps that is an outdated point of view.
Any thoughts??
Thanks, Alan
-
Honestly speaking, if it were some other website I would have encouraged you to set a do-follow tag, but as they are social media plugins, there is no problem with that!
I personally think either way, follow or nofollow, should work fine, and it shouldn't affect your SERP positions.
Just a Thought!
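For reference, the difference is just the rel attribute on the footer links; a sketch with a hypothetical profile URL:

```html
<!-- current, nofollowed footer button -->
<a href="https://www.facebook.com/yourpage" rel="nofollow">Facebook</a>

<!-- followed version -->
<a href="https://www.facebook.com/yourpage">Facebook</a>
```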
-
Alan, it is an outdated mindset.
It is good to supply external links to the authoritative social portals that belong to your website.
/Chenzo
Related Questions
-
Anchor Text vs. Button Links
Hi, how important are anchor text links within your own site vs. buttons for SEO? We've redesigned some of our pages from anchor text links to buttons, which are just clickable images. I know historically this isn't the best way, but is it still as important as it used to be?
Intermediate & Advanced SEO | BeckyKey
-
How to handle potentially thousands (50k+) of 301 redirects following a major site replacement
We are looking for the very best way of handling potentially thousands (50k+) of 301 redirects following a major site replacement, and I mean total replacement. Things you should know:
The existing domain has 17 years' history with Google, but rankings have suffered over the past year, and yes, we know why (and the bitch is we paid a good-sized SEO company for that ineffective and destructive work).
The URL structure of the new site is completely different and SEO-friendly URLs rule. This means that there will be many thousands of historical URLs (mainly dynamic ones) that will attract 404 errors as they will not exist anymore. Most are product profile pages, and the God Google has indexed them all. There are also many links to them out there.
The new site is fully SEO optimised and is passing all tests so far - however, there is a way to go yet. So here are my thoughts on the possible ways of meeting our need:
1: Create 301 redirects for each and every page in the .htaccess file - that would be one huge .htaccess file, 50,000 lines plus. I am worried about the effect on site speed.
2: Create 301 redirects for each and every unused folder, and wildcard the file names; this would be a single redirect for each file in each folder to a single redirect page, so the 404 issue is overcome but the user doesn't open the precise page they are after.
3: Write some code to create a hard copy 301 index.php file for each and every folder that is to be replaced.
4: Write code to create a hard copy 301 .php file for each and every page that is to be replaced.
5: We could just let the pages all die and list them with Google to advise of their death.
6: We could have the redirects managed by a database rather than .htaccess or single redirect files (see the sketch below). Probably the most challenging thing will be to load the data in the first place, but I assume this could be done programmatically - especially if the new URL can be inferred from the old. Maybe I am missing another, simpler approach - please discuss.
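A minimal sketch of option 6 on Apache, assuming the old-to-new pairs are exported from the database into a text map (note that RewriteMap must live in the server or vhost config, not in .htaccess; all paths and URLs here are hypothetical):

```
# redirects.txt - one "old-path new-url" pair per line, exported from the database.
# Map keys include the leading slash. Old dynamic URLs with query strings need
# separate %{QUERY_STRING} conditions; plain paths work as below.
#
# Compile the text map into a fast DBM hash:
#   httxt2dbm -i redirects.txt -o redirects.map

RewriteEngine On
RewriteMap redirects "dbm:/etc/apache2/redirects.map"

# Issue a 301 only when the requested path has an entry in the map
RewriteCond ${redirects:$1|NONE} !=NONE
RewriteRule ^(.*)$ ${redirects:$1} [R=301,L]
```

Lookups against the DBM hash are effectively constant-time, so 50k entries should not carry the per-request cost of scanning a 50,000-line .htaccess file.
Intermediate & Advanced SEO | GeezerG
-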
Canonicals, Social Signals and Multi-Regional website
Hi all, I have a website that is set up to target different countries by using subfolders, for example /aus/, /us/, /nz/. The homepage itself is just a landing page that redirects to whichever country the user belongs to. For example, somebody accesses https://domain/ and will be redirected to one of the country-specific subfolders. The default subfolder is /us/, so all users will be redirected to it if their country has not been set up on the website. The content is mostly the same on each country site apart from localisation and, in some cases, content specific to that country.

I have set up each country subfolder as a separate site in Search Console and targeted /aus/ to AU users and /nz/ to NZ users. I've left the /us/ version un-targeted to any specific geographical region. In addition to this I've also set up hreflang tags for each page on the site which link to the same content on the other country subfolders. I've targeted /aus/ and /nz/ to en-au and en-nz respectively, and targeted /us/ to en-us and x-default, as per various articles around the web.

We generally advertise our links without a country code prefix, and the system will automatically redirect the user to the correct country when they hit that URL. For example, somebody accesses https://domain/blog/my-post/ and a 302 will be issued for https://domain/aus/blog/my-post/ or https://domain/us/blog/my-post/ etc. The country-less links are advertised on Facebook and in all our marketing campaigns.

Overall, I feel our website is ranking quite poorly and I'm wondering if poor social signals are a part of it? We have a decent social following on Facebook (65k) and post regular blog posts to our Facebook page that tend to pique quite a bit of interest. I would have expected that this would contribute to our ranking at least somewhat?

I am wondering whether the country-less link we advertise on Facebook would be causing Googlebot to ignore it as a social signal for the country-specific pages on our website. For example, Googlebot indexes https://domain/us/blog/my-post/ and looks for social signals for https://domain/us/blog/my-post/ specifically; however, it doesn't pick up anything because the campaign URL we use is https://domain/blog/my-post/. If that is the case, I am wondering how I would fix that, to receive the appropriate social signals for /us/blog/my-post/, /aus/blog/my-post/ & /nz/blog/my-post/.

I am wondering if changing the canonical URL to the country-less URL of each page would improve my social signals and performance in the search engines overall. I would be interested to hear your feedback. Thanks
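For reference, the hreflang setup described would look something like this in the head of https://domain/us/blog/my-post/ (a sketch using the placeholder domain from the question; each country version needs the matching set of return links):

```html
<link rel="alternate" hreflang="en-au" href="https://domain/aus/blog/my-post/" />
<link rel="alternate" hreflang="en-nz" href="https://domain/nz/blog/my-post/" />
<link rel="alternate" hreflang="en-us" href="https://domain/us/blog/my-post/" />
<!-- x-default sends users from unmatched regions to the /us/ version -->
<link rel="alternate" hreflang="x-default" href="https://domain/us/blog/my-post/" />
```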
Intermediate & Advanced SEO | destinyrescue
-
Rel=next/prev for paginated pages - then no need for "noindex, follow"?
I have a real estate website and use rel=next/prev for paginated real estate result pages. I understand "noindex, follow" is not needed for the paginated pages. However, my case is a bit unique: this is a real estate site where the listings also show on competitors' sites. So, I thought, if I "noindex, follow" the paginated pages, that would reduce the amount of duplicate content on my site and ultimately support my site ranking well. Again, I understand "noindex, follow" is not needed for paginated pages when using rel=next/prev, but since my content will probably be considered fairly duplicate, I question if I should do it anyway.
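For reference, the combination being discussed would look something like this on page 2 of a results series (hypothetical URLs):

```html
<!-- in the <head> of /listings/page/2/ -->
<link rel="prev" href="https://www.example.com/listings/page/1/" />
<link rel="next" href="https://www.example.com/listings/page/3/" />
<meta name="robots" content="noindex, follow" />
```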
Intermediate & Advanced SEO | khi5
-
Does Google make continued attempts to crawl an old page once it has followed a 301 to the new page?
I am curious about this for a couple of reasons. We have all dealt with a site that switched platforms and didn't plan properly and now has 1,000's of crawl errors. Many of the developers I have talked to have stated very clearly that the .htaccess file should not be used for 1,000's of single redirects. I figured if I only needed them in there temporarily it wouldn't be an issue. I am curious: once Google follows a 301 from an old page to a new page, will they stop crawling the old page?
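For context, the per-URL approach those developers warn against looks like this in .htaccess (hypothetical paths), repeated thousands of times - every request has to be checked against the whole list:

```
Redirect 301 /old-page.html /new-page/
Redirect 301 /old-product.php /products/new-product/
# ...thousands more lines like these
```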
Intermediate & Advanced SEO | RossFruin
-
Follow or nofollow to subdomain
Hi, I run a hotel booking site and the booking engine is set up on a subdomain. The subdomain is disabled from being indexed in robots.txt. Should the links from the main domain have a nofollow to the subdomain? What are your thoughts? Thanks!
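For reference, a nofollowed cross-link to the booking engine would look like this (hypothetical subdomain URL). Note that robots.txt on the subdomain already blocks crawling, so the rel attribute mainly governs how link equity flows:

```html
<a href="https://booking.example.com/search" rel="nofollow">Book now</a>
```
Intermediate & Advanced SEO | vmotuz
-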
Is this link follow or nofollow? Does it pass linkjuice?
I have been seeing conflicting opinions about how Google would treat links using 'onclick'. For the example provided below: would Google follow this link and pass the appropriate linking metrics (it is internal and points to a deeper level in our visnav)?

=-=-=-=-=-=-=

```
<div id="navBoxContainer" class="textClass">
  <div id="boxTitle" onclick="location.href='blah.example.com'">
    <div class="boxTitleContent" title="Text Here">
      <a href="blah.example.com">Text Here</a>
    </div>
  </div>
</div>
```

=-=-=-=-=-=-=

A simple yes/no would be alright, but any detail/explanation you could provide would be helpful and very much appreciated. Thank you all for your time and responses.
Intermediate & Advanced SEO | TLM
-
How can I check if the FOLLOW,NOINDEX tag is working?
Hi everyone! After reading about pagination practices, a few days ago we introduced the <meta name="robots" content="FOLLOW,NOINDEX" /> tag to prevent duplicate content. You can find an example here: http://www.inmonova.com/en/properties?page=2 I have been checking Yahoo Site Explorer and result pages still get indexed. My question is: am I doing something wrong? Is the code incorrect (follow,noindex vs. noindex,follow)? Or does it just take some time to have effect? Thanks in advance.
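One quick way to verify the tag is actually being served is to fetch the raw HTML (a sketch; any HTTP client works). The directive order and case in the content attribute don't matter, so FOLLOW,NOINDEX and noindex,follow are equivalent:

```
# fetch the page and look for the robots meta tag in the served HTML
curl -s "http://www.inmonova.com/en/properties?page=2" | grep -i 'name="robots"'
```

If the tag shows up there, de-indexing usually just takes time while crawlers revisit the pages.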
Intermediate & Advanced SEO | inmonova