After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
Erica,
Excellent work! I got the same info as well.
All the best,
Thomas
Apparently the issue you are having is a big problem. I took the time to chat with Sucuri, and they confirmed that Facebook is having serious issues with sites that no longer contain malware but remain blacklisted.
[Note from Keri: I edited transcript format to make it easier to read, and kept it in just one response]
Thomas: Hi, do you guys help people who are blacklisted on Facebook? There is a person with a website they believe is clean, but Facebook keeps blacklisting it. Can you be of service?
Mark Z.: hello
Hi
Mark Z.: we would run our cleanup procedures and get it submitted to Facebook for review
Can you elaborate more? This person has tried to submit their site for review, and it keeps coming back blacklisted.
Mark Z.: We have tools that we would first run on the server and all of its site files. It's likely that Facebook is blacklisting the site continuously because there is an underlying infection. So once we take care of that, we will resubmit to Facebook.
They believe the website is clean (I have not tested this site); however, would you guarantee it would be removed if your service were used? I agree, I bet the site is still infected, but would this be part of your guarantee?
Mark Z.:yes, we guarantee to get it cleaned for a year, as we offer unlimited cleanups and scans during that timeframe
That it would be taken off the Facebook blacklist? I'm sorry, but this person only cares about removal from the Facebook blacklist, so that is why I have to ask if you guarantee that.
Mark Z.: double-checking for you whether Facebook removal is a service we can guarantee
one moment
cool thanks
Mark Z.: okay, I got a straight answer for you
cool
Mark Z.: we can't make a guarantee on Facebook in that respect
we have apparently cleaned a lot of Facebook-blacklisted sites, and sometimes they come back clean, and sometimes they don't. Facebook is apparently either neglecting submissions or evolving its methods, and sometimes keeps completely clean sites blacklisted, even after multiple submissions
just got those details from our administrator
Wow, good to know. Thank you. So what do you do, resubmit over and over?
Thank you for asking them
Mark Z.: That's really the only thing you could do at this point
And no problem
Mark, I really appreciate your time and your help. I will pass this on.
Thanks man
Mark Z.: my pleasure!
Have a good night bye
I would double-check your site anyway to make sure it is clean; apparently this is a huge issue.
Thomas
I have dealt with sites that were not banned by Facebook but were blacklisted by McAfee, which, to the best of my knowledge, is the same company that does the blacklisting for Facebook.
They also guarantee removal from some blacklists (not every blacklist, but the one that I believe is causing this issue):
http://sucuri.net/services/web-integrity-monitoring
The reason I stated that he should go on the site and ask them is that Sucuri has a pop-up question-and-answer bar on their website that can be used to say, "I have a problem with Facebook blacklisting; can you help me?"
Sucuri will check your site to make sure that the hack, or whatever malware was left behind, is truly gone. They can also remove you from blacklists: the two gentlemen I mentioned in my first post have a great deal of knowledge regarding blacklists and blacklist removal for sites that are no longer carrying malware. Speak to them, or use their software, after making sure that the blacklist behind Facebook's is indeed McAfee, which I know is what they use for general security; however, Facebook also has its own security team, and I know they use this list of companies as well. The Facebook security page is right here:
https://www.facebook.com/note.php?note_id=10150492832835766
McAfee, Google, Web of Trust, and Websense
https://community.mcafee.com/thread/63938
I apologize if my idea of actually asking them for help was that bad. I realize that this gentleman has seemingly gone through a lot of trouble with one blacklist, and Facebook does not care.
I would use the services of people who have gotten clients of mine off of blacklists many times and who know more about this issue than I do. When Intel acquired McAfee, it became, to the best of my knowledge, the main blacklist provider for Facebook, and Sucuri will make sure your site is not blacklisted on McAfee, so I would speak to them and ask if they can help you.
I apologize if I was unclear in my messages.
However, I do think taking to a social network and asking a large corporation like Facebook for help in a very public forum is a way to get attention if they are ignoring you.
Sincerely,
Thomas
Use social media to help yourself. Go on Twitter, Google+, and everything that connects to Facebook (like Instagram) and say: "@facebook #facebook will not stop blocking a clean site, and it is hurting my business." If it is your business and it is affecting your ability to make a living, and you have written to the correct channels at Facebook multiple times, I would use social media to try to get somebody from Facebook to help.
They should notice this; you can also post it on your own account on Facebook or in their security forum.
Be legitimate about your issue, and they should take it seriously. Here's more information I hope helps:
https://www.facebook.com/note.php?note_id=10150492832835766
http://www.youtube.com/watch?v=uo06i6_sdBw
You must be 100% legitimate about it; try to get somebody who will actually care and explain to you why you're in a continuous loop of being banned. It could be that you are simply blacklisted by a company you do not know about, which Facebook takes signals from. Sucuri can help with that type of thing as well. As for Facebook, by proving that your site is clean and that you are doing nothing malicious, they should fix this for you. However, I suggest that you contact Sucuri prior to purchasing their product and ask them; they have a tremendous amount of knowledge about blacklists, and I believe they can help you.
I wish you luck,
Thomas
Contact http://sucuri.net/ and ask for Dre or Tony.
They're specialists in blacklist removal:
http://sucuri.net/services/malware-removal
Hope this helps,
Thomas
Great info, Rikki.
That's good news!
Hi Antonio,
I would take a look at your entire site using one of my very favorite tools. This tool will crawl your site and tell you if you have nofollows or other issues that would cause Googlebot to have trouble indexing your site.
Simply put your site's URL in the box presented in the tool, which you can find at the link here:
http://www.feedthebot.com/tools/spider/
Then use the second tool, which displays the number of links (internal, external, nofollow, image, etc.) found on a webpage:
http://www.feedthebot.com/tools/linkcount/
Using the two URLs, you can then see if there is a nofollow that might be creating a real problem inside a page; you should be able to get to the bottom of this.
Check as much of your site as you possibly can with this, as it will show you a lot of information that is very relevant to whether your site can be crawled correctly or not.
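If you prefer to script this kind of check yourself, here is a minimal sketch in Python (standard library only; the sample HTML and the `example.com` hostname are made up for illustration) that counts internal, external, and nofollow links the way a link-count tool would:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkCounter(HTMLParser):
    """Counts internal, external, and nofollow links in page source."""
    def __init__(self, site_host):
        super().__init__()
        self.site_host = site_host
        self.counts = {"internal": 0, "external": 0, "nofollow": 0}

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        host = urlparse(attrs.get("href", "")).netloc
        # A link is external only if it points at a different host.
        if host and host != self.site_host:
            self.counts["external"] += 1
        else:
            self.counts["internal"] += 1
        if "nofollow" in (attrs.get("rel") or "").lower():
            self.counts["nofollow"] += 1

# Made-up page fragment for demonstration.
html = '<a href="/about">About</a><a href="http://other.com" rel="nofollow">Out</a>'
counter = LinkCounter("example.com")
counter.feed(html)
print(counter.counts)  # {'internal': 1, 'external': 1, 'nofollow': 1}
```

To audit a live page, feed it real source fetched with urllib.request instead of the sample string.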
This third tool will show you if your robots.txt file is still blocking all or part of your website. The nice thing about this tool is that it is built to generate robots.txt files; however, if you simply put your URL in the top and hit the upload button, it will pull your robots.txt file. This is very helpful when comparing changes that have been made, or changes you wish to make.
http://www.internetmarketingninjas.com/seo-tools/robots-txt-generator/
To check your robots.txt file against whatever could be blocking your site, I think these will help:
http://moz.com/blog/interactive-guide-to-robots-txt
http://moz.com/learn/seo/robotstxt
http://tools.seobook.com/robots-txt/
http://yoast.com/x-robots-tag-play/
https://developers.google.com/webmasters/control-crawl-index/docs/robots_meta_tag?hl=de
http://www.searchenginejournal.com/x-robots-tag-simple-alternate-robots-txt-meta-tag/67138/
One point I hope will help you is the easy-to-miss difference between allowing everything and disallowing everything: simply having a / after Disallow: will tell Google that you do not want to show up in its search engine results.
Simply put, as the information below shows, websites by default are set up with
Allow: /
Example Robots.txt Format
Allow indexing of everything
User-agent: *
Disallow:
or
User-agent: *
Allow: /
Disallow indexing of everything
User-agent: *
Disallow: /
Disallow indexing of a specific folder
User-agent: *
Disallow: /folder/
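To sanity-check rules like the examples above without waiting on a crawler, Python's standard library can parse a robots.txt and answer fetch questions directly. This is a small sketch using the "disallow a specific folder" example (example.com is a placeholder host):

```python
import urllib.robotparser

# Parse the "Disallow indexing of a specific folder" example from above.
rp = urllib.robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /folder/",
])

# URLs under /folder/ are blocked; everything else is allowed.
print(rp.can_fetch("*", "http://example.com/folder/page.html"))  # False
print(rp.can_fetch("*", "http://example.com/other/page.html"))   # True
```

For a live site you would instead call rp.set_url("http://yoursite.com/robots.txt") followed by rp.read().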
Please remember there are multiple ways to block a website. For instance, PHP-based websites are extremely popular, and if you're using WordPress or many other PHP-based platforms, a page can be blocked with a header such as:
header("X-Robots-Tag: noindex", true);
I want to remind you of what Tom Roberts said in the first response about using Twitter. I have quoted him here; however, you can read it at the top of the page below the first question:
The most frequently crawled domain on the web is Twitter. If you could legitimately get your key URLs tweeted, either by yourselves or others, this may encourage the Google crawler to revisit the URLs, and consequently re index them. There won't be any harm SEO wise in sending tweets with your URLs, it's a quick and free method and so may be worth giving it a shot
Hope This Helps,
Thomas
Sorry for all the posts; however, maybe this will help you as well to get rid of the dynamic URLs:
http://www.webconfs.com/url-rewriting-tool.php
Thomas
A great comment, and this is a good example of the type of difference changing the robots.txt file can make.
I would read all the information you can on it, as it seems to be constantly updating.
I used the info below as an example of a happy ending, but to see the problems, read all the stories you will find if you check out this link:
http://wordpress.org/support/topic/max-cpu-usage/page/2
CPU usage went from over 90% to less than 15%. Memory usage dropped by almost half, from 1.95 GB to 1.1 GB, including cache/buffers.
My setup is as follows:
Linode 2GB VPS
Nginx 1.41
Percona SQL Server using XtraDB
PHP-FPM 5.4 with APC caching db requests and opcode via W3 Total Cache
Wordpress 3.52
All in One Event Calendar 1.11
All the Best,
Thomas
I got the robots.txt file; I hope this will help you.
This is built into every GetFlywheel.com website (they are a managed WordPress-only hosting company). The reason they did this is the same reason Dan described above.
I'm not saying this is a perfect fix; however, after speaking with the founder of GetFlywheel, I know they place this in the robots.txt file of every website they host in order to try to get rid of the crawling issue.
This is an exact copy of a default robots.txt file from getflywheel.com:
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /calendar/action:posterboard/
Disallow: /calendar/action:agenda/
Disallow: /calendar/action:oneday/
Disallow: /calendar/action:month/
Disallow: /calendar/action:week/
Disallow: /calendar/action:map/
As found on a brand-new website. If you Google "Max CPU All in one calendar" you will see more about this issue.
I hope this is of help to you,
Thomas
PS
Here is what the maker of the All in One Event Calendar has listed on their site as a fix.
If you want more information: most Apache web servers have the following pages set as the default page for a directory:
default.htm, default.html, index.htm, index.html, index.php
If you are using WordPress, you could see index.php because WordPress uses PHP. Regardless of what your friend told you, ideally you should have properly 301-redirected static links, when talking about links that are going to be seen by Google.
This means WordPress websites ideally should not contain a publicly visible index file of any kind.
To cut to the chase, I have cited this response to Yoast from Matt Cutts, the head of Google's web spam team.
Referenced from http://yoast.com/wordpress-seo-url-permalink/
I emailed Matt and asked whether it makes sense to add .html for systems like WordPress. His response:
In general I wouldn’t. My WP has urls like http://www.mattcutts.com/blog/remove-result/ and that’s pretty ideal.
So. Case closed.
A good collection of resources is posted as links below this line.
Use Yoast: http://yoast.com/wordpress/seo/
It is an excellent source of WordPress knowledge. I strongly recommend the Yoast WordPress SEO tool; aside from being one of the very best WordPress plug-ins for enhancing your site, it also helps with permalink changes:
http://yoast.com/change-wordpress-permalink-structure/
http://yoast.com/wp-content/permalink-helper.php
If you need to make changes to your link structure, a great resource to understand redirects is the link under this line:
http://24ways.org/2013/url-rewriting-for-the-fearful/
If you need to redirect an index page, including when using Nginx:
http://moz.com/blog/htaccess-file-snippets-for-seos
http://codex.wordpress.org/Linking_Posts_Pages_and_Categories
Merry Christmas,
Ruben
Your friend is speaking about a 100% HTML site, and the answer is that your site's URLs should not end with
www.yoursite.com/index.htm, home.htm, or default.htm.
Using WordPress, your site's URLs should end in www.yoursite.com or www.yoursite.com/
Some people, when they bring their site over from a static HTML site to WordPress, may find permalink problems:
http://codex.wordpress.org/Using_Permalinks
If you do this, 301 redirect any links that would be changed by WordPress to the new permalink structure. If you feel like using one of the other structures, which I recommend against, you may; however, most sites are best served using the default settings.
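As a rough sketch of the 301s involved, assuming an Apache server with mod_rewrite enabled (the rule below is illustrative, not a drop-in fix; test it on a staging copy first), old static index pages can be permanently redirected to the clean root URL in .htaccess like this:

```apache
# Illustrative .htaccess fragment: 301-redirect old static index pages
# (index.htm, index.html, default.htm, home.htm) to the site root.
RewriteEngine On
RewriteRule ^(index|default|home)\.html?$ / [R=301,L]
```

The [R=301,L] flags make the redirect permanent and stop further rule processing, so search engines transfer the old pages' signals to the new URL.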
Sincerely,
Thomas