Screaming Frog tool left me stumped
-
Hi there again,
I found a major cloaking hack on our client's website that is really well camouflaged, and none of the SEO tools I tried for checking cloaking could find it.
I know that Screaming Frog is a great tool and I want to use it for this; however, I can't seem to find my way around the program I downloaded.
Can you help me with the Screaming Frog program? Do you know where I can run a full-site check for cloaking? Maybe there are more links that I wasn't notified about.
I would really appreciate it if you could help me with that.
Thanks so much, Ruchy
-
Hi Ruchy,
That's fantastic; DeepCrawl is a great tool.
I was talking about two different types of cloaking, one of which uses a URL with a CNAME for hidden cloaking, e.g. https://www.rebrandly.com/
Thank you for the kind words. Marking Good Answers and giving thumbs up or thumbs down definitely does make a difference; the system is outlined at the official MozPoints URL: http://moz.com/community/mozpoints
Giving a thumbs up tells somebody they have helped you and that their response was valuable and maybe contributed to answering your question.
Marking a Good Answer means that person has actually answered your question; both it and a thumbs up are ways of saying thank you.
Welcome to the Moz community, and I look forward to seeing you here.
All the best,
Tom
-
Hi Tom
We finally solved the issue. Thanks for your clear answers; the info really helped me. We bought DeepCrawl and it really was helpful.
What was the CNAME you were talking about?
Thank you, Ruchy
P.S. I marked the wrong answer as the Good Answer; does it make a difference?
-
Hey, just wanted to check: was any answer helpful to you?
Is there anything we might have missed, or anything that kept you from solving the issue?
All the best,
Tom
-
Thanks, Andy.
That was a helpful piece of information.
I am still a little confused; can you give me some further advice?
Is there any way for me to do a full-site cloaking check?
Ruchy
-
Hi Ruchy,
Screaming Frog is an excellent tool, and Dan is a great guy to update you on that.
In my experience, I have been able to find this kind of information using https://www.deepcrawl.com/
The cost is higher, and DeepCrawl is also a hosted platform.
Still, it can do remarkable things that no other tool that I know of can do.
There are tools built just for checking cloaking; this one is free:
http://www.seotools.com/seo-cloaking-checker/
If you have access to the DNS, you will want to examine it thoroughly.
I have also laid out how you would find CNAME DNS records if you do not have access to the DNS.
One other method you might have to use to determine whether or not a domain or subdomain has been cloaked is shown here: http://www.ghosturl.com/ (that applies if the cloaking happens in the manner described at that URL).
So the scenario would be that the cloaked URL is something like
http://www.cloaked.example.com/?=ku2b4B30ijbasT47720sb534Nbq6
(please know I used example.com as the domain because I did not want to point at any live site, not because it has anything to do with cloaking).
In that case they are most likely using a CNAME inside the DNS to point it to http://www.example.com
You will need to use a tool that can look at the IP or CNAME records created in the DNS.
The free tool I know of that can look for a CNAME or A record that should not be there is https://www.cloudflare.com. The way you would do this is to run the site through the DNS wizard, which pulls the current DNS including all the records (it can miss some, but it does an excellent job of catching most DNS records; 9 times out of 10 it will get all of them).
Because most tools make it impossible to see the CNAME when looking at DNS, it is important to remember that this free service, while not designed for this purpose, can be used to discover the rogue cloaked URL:
https://www.cloudflare.com/a/sign-up (the free tier offers the same ability to get DNS information)
There is also a small DNS-lookup sketch after the screenshots below, if you prefer to query the records yourself.
http://i.imgur.com/Wm7W0aM.mp4
https://cl.ly/0A3k3d2j401n/Screen%20Recording%202017-01-26%20at%2007.09%20AM.mov
http://i.imgur.com/lIN9Gsm.png
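If you want to query the DNS records yourself instead of using a web wizard, here is a minimal sketch in Python. It assumes the dnspython package is installed (pip install dnspython), and the hostname is the same placeholder as above, not a real site:

```python
# A minimal sketch, assuming the dnspython package is installed.
# It checks whether a host resolves through a CNAME you did not
# create, or to an A record you do not recognize.
import dns.resolver

HOST = "www.cloaked.example.com"  # placeholder for the suspicious hostname

for record_type in ("CNAME", "A"):
    try:
        answers = dns.resolver.resolve(HOST, record_type)
        for rdata in answers:
            print(f"{HOST} {record_type} -> {rdata}")
    except (dns.resolver.NoAnswer, dns.resolver.NXDOMAIN):
        print(f"{HOST}: no {record_type} record found")
```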
Hope this helps,
Tom
-
Dan from Screaming Frog has just had a look at this, and although he is on the move at the moment, he said this...
On the road to a meeting, but they can switch user-agent & change to JavaScript rendering mode to test for cloaking. Depends how it's cloaked, though. Some may do via referrer which you can't test for within the Spider (yet!). Will reply a bit later
-Andy
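To make Dan's idea concrete, here is a rough sketch of a header-switching cloaking test in Python. It assumes the requests library and uses a placeholder URL; it covers the user-agent and referrer cases but not JavaScript rendering, which Screaming Frog handles for you:

```python
# A rough sketch of a user-agent / referrer cloaking test.
# Assumptions: the requests library is installed and the URL is a
# placeholder, not a real target.
import hashlib

import requests

URL = "https://www.example.com/"  # placeholder

PROFILES = {
    "browser": {
        "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    },
    "googlebot": {
        "User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; "
                      "+http://www.google.com/bot.html)",
    },
    # The referrer-based case Dan mentions, which the Spider can't test yet.
    "google-referrer": {
        "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
        "Referer": "https://www.google.com/",
    },
}

digests = {}
for name, headers in PROFILES.items():
    body = requests.get(URL, headers=headers, timeout=15).text
    digests[name] = hashlib.sha256(body.encode("utf-8")).hexdigest()
    print(f"{name}: {len(body)} bytes, sha256 {digests[name][:12]}")

# Differing hashes mean the server returned different content to
# different visitors. Beware of false positives: dynamic pages can
# differ between any two fetches (timestamps, session tokens, ads).
if len(set(digests.values())) > 1:
    print("Responses differ between profiles; worth a closer look.")
```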
-
Hi there
You can spider the site with Screaming Frog, then do a bulk export of all the outlinks into a CSV and run through it to see if there are links in there that shouldn't be (a small sketch of that filtering step follows below).
Have you downloaded the latest version?
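A rough sketch of that run-through in Python, assuming the bulk export is saved as outlinks.csv with a Destination column (check the headers in your own export) and that any destination off the client's own domain deserves a look:

```python
# A rough sketch for filtering a Screaming Frog outlinks export.
# Assumptions: the CSV is saved as outlinks.csv and has a "Destination"
# column; OWN_DOMAIN is a placeholder for the client's domain.
import csv
from urllib.parse import urlparse

OWN_DOMAIN = "example.com"  # placeholder

external_hosts = set()
with open("outlinks.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        host = urlparse(row["Destination"]).netloc.lower()
        # Collect every outlink destination not on the client's domain.
        if host and not (host == OWN_DOMAIN or host.endswith("." + OWN_DOMAIN)):
            external_hosts.add(host)

# Review this list for domains that shouldn't be linked from the site.
for host in sorted(external_hosts):
    print(host)
```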