Can you use Screaming Frog to find all instances of relative or absolute linking?
-
My client wants to pull every instance of an absolute URL on their site so that they can update them for an upcoming migration to HTTPS (the majority of the site uses relative linking). Is there a way to use the extraction tool in Screaming Frog to crawl one page at a time and extract every occurrence of href="http://"?
I have gone back and forth between using an x-path extractor as well as a regex and have had no luck with either.
Ex. X-path: //*[starts-with(@href, "http://")][1]
Ex. Regex: href=\"//
-
This only works if you have downloaded all the HTML files to your local computer. That said, it works quite well! I am betting this is a database-driven site, though, so it would not work in the same way.
-
Regex: href=("|'|)(?:http:(?:/{1,3}|[a-z0-9%])|[a-z0-9.-]+[.](?:com|net|org))
This allows the link to have " or ' or nothing between the = and the http. If you have any other TLDs, you can just keep expanding the (com|net|org) alternation with more |.
I modified this from a Gist on GitHub: https://gist.github.com/gruber/8891611
You can play with tools like http://regexpal.com/ to test your regex against example text.
I assumed you would want the full URL and that was the issue you were running into.
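If you want to sanity-check the pattern outside of Screaming Frog first, here is a minimal sketch in Python (the sample markup is just an illustration, not from the client's site):

    import re

    # The http: branch of the pattern above, for sanity-checking against sample markup.
    pattern = re.compile(r'''href=("|'|)http:(?:/{1,3}|[a-z0-9%])''', re.IGNORECASE)

    sample = '<a href="http://www.example.com/">absolute</a> <a href="/page">relative</a>'

    # Only the absolute link should match; widen the pattern if you need the full URL.
    for match in pattern.finditer(sample):
        print(match.group(0))  # prints: href="http://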
As another solution, why not just fix the links to HTTPS in the main navigation etc.? Then, once you get the staging/testing site set up, run Screaming Frog on that site, find all the 301 redirects or 404s, and use that report to find all the URLs to fix.
I would also ping Screaming Frog - this is not the first time they have been asked this question. They may have a better regex and/or solution than what I have suggested.
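For what it's worth, Screaming Frog's custom extraction also accepts XPath, and something along these lines should return every absolute href on a page rather than just the first element (a sketch, assuming the links are ordinary anchor tags):

    //a[starts-with(@href, 'http://')]/@href

The [1] in the XPath from the question limits it to the first match, and selecting the element rather than @href returns the tag instead of the URL.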
-
Depending on how you've coded everything, you could try setting up a Custom Search under Configuration. It scans the HTML of each page, so if the coding is consistent you could enter something like href="http://www.yourdomain.com" as the string it's looking for, and the Custom tab will then show you every crawled page that matches the string.
That's the only way I can think of to get Screaming Frog to pull it, but I'm looking forward to anyone else's thoughts.
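If the markup is not perfectly consistent, it may be worth adding one Custom Search filter per variant; for example (www.yourdomain.com is a placeholder):

    href="http://www.yourdomain.com
    href='http://www.yourdomain.com
    href=http://www.yourdomain.com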
-
If you have access to all the website's files, you could try finding all instances across the directory using something like Notepad++. You could even use Find and Replace.
This is how I tend to locate those one-liners among hundreds of files.
Good luck!
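If the files are local, a small script can do the same job as Notepad++'s Find in Files; a minimal sketch in Python, assuming the site's HTML has been downloaded into a folder named site_files (a placeholder):

    import os
    import re

    pattern = re.compile(r'''href=("|'|)http://''', re.IGNORECASE)

    # Walk the downloaded site and report every line containing an absolute http: link.
    for root, _, files in os.walk("site_files"):
        for name in files:
            if name.endswith((".html", ".htm")):
                path = os.path.join(root, name)
                with open(path, encoding="utf-8", errors="ignore") as f:
                    for lineno, line in enumerate(f, 1):
                        if pattern.search(line):
                            print(f"{path}:{lineno}: {line.strip()}")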
Related Questions
-
Can I use a 301 redirect to pass 'back link' juice to a different domain?
Hi, I have a backlink from a high DA/PA government website pointing to www.domainA.com, which I own and can set up 301 redirects on if necessary. However, www.domainA.com is not used and has no active website (but has hosting available which can 301 redirect). www.domainA.com is also contextually irrelevant to the backlink. I want the government website's link to go to www.domainB.com - which is both the relevant site and the one that should be benefiting from the SEO juice from the backlink. So far I have had no luck getting the government website's administrators to change the URL on the link to point to www.domainB.com. Q1: If I use a 301 redirect on www.domainA.com to redirect to www.domainB.com, will most of the backlink's SEO juice still be passed on to www.domainB.com? Q2: If the answer to the above is yes, would there be benefit in taking this a step further and redirecting www.domainA.com to a deeper directory on www.domainB.com which is even more relevant?
ie. redirect www.domainA.com to www.domainB.com/categoryB - passing the link juice deeper.
Technical SEO | | DGAU
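On the mechanics of Q1: if www.domainA.com sits on Apache hosting, a catch-all rule along these lines is the usual way to do the 301 (a sketch; it assumes mod_rewrite is enabled, and the domain names are from the question):

    # .htaccess on www.domainA.com: permanently redirect all requests to domainB
    RewriteEngine On
    RewriteRule ^(.*)$ http://www.domainB.com/$1 [R=301,L]

For Q2's deeper target, the same rule could point at http://www.domainB.com/categoryB/ instead.
-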
Canonical issues using Screaming Frog and other tools?
In the Directives tab within Screaming Frog, can anyone tell me what the difference between "canonicalised", "canonical", and "no canonical" means? They're found in the filter box. I see the data but am not sure how to interpret them. Which one of these would I check to find canonical issues within a website? Are there any other easy ways to identify canonical issues?
Technical SEO | | Flock.Media
-
We just can't figure out the right anchor text to use
We have been trying everything we can with anchor text. We have read here that we should try naturalistic language. Our competitors who are above us in Google search results don't do any of this. They only use their names or a single term like "austin web design". Is what we are doing hurting our listings? We don't have any black-hat links. Here's what we are doing now. We are going crazy trying to figure this out. We are afraid to do anything for fear it will damage our position. Bob
| pallasart web design | 31 | 1,730 |
| website by pallasart a texas web design company in austin | 15 | 1,526 |
| website by the austin design company pallasart | 14 | 1,525 |
| created by pallasart a web design company in austin texas | 13 | 1,528 |
| created by an austin web design company pallasart | 12 | 1,499 |
| website by pallasart web design an austin web design company | 12 | 1,389 |
| website by pallasart an austin web design company | 11 | 1,463 |
| pallasart austin web design | 9 | 2,717 |
| website created by pallasart a web design company in austin texas | 9 | 1,369 |
| website by pallasart | 8 | 910 |
| austin web design | 5 | 63 |
| pallasart website design austin |
Technical SEO | | pallasart
-
The use of robots.txt
Could someone please confirm that if I do not want to block any pages from my URL, then I do not need a robots.txt file on my site? Thanks
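For reference, a site that blocks nothing does not strictly need a robots.txt, though many sites keep a minimal permissive one so crawlers get a 200 for the file instead of a 404; it looks like this:

    User-agent: *
    Disallow: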
Technical SEO | | ICON_Malta
-
Hit hard by EMD update, used to be #1 now not in top 50, what can I do?
We have what I think is a pretty good site: unique articles, a few widgets, lots of reviews, decent enough bounce rates and user times (60% and 2:15), based on Drupal. Previous updates haven't touched us, and an almost identical duplicate (same site, completely different content) targeting a different but related EMD is unaffected, which provides a control. I have seen some discussion on it having to do with link profiles. We did pay some backlinkers to link to us, much more on the site that has dropped, and quite a few for a partial-match keyword. I'm supposing this is a lot of the issue. If we try to delete these backlinks, will it make the situation better or worse? I have also noticed some duplicate content warnings in SEOmoz that weren't there previously. Any ideas?
Technical SEO | | btrr69
-
Can I turn off Google site links?
I thought at one time I had turned off the option to have Google sitelinks. I did this so that each of our pages that had a strong presence would occupy a unique slot on the first and second page of Google. This was important to us as we were battling some reputation management issues and trying to push out negative listings from the front page. Recently I noticed sitelinks were back up, and when going into Google Webmaster Tools I couldn't figure out how to opt out of them. Any suggestions?
Technical SEO | | BRConsulting
-
HTTP 301 or link?
We have a page on a website (let's name it ABC) which ranks very well on Google for a specific keyword, but this keyword is not the main activity of website ABC. For this reason we created website XYZ to offer the services related to the specific keyword. How should we redirect the visitors from website ABC to website XYZ so that XYZ gets all the weight? Is it best to do an HTTP 301 from the specific page on site ABC, or to remove nearly all content related to the keyword from site ABC and create a link to website XYZ? Your advice is well appreciated.
Technical SEO | | netbuilder
-
External link optimization
The company I work for sells software online. We have deals with learning institutes that allow their students to use our software for next to nothing. These learning institutes, which usually have quite strong domains, link to our sign-in area. Nice way to get powerful links, hey… or is it? There are a couple of problems with these links: they all link to a subdomain (signin.domain.com), and the URLs also contain unique identifiers (so that we know which institute they are coming from), meaning they all link to different sign-in URLs (e.g. signin.domain.com/qwerty, signin.domain.com/qwerta, signin.domain.com/qwerts, etc.). So all these links aren't as effective as they could be (or at all?). In a perfect SEO world these links would all point to the start page; however, because our start page is set up commercially, this would run the risk of communicating the wrong idea to the institutes and their students. So… are there any extremely brilliant pro mozzers who have a savvy idea of how to set this up in a more SEO-friendly way? Thanks in advance!
Technical SEO | | henners