crithit

Takes a single wordlist item and tests it one by one over a large collection of websites.

Website directory and file brute forcing at extreme scale.

CritHit takes a single wordlist item and tests it against a large collection of hosts before moving on to the next wordlist item. Brute forcing in this manner is intended to avoid low-threshold Web Application Firewall (WAF) bans and to allow the scan to run faster than it normally would when any single host is approached with multiple simultaneous requests.
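As a rough illustration of that ordering, the sketch below is a minimal Python example, not CritHit's implementation; the hosts, wordlist, timeout, and status-code check are all placeholder assumptions. The outer loop walks the wordlist and the inner loop walks the hosts, so each host sees only one request per wordlist item rather than a burst of simultaneous requests.

```python
# Minimal sketch of the "one wordlist item across all hosts" ordering.
# Hosts, wordlist entries, and the response check are illustrative only.
import requests

hosts = ["https://example.com", "https://example.org"]  # large target list
wordlist = ["admin", ".git/config", "backup.zip"]        # wordlist items

for word in wordlist:            # outer loop: one wordlist item at a time
    for host in hosts:           # inner loop: every host sees that item once
        url = f"{host.rstrip('/')}/{word}"
        try:
            resp = requests.get(url, timeout=5, allow_redirects=False)
        except requests.RequestException:
            continue             # unreachable host, move on
        if resp.status_code != 404:
            print(f"[{resp.status_code}] {url}")
```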

CritHit can verify results multiple times using proxy lists, as well as filter out noise by baselining websites. Additionally, if you are looking for a specific item across a large number of websites (to cross-compare a vulnerability over more hosts), you can build and use --signatures to write only hosts containing specific data points to an output file.
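To make the baselining and signature ideas concrete, here is a small sketch under assumed details (plain Python, not CritHit's internals; the signature strings, length threshold, probe path, and helper names are invented for illustration). A host's "not found" response is sampled first, responses that look like that baseline are discarded as noise, and only hits whose body contains one of the signature strings are kept.

```python
# Conceptual sketch of baselining plus signature matching.
# Signatures, threshold, and probe path are assumptions, not CritHit's values.
import requests

SIGNATURES = ["Index of /", "APP_KEY="]  # example data points to look for

def baseline(host):
    """Request a path that should not exist to learn the host's 'miss' shape."""
    resp = requests.get(f"{host}/definitely-not-here-12345", timeout=5)
    return resp.status_code, len(resp.content)

def signature_hit(host, path, base):
    """True only if the response differs from the baseline and matches a signature."""
    resp = requests.get(f"{host}/{path}", timeout=5)
    base_status, base_length = base
    if resp.status_code == base_status and abs(len(resp.content) - base_length) < 50:
        return False  # looks like the host's normal "not found" page: treat as noise
    return any(sig in resp.text for sig in SIGNATURES)
```

A hit that survives this kind of filter could then be re-requested, for example through entries in a proxy list, which mirrors the multiple-verification behaviour described above.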

CritHit gives the best results when used as a quick "first pass": run a small wordlist (around 100 critical items) against a very large target list, then deep dive more directly with a project such as ffuf wherever results are found.