Description
Hakrawler is a simple, fast web crawler written in Go, designed for quick discovery of endpoints and assets within a web application. It reads URLs from stdin, crawls each one, and outputs discovered links, script sources, and form actions.
Installation
BASH
go install github.com/hakluke/hakrawler@latest
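`go install` places the compiled binary in `$GOPATH/bin` (by default `~/go/bin`). If your shell cannot find `hakrawler` after installing, that directory is likely missing from your PATH; a minimal fix:

```shell
# go install drops the binary in $GOPATH/bin (default: ~/go/bin);
# add that directory to PATH if your shell can't find hakrawler
export PATH="$PATH:${GOPATH:-$HOME/go}/bin"
```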
Basic Usage
BASH
# Basic crawl
echo "https://target.com" | hakrawler
# Set crawl depth (default is 2)
echo "https://target.com" | hakrawler -d 3
Advanced Usage
BASH
# Show where each URL was found (href, script, form, etc.)
echo "https://target.com" | hakrawler -s
# Include subdomains
echo "https://target.com" | hakrawler -subs
# Custom User-Agent via the header flag (separate multiple headers with ";;")
echo "https://target.com" | hakrawler -h "User-Agent: Mozilla/5.0"
# Show only unique URLs (plain URL output is the default)
echo "https://target.com" | hakrawler -u
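When crawling with `-subs`, output can include third-party hosts pulled in via script tags, so results are often filtered back down to scope. A grep sketch, shown offline on sample URLs (`target.com` is a placeholder for your target):

```shell
# Keep only URLs on target.com or its subdomains; the printf lines
# stand in for real hakrawler output
printf '%s\n' \
  'https://target.com/login' \
  'https://cdn.jsdelivr.net/lib.js' \
  'https://api.target.com/v1/users' |
  grep -E '^https?://([a-z0-9-]+\.)*target\.com(/|$)'
```

In a real run you would pipe hakrawler's output straight into the same `grep -E` filter.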
Common Workflows
BASH
# Pipe alive hosts to crawler
cat alive.txt | hakrawler -d 2 -u | sort -u > crawled-urls.txt
# Full recon: subdomains → alive → crawl
subfinder -d target.com -silent | httpx -silent | hakrawler | sort -u
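Crawl output is usually triaged further, for example by pulling out JavaScript files for later endpoint and secrets review. A sketch using grep on a sample list (the URLs stand in for a real crawled-urls.txt):

```shell
# Extract .js URLs from crawl output for later review;
# the printf lines stand in for crawled-urls.txt
printf '%s\n' \
  'https://target.com/app.js' \
  'https://target.com/login' \
  'https://target.com/static/main.js?v=2' |
  grep -E '\.js(\?|$)' > js-urls.txt
cat js-urls.txt
```

The `(\?|$)` alternation keeps URLs with query strings (e.g. cache-busting `?v=2`) while excluding paths that merely contain `.js` mid-string.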