
Recon & Expanding Attack Surface

Run recon while doing manual testing in parallel. Don't wait for tools to finish before hacking.

Pre-Hack Research

Before touching the site:

  1. Search Google for disclosed bugs: "domain.com" vulnerability
  2. Check HackerOne hacktivity and OpenBugBounty for past findings
  3. Read disclosed reports – they create leads and show what bypasses worked
  4. Sometimes you can bypass old “fixed” bugs
  5. Identify the tech stack: Wappalyzer, BuiltWith, retire.js
  6. Check for known CVEs on identified platforms
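The search steps above can be scripted so the same queries run consistently against every new target. A minimal sketch; "domain.com" is a placeholder and the keyword list is illustrative:

```shell
# Sketch: print the pre-hack search queries for a target.
# Paste each line into Google (or the relevant site's search).
domain="domain.com"
{
  printf '"%s" vulnerability disclosed\n' "$domain"
  printf 'site:hackerone.com "%s"\n' "$domain"
  printf 'site:openbugbounty.org "%s"\n' "$domain"
} > prehack-queries.txt
cat prehack-queries.txt
```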

Subdomain Enumeration

amass enum -brute -active -d domain.com -o amass-output.txt
subfinder -d domain.com -o subs.txt
cat amass-output.txt subs.txt | sort -u | tee all-subs.txt
cat all-subs.txt | httprobe -p http:81 -p http:3000 -p https:8443 -c 50 | tee online.txt
cat online.txt | aquatone
cat all-subs.txt | dnsgen - | httprobe   # permutation discovery

What to Look for in Subdomains

  • Functionality: login pages, upload features, APIs
  • Keywords: dev, qa, staging, prod, admin, internal
  • Third-party controlled domains (careers.target.com)
  • Different country TLDs – different codebases, different bugs
  • Forgotten servers, abandoned projects
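A quick grep over the enumeration output surfaces the keyword matches above. A sketch; all-subs.txt normally comes from the amass/subfinder merge, but a small sample is inlined here so the snippet stands alone:

```shell
# Sketch: flag high-value hostnames in subdomain enumeration output.
cat > all-subs.txt <<'EOF'
www.domain.com
dev.domain.com
qa-api.domain.com
careers.domain.com
staging.domain.com
EOF
grep -E '(^|[.-])(dev|qa|staging|prod|admin|internal)([.-]|$)' all-subs.txt > interesting.txt
cat interesting.txt
```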

Google Dorking

site:target.com inurl:& -movies
site:target.com inurl:register inurl:&
site:target.com inurl:login
site:target.com ext:php | ext:aspx | ext:jsp | ext:txt | ext:xml | ext:bak
  • Use -keyword to exclude noise
  • Scroll to last page, click “repeat the search with the omitted results included”
  • Check with mobile user-agent – Google may return different results

GitHub / Shodan Dorking

  • “domain.com” + api_secret, api_key, apiKey, password, admin_password
  • Check employee repos and forked projects
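The secret-keyword dorks above can be generated per target instead of typed by hand. A sketch; the keyword list mirrors the bullets and is not exhaustive:

```shell
# Sketch: emit GitHub code-search queries for common secret keywords.
domain="domain.com"
for kw in api_secret api_key apiKey password admin_password; do
  printf '"%s" %s\n' "$domain" "$kw"
done > github-dorks.txt
cat github-dorks.txt
```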

robots.txt & Historical Files

# Scan robots.txt on every subdomain via Burp Intruder
# Historical URLs
gau target.com | sort -u > gau-urls.txt
waybackurls target.com | sort -u > wb-urls.txt
  • Old robots.txt via Wayback Machine – forgotten endpoints often still live
  • Look for backup files: .bak, .old, .zip
  • Check for exposed git repos

Directory & File Brute Force

ffuf -ac -v -u https://domain.com/FUZZ -w /usr/share/seclists/Discovery/Web-Content/raft-large-words.txt
  • Start with /admin, /server-status, then expand
  • After finding 401s, fuzz inside them – broken access control
  • Dork for file extensions: php, aspx, jsp, txt, xml, bak
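Fuzzing inside a 401 looks like the sketch below. The seed paths are illustrative; in practice feed ffuf a full seclists wordlist:

```shell
# Sketch: seed a wordlist for fuzzing beneath a directory that returned 401,
# then run something like:
#   ffuf -u https://domain.com/admin/FUZZ -w admin-paths.txt -mc all -fc 404
printf '%s\n' users config logs backup export api > admin-paths.txt
wc -l < admin-paths.txt
```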

JS File Analysis

gau target.com | grep "\.js$" | tee js-files.txt
# Use linkfinder to extract endpoints
python3 linkfinder.py -i https://target.com/app.js -o cli
  • Look for API endpoints, developer comments, hidden parameters, API keys
  • Monitor JS files daily for changes – new features before release
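Daily monitoring can be as simple as hashing each file and comparing against the last run. A cron-friendly sketch; the real fetch is commented out and replaced with stand-in content so the snippet runs offline:

```shell
# Sketch: detect JS changes by content hash (loop over js-files.txt in practice).
mkdir -p js-hashes
f="app.js"
# curl -s "https://target.com/$f" -o "current-$f"   # real fetch
echo 'var x = 1;' > "current-$f"                    # stand-in content
new=$(sha256sum "current-$f" | cut -d' ' -f1)
old=$(cat "js-hashes/$f.sha" 2>/dev/null || true)
if [ "$new" != "$old" ]; then
  echo "CHANGED: $f"
  echo "$new" > "js-hashes/$f.sha"
fi
```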

Parameter Discovery

  • Use ParamSpider / Arjun on discovered endpoints
  • Scrape input names and IDs from HTML
  • Look for var name = "" patterns in JS
  • Test discovered params across all endpoints: /endpoint?param1=xss&param2=xss
  • Don't forget GET vs POST – always test both
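Scraping input names from HTML is a one-liner with grep. A sketch; page.html is inlined sample data, but normally you would save responses via Burp or curl first:

```shell
# Sketch: scrape candidate parameter names from saved HTML.
cat > page.html <<'EOF'
<input name="redirect_url" id="redir"><input name="user_id">
<script>var debugMode = "";</script>
EOF
grep -oE 'name="[^"]+"' page.html | cut -d'"' -f2 | sort -u > params.txt
cat params.txt
```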

Cert Transparency Monitoring

curl -s "https://api.certspotter.com/v1/issuances?domain=domain.com&include_subdomains=true&expand=dns_names"
  • Use certspotter/sslmate to catch new subdomains as they're created

Custom Wordlists

  • Build per-target endpoints.txt and params.txt as you discover them
  • Merge across subdomains into global-endpoints.txt
  • Use commonspeak for tech-specific terms
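Merging the per-target lists is a sort -u away. A sketch with illustrative file names:

```shell
# Sketch: merge per-subdomain endpoint lists into a deduped global list.
printf '%s\n' /login /api/v1/users > sub1-endpoints.txt
printf '%s\n' /api/v1/users /upload > sub2-endpoints.txt
cat sub1-endpoints.txt sub2-endpoints.txt | sort -u > global-endpoints.txt
wc -l < global-endpoints.txt
```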

zseano/recon.txt · Last modified: by drew
