Introduction
Subdomains are where developers hide test apps, admin panels, staging APIs and forgotten services.
If you miss the right subdomain, you miss the bug.
This Part shows a practical, combined approach: start with passive sources, then validate and expand with active techniques.
We merge outputs, dedupe, resolve, and prioritise.
You get a clean inventory that is ready for JS harvesting, URL collection and parameter testing.
Why a merged approach works
Passive sources find many names without touching the target.
Active techniques validate and discover names that never appeared in public logs.
Merging both gives you breadth and accuracy.
This saves time when you move to manual review and fuzzing.
Find a subdomain once and you can build a chain from it. A small discovery often opens a big path.
What to check – short checklist
Passive sources – certificate transparency, public feeds, SecurityTrails, DNS archives.
Active sources – brute force, permutations, wordlist-based enumeration.
Resolution – confirm which names actually resolve.
Web response – check which resolved hosts return HTTP content.
Wildcard detection – identify and handle wildcard DNS responses.
Ownership and CNAMEs – map to Cloud providers and external services.
Collect everything, then merge, dedupe, and prioritise.
Tools you will use
- Passive: crt.sh, amass (passive), subfinder, SecurityTrails (API).
- Active: massdns, dnsx, shuffledns, altdns, ffuf (HTTP-based subdomain fuzzing), dnsenum.
- Brute/permutation: ffuf, dnsgen, altdns, wordlists (SecLists).
- Enrichment: whois, dig, curl, jq.
- Automation helpers: bash, Python, small scripts for merge and tagging.
Use a small set of tools you will actually run repeatedly. Do not hoard tools.
Step-by-step pipeline (copy-paste ready)
1. Passive collection – certificate transparency plus passive tools
# CT entries
curl -s "https://crt.sh/?q=%25.example.com&output=json" | jq -r '.[].name_value' | sed 's/\*\.//g' | sort -u > crt_subs.txt
# amass passive
amass enum -passive -d example.com -o amass_passive.txt
# subfinder passive
subfinder -d example.com -silent -o subfinder_passive.txt
Explanation: These commands gather hostnames from public records. They are non-intrusive.
2. Merge passive outputs
cat crt_subs.txt amass_passive.txt subfinder_passive.txt | sort -u > passive_merged.txt
Explanation: Deduplicate passive results to form the base list.
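Before brute forcing, normalise the merged list: passive sources emit mixed case, wildcard prefixes, trailing dots and out-of-scope names. A minimal sketch; the sample file stands in for your real passive_merged.txt, so run it in a scratch directory:

```shell
# passive_sample.txt stands in for passive_merged.txt.
cat > passive_sample.txt <<'EOF'
WWW.Example.com
*.api.example.com
dev.example.com.
dev.example.com
evil.example.org
EOF

# Lowercase, strip wildcard prefixes and trailing dots, enforce scope, dedupe.
tr 'A-Z' 'a-z' < passive_sample.txt \
  | sed -e 's/^\*\.//' -e 's/\.$//' \
  | grep -E '\.example\.com$' \
  | sort -u > passive_clean.txt

cat passive_clean.txt
```

The scope filter matters: certificate transparency in particular returns names for unrelated domains sharing a certificate.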
3. Active discovery – wordlist and brute force
Pick a tuned wordlist. Use company names, product names, environment words, and SecLists.
ffuf -w /path/to/subdomains-top1million-5000.txt -u http://FUZZ.example.com -mc all -fs 0 -t 50 -o ffuf_subs.json
Note: ffuf here probes candidates over HTTP, so it only finds names that both resolve and serve web content; -fs 0 filters out empty responses. For pure DNS brute force, use a resolver pipeline such as dnsx with a wordlist, or massdns as in step 5.
4. Permutation and candidate generation
# use altdns to generate permutations
altdns -i passive_merged.txt -o altdns_out.txt -w /path/to/words.txt
Alternative: use dnsgen to create permutations then test them.
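Under the hood, permutation tools combine known names with environment words. A minimal plain-shell sketch of the idea (hosts.txt and words.txt here are illustrative samples, not real output):

```shell
# Generate simple permutations: word.host, word-label, label-word.
cat > hosts.txt <<'EOF'
api.example.com
app.example.com
EOF
cat > words.txt <<'EOF'
dev
staging
EOF

while read -r host; do
  # split the first label from the rest: api.example.com -> api / example.com
  label=${host%%.*}
  rest=${host#*.}
  while read -r word; do
    echo "$word.$host"            # e.g. dev.api.example.com
    echo "$word-$label.$rest"     # e.g. dev-api.example.com
    echo "$label-$word.$rest"     # e.g. api-dev.example.com
  done < words.txt
done < hosts.txt | sort -u > perm_candidates.txt

wc -l < perm_candidates.txt
```

Real tools apply many more patterns (number increments, prefix swaps), but every candidate still has to be resolved before it counts.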
5. Fast resolution with massdns or shuffledns
# build the candidate list from passive results plus permutations
cat passive_merged.txt altdns_out.txt | sort -u > all_candidates.txt
# massdns
massdns -r resolvers.txt -t A -o S -w massdns_results.txt all_candidates.txt
# or shuffledns (faster and simpler)
shuffledns -d example.com -list all_candidates.txt -r resolvers.txt -mode resolve -o shuffledns_resolved.txt
Explanation: Resolve candidate names quickly to check which ones are live.
6. Enrich resolved hosts with dnsx
cat shuffledns_resolved.txt | dnsx -resp -a -cname -silent -o dnsx_enriched.txt
Explanation: dnsx gives you A records, CNAMEs and basic enrichment for each host. Use this to separate hosts that can be probed further.
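A useful follow-up is grepping the enriched output for CNAMEs pointing at commonly takeover-prone services. A sketch, assuming dnsx prints lines like `host [CNAME] [target]`; the exact format can vary by version, so check your own output first:

```shell
# dnsx_sample.txt stands in for dnsx_enriched.txt.
cat > dnsx_sample.txt <<'EOF'
shop.example.com [CNAME] [example.myshopify.com]
docs.example.com [CNAME] [example.github.io]
www.example.com [A] [203.0.113.10]
EOF

# Flag CNAMEs that point at third-party platforms worth a takeover check.
grep -E '\[CNAME\].*(github\.io|herokuapp\.com|myshopify\.com|azurewebsites\.net|s3\.amazonaws\.com)' \
  dnsx_sample.txt > takeover_candidates.txt

cat takeover_candidates.txt
```

The pattern list is a starting point only; extend it as you learn which providers your target uses.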
7. Wildcard detection
# check wildcard behaviour
echo "random1random2.example.com" | massdns -r resolvers.txt -t A -o S -w /dev/stdout
If a random host resolves, the zone has wildcard DNS. Handle it by recording the wildcard IPs and CNAMEs, then keeping only hosts that resolve somewhere else or return distinct content.
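That filtering step can be sketched offline: collect the IPs that random labels return, then drop candidates resolving to the same set. The sample files below stand in for real massdns -o S output (`name. TYPE rdata`):

```shell
# wildcard_probe.txt: massdns answers for random labels (the wildcard IPs).
# resolved_sample.txt: massdns answers for real candidates.
cat > wildcard_probe.txt <<'EOF'
zx9random1.example.com. A 198.51.100.7
qk2random2.example.com. A 198.51.100.7
EOF
cat > resolved_sample.txt <<'EOF'
api.example.com. A 203.0.113.20
junkname.example.com. A 198.51.100.7
admin.example.com. A 203.0.113.21
EOF

# Collect the wildcard IPs, then keep only hosts that resolve elsewhere.
awk '{print $3}' wildcard_probe.txt | sort -u > wildcard_ips.txt
awk 'NR==FNR {bad[$1]; next} !($3 in bad) {print $1}' wildcard_ips.txt resolved_sample.txt \
  | sed 's/\.$//' > non_wildcard_hosts.txt

cat non_wildcard_hosts.txt
```

This is IP-based filtering only; some wildcard setups answer with varying IPs, in which case fall back to the HTTP content checks below.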
8. Final dedupe and create live list
# extract hostnames from the ffuf JSON first (result field layout can vary by ffuf version)
jq -r '.results[].input.FUZZ' ffuf_subs.json | sed 's/$/.example.com/' > ffuf_subs.txt
cat passive_merged.txt ffuf_subs.txt altdns_out.txt | sort -u > merged_candidates.txt
shuffledns -d example.com -list merged_candidates.txt -r resolvers.txt -mode resolve -o final_resolved.txt
Open final_resolved.txt and mark hosts as web-enabled, API, or internal-only.
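A rough first pass at that marking can be scripted from naming conventions alone (heuristic only; confirm every guess by probing):

```shell
# live_sample.txt stands in for final_resolved.txt.
cat > live_sample.txt <<'EOF'
api.example.com
www.example.com
vpn.example.com
staging-api.example.com
EOF

# Triage by hostname pattern into host,category pairs.
while read -r host; do
  case "$host" in
    api.*|*-api.*|*.api.*) echo "$host,api" ;;
    vpn.*|mail.*|ns?.*)    echo "$host,internal-only" ;;
    *)                     echo "$host,web" ;;
  esac
done < live_sample.txt > triage.csv

cat triage.csv
```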
Wildcard handling and false positives
Wildcard DNS often causes false positives. Do not assume a resolved host is meaningful.
Quick check to filter:
# fetch HTTP status and content length (GET with the body discarded, so size_download is meaningful)
cat final_resolved.txt | while read -r host; do curl -s --max-time 10 -o /dev/null -w "$host %{http_code} %{size_download}\n" "http://$host"; done > http_status.txt
If every random subdomain returns 200 with the same content length, treat the zone as wildcard. Focus on hosts with unique content, different headers, or different titles.
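That check can be automated by clustering the http_status.txt lines by status and size; large clusters are almost always the wildcard catch-all page. A sketch over sample data in the same `host status size` format:

```shell
# http_sample.txt stands in for http_status.txt (host status size).
cat > http_sample.txt <<'EOF'
a.example.com 200 1234
b.example.com 200 1234
c.example.com 200 1234
api.example.com 200 98231
admin.example.com 401 512
EOF

# Count hosts per (status, size) pair, biggest clusters first.
awk '{count[$2" "$3]++} END {for (k in count) print count[k], k}' http_sample.txt \
  | sort -rn > clusters.txt

cat clusters.txt
```

Here the three-host 200/1234 cluster is the wildcard page; the singleton responses are the hosts worth manual review.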
Prioritisation for web app recon
Tag hosts with simple flags:
- Web response yes/no
- Title/Server header present
- Contains JS yes/no
- Linked to main app high/medium/low
Priority rules:
- Hosts with web response, JS and API patterns are top priority.
- Hosts found in certs and repos are higher confidence.
- Hosts on unfamiliar CDNs or external services may be interesting for takeover checks.
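The flags above can be rolled into a numeric score. A sketch, assuming a hypothetical tags.csv of host,web,js,api booleans that you fill in from your enrichment output:

```shell
# tags.csv is hypothetical: host,web,js,api (1 = yes, 0 = no).
cat > tags.csv <<'EOF'
api.example.com,1,1,1
www.example.com,1,1,0
old.example.com,0,0,0
EOF

# Web response +1, JS present +1, API pattern +2; highest score first.
awk -F, '{print $2 + $3 + 2*$4 "," $1}' tags.csv | sort -t, -k1,1 -rn > priority.txt

cat priority.txt
```

The weights are arbitrary; the point is a repeatable ordering so you always start manual review at the top of the same list.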
What to do after enumeration
- Send live web hosts to URL collection and JS harvesting.
- Add hosts with CNAMEs to Cloud provider checks and takeover checks.
- Move high priority hosts into parameter discovery and fuzzing.
- Document any suspicious or leaked hostnames in your findings template.
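If you do not have a findings template yet, a minimal CSV tracker is enough to start; the column set here is a suggestion, not a standard:

```shell
# Create a minimal findings tracker with one example row.
printf 'host,source,cname,status,priority,notes\n' > tracker.csv
printf 'staging.example.com,crt.sh,,200,high,exposed admin UI\n' >> tracker.csv
cat tracker.csv
```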
Practical use-cases
- Finding staging applications that expose admin UIs.
- Discovering API subdomains not mentioned in main docs.
- Identifying subdomains that point to third-party hosting where takeover is possible.
- Mapping forgotten developer services that may have weak auth.
Mini lab exercise – 30 minutes
- Use a lab domain you own or an allowed test domain. Replace example.com below.
- Passive collection:
curl -s "https://crt.sh/?q=%25.example.com&output=json" | jq -r '.[].name_value' | sed 's/\*\.//g' | sort -u > crt_subs.txt
amass enum -passive -d example.com -o amass_passive.txt
subfinder -d example.com -silent -o subfinder_passive.txt
- Merge and generate permutations:
cat crt_subs.txt amass_passive.txt subfinder_passive.txt | sort -u > passive_merged.txt
altdns -i passive_merged.txt -o altdns_out.txt -w /path/to/words.txt
cat passive_merged.txt altdns_out.txt | sort -u > all_candidates.txt
- Resolve candidates:
shuffledns -d example.com -list all_candidates.txt -r resolvers.txt -mode resolve -o shuffledns_resolved.txt
- Quick HTTP check:
cat shuffledns_resolved.txt | dnsx -silent -resp -a -cname -o dnsx_final.txt
- Open top three unique hosts in your test browser and save a note in tracker.csv.
This exercise will give you a clean list of live hosts ready for the next Parts.
Common mistakes and fixes
Mistake: treating wildcard responses as valid hosts.
Fix: check content length, titles and headers; use random domain test.
Mistake: using a massive wordlist without rate limiting.
Fix: use tuned lists, increase parallelism carefully, obey scope.
Mistake: not enriching CNAMEs.
Fix: collect CNAMEs and map to cloud providers to spot takeover opportunities.
Mistake: relying on a single tool.
Fix: combine passive and active sources for coverage and validation.
Commands summary – copy-paste
Certificate transparency:
curl -s "https://crt.sh/?q=%25.example.com&output=json" | jq -r '.[].name_value' | sed 's/\*\.//g' | sort -u > crt_subs.txt
Passive collectors:
amass enum -passive -d example.com -o amass_passive.txt
subfinder -d example.com -silent -o subfinder_passive.txt
Permutation:
altdns -i passive_merged.txt -o altdns_out.txt -w /path/to/words.txt
Resolution:
shuffledns -d example.com -list all_candidates.txt -r resolvers.txt -mode resolve -o shuffledns_resolved.txt
HTTP enrichment:
cat shuffledns_resolved.txt | dnsx -resp -a -cname -silent -o dnsx_enriched.txt
Wildcard check:
echo "random$(date +%s).example.com" | massdns -r resolvers.txt -t A -o S -w /dev/stdout
Short checklist – copy into your notes
- Run CT and passive collectors.
- Merge passive outputs.
- Generate permutations and candidate list.
- Resolve candidates with massdns or shuffledns.
- Run HTTP enrichment and filter wildcards.
- Prioritise hosts for JS and URL collection.
Next steps and where this feeds
- Move web-enabled hosts into Part 11 for URL collection.
- Send hosts with JS to Part 13 for frontend reconnaissance.
- Put hosts with odd CNAMEs into Part 8 and Part 30 for takeover and cloud checks.
Closing notes
Subdomain enumeration is both art and engineering.
Combine many sources, validate aggressively, and keep your lists clean.
When you find the small, forgotten host, follow the chain. The important stuff often hides behind one careless subdomain.
Next post preview
Part 4 – GitHub and Repo Scraping for Endpoints and Secrets.
We will cover GitHub search tips, API queries, regexes for secrets, safe verification, and responsible disclosure steps.
Disclaimer
This material is for educational purposes only. Use it ethically and only against targets you own or have explicit permission to test. Do not use any techniques described here in ways that break laws, platform rules or third-party rights. If in doubt, stop and get permission.

