Work from non-invasive techniques (OSINT) up to more invasive tools (e.g., ping and active scanning)
What sources are useful?
Cached pages and web archived information
Keywords, site-specific and file-specific searches (passwords, RSA token, *.pdf, etc.)
Think not just about your Google-fu but also what's shared on social media by the company and its employees
What pieces of data are we looking for?
Think high level before you deep dive
Footprint
DNS Nameservers
IP Ranges
Banners
Operating Systems
IDS/IPS and other security systems
Network Devices
Other Devices
Job postings (New and old!)
Mobile Opportunities
Social Media
Forgotten devices, or even old listservs, can be treasure troves of info
Unsecured buckets of data
What are we looking for?
Names
Jobs
Emails
Personal info
Documents
Financials, company structure
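One of the items above, harvesting emails, is easy to automate. Below is a minimal sketch, using only Python's standard library; the URL is a placeholder for a page you are authorized to search, and the regex is deliberately simple:

    # Minimal sketch: harvest email addresses from a public web page.
    # The URL is a placeholder -- point it at a page you are authorized to test.
    import re
    import urllib.request

    url = "https://www.example.com/contact"
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", errors="ignore")

    # Simple pattern; real addresses may be obfuscated ("jane at doe dot com"),
    # which is exactly the defensive trick discussed later in these notes.
    email_pattern = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")
    for address in sorted(set(email_pattern.findall(html))):
        print(address)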
*CAUTION* More invasive options *CAUTION*
DNS
Servers
Automated Collections of data and web scraping
Metadata
Documents posted? Who wrote them?
Pictures posted? Who took them, and where? (See the metadata sketch after this list)
Jobs listed? What tools/services are used?
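A minimal sketch of pulling that metadata out of collected files. It assumes the third-party libraries Pillow (PIL) and pypdf are installed; the file names are placeholders for documents and photos you have gathered:

    # Who wrote the document?
    from pypdf import PdfReader
    reader = PdfReader("collected/report.pdf")
    info = reader.metadata
    print("PDF author:", info.author if info else None)
    print("PDF creator app:", info.creator if info else None)

    # Who took the picture, and with what?
    from PIL import Image
    from PIL.ExifTags import TAGS
    img = Image.open("collected/photo.jpg")
    exif = img.getexif()
    for tag_id, value in exif.items():
        tag = TAGS.get(tag_id, tag_id)
        # Make, Model, DateTime, Software, and GPSInfo are the usual high-value
        # fields; GPSInfo may need exif.get_ifd() to expand into coordinates.
        if tag in ("Make", "Model", "DateTime", "Software", "GPSInfo"):
            print(f"EXIF {tag}: {value}")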
DNS
Dig - domain information groper
fierce - brute-force approach to sub-domain enumeration
Nslookup - less info, but almost always available
Nameservers
Wildcard sub-domains, word lists for sub-domains (see the sketch after this list)
Alternate Nameservers
What DNSSEC might there be? What does DNSSEC cover? What doesn't it cover?
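A minimal sketch of the DNS steps above: list the nameservers, then try a small word list of sub-domains (a tiny version of what fierce automates). These are live queries against the target's DNS, so they belong on the more invasive side. It assumes the third-party library dnspython is installed; the domain and word list are placeholders:

    import dns.resolver

    domain = "example.com"
    wordlist = ["www", "mail", "vpn", "dev", "staging", "intranet"]

    # Nameservers (roughly: dig NS example.com)
    for ns in dns.resolver.resolve(domain, "NS"):
        print("NS:", ns.target)

    # Word-list sub-domain enumeration; a wildcard DNS entry will make
    # everything "resolve", so check for that first in real use.
    for word in wordlist:
        name = f"{word}.{domain}"
        try:
            answers = dns.resolver.resolve(name, "A")
        except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
            continue
        for rdata in answers:
            print(f"{name} -> {rdata.address}")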
Traps
Honey traps (Older term was HoneyPot, can offend people)
Example: an email address only ever seen by a scraper can be monitored to detect activity
Fake Social Media Accounts for "Employees"
Junk info to detect scraping
White text on white background
Small text hidden in places humans are unlikely to look
Be careful of accessibility! Screen readers and bots can look similar (the sketch below keeps the trap out of the accessibility tree)
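A minimal sketch of the trap-email idea above: each page render plants a unique address that no human is shown, so any mail or lookup that hits it reveals scraping. The function names, domain, and SQLite storage are placeholder assumptions, not a real product:

    import secrets
    import sqlite3
    import time

    db = sqlite3.connect("traps.db")
    db.execute("CREATE TABLE IF NOT EXISTS traps (token TEXT, issued_at REAL, note TEXT)")

    def issue_trap_email(note="contact-page"):
        """Create a unique trap address and record when/where it was planted."""
        token = secrets.token_hex(6)
        address = f"trap-{token}@example.com"   # placeholder domain
        db.execute("INSERT INTO traps VALUES (?, ?, ?)", (token, time.time(), note))
        db.commit()
        return address

    def hidden_html(address):
        """Embed the address where a scraper will find it but a person won't.
        aria-hidden keeps it out of the accessibility tree so screen readers
        are not caught in the trap."""
        return f'<span aria-hidden="true" style="display:none">{address}</span>'

    print(hidden_html(issue_trap_email()))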
How to defeat scrapers
Jane@doe.com vs Jane at doe dot com
Limit how many searches can be done on employee lists
Have anonymous emails for positions, like CEO at Acme dot com or HR at Acme dot com
Web scrapers are bots; they have regular activity and timing
Limit requests from single IP address, or have wait times
Work with user-agent detection (mobile vs. desktop vs. headless browsers); see the sketch below
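A minimal sketch of the defenses above: a sliding-window request limit per IP address plus a crude headless-browser user-agent check. The thresholds, IPs, and user-agent list are assumptions you would tune for your own site; standard library only:

    import time
    from collections import defaultdict, deque

    WINDOW_SECONDS = 60
    MAX_REQUESTS_PER_WINDOW = 30
    SUSPECT_AGENTS = ("headlesschrome", "phantomjs", "python-requests", "curl")

    recent_requests = defaultdict(deque)  # ip -> timestamps of recent hits

    def allow_request(ip, user_agent):
        now = time.time()
        hits = recent_requests[ip]
        # Drop timestamps that have aged out of the window
        while hits and now - hits[0] > WINDOW_SECONDS:
            hits.popleft()
        hits.append(now)

        if len(hits) > MAX_REQUESTS_PER_WINDOW:
            return False, "rate limit: too many requests from this IP"
        if any(marker in user_agent.lower() for marker in SUSPECT_AGENTS):
            return False, "user agent looks like a bot or headless browser"
        return True, "ok"

    # Example: a burst of evenly timed requests (classic scraper behaviour)
    for _ in range(35):
        allowed, reason = allow_request("203.0.113.7", "Mozilla/5.0 (X11; Linux)")
    print(allowed, reason)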
Reporting
Reports are the least fun and most needed part of the job
Shows what worked, what didn't, and samples of data that was found
OpenOffice can be used, but it doesn't handle unusual file types (such as PCAP) well and is awful at large volumes
Magic Tree
Uses a tree structure for organizing data
Dradis
Uses nodes (in place of folders) for organization
Issues are the vulnerabilities that were found
Methodologies means checklists and timelines; think of project management
Accessed via an IP address or website, so it's easier to share with a whole team
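A minimal sketch of the tree/node idea behind tools like MagicTree and Dradis: organize findings as host -> service -> issues and dump them to JSON so the whole team works from the same file. The entries below are placeholders, and this is only an illustration of the data structure, not either tool's actual format:

    import json
    from collections import defaultdict

    findings = defaultdict(lambda: defaultdict(list))  # host -> service -> [issues]

    def add_issue(host, service, issue, evidence=""):
        findings[host][service].append({"issue": issue, "evidence": evidence})

    # Placeholder entries to show the shape of the tree
    add_issue("203.0.113.10", "dns/53", "zone transfer allowed", "dig axfr output")
    add_issue("203.0.113.20", "http/80", "directory listing enabled", "screenshot ref")

    with open("findings.json", "w") as fh:
        json.dump(findings, fh, indent=2)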
Suggested Activities and Discussion Topics:
Find one PASSIVE tool that wasn't mentioned here. What is it? How does it work? Why is it considered passive rather than active or invasive? What info does it give? How might that help us do recon?
In pairs, start working on a mind map of the footprint of a well-known company. OSINT ONLY. Make sure you're thinking about what data is valuable and what you can do beyond just Googling the company