Save and Search Your Web Traffic Forever with elasticArchive for Mitmproxy
Log and perform full-text searches on all of your web traffic with Mitmproxy and ElasticArchive, a tool for bug bounty hunters, red teams, and OSINT analysts.
Introducing ElasticArchive — a Mitmproxy Add-on to Store Everything in Elasticsearch
I was looking for an easy way to record all of my web traffic in Elasticsearch so that I could run full-text searches over complete requests and responses for cookie names, parameter names, strange URLs, and short-lived content, but I couldn’t find one. So I made one: elasticArchive.
How elasticArchive Works
ElasticArchive is an add-on for the popular mitmproxy, a free and open-source HTTPS proxy. It creates an HTTP/HTTPS/WebSocket proxy and captures every request and response in full in an Elasticsearch server of your choice. I run everything through Docker, which is the easiest way to get it running.
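In mitmproxy, an add-on is just a Python object whose methods match mitmproxy’s event hook names. As a rough sketch of the idea (not the actual elasticArchive code; the endpoint, index name, and document fields here are all illustrative), an add-on can flatten each completed flow into a JSON document and POST it to Elasticsearch:

```python
import json
import urllib.request

ES_URL = "http://localhost:9200"  # assumed Elasticsearch endpoint
INDEX = "web-traffic"             # hypothetical index name


def flow_to_doc(flow):
    """Flatten a mitmproxy HTTP flow into a JSON-serialisable document."""
    return {
        "timestamp": flow.request.timestamp_start,
        "method": flow.request.method,
        "url": flow.request.pretty_url,
        "request_headers": dict(flow.request.headers),
        "request_body": flow.request.get_text(strict=False),
        "status_code": flow.response.status_code,
        "response_headers": dict(flow.response.headers),
        "response_body": flow.response.get_text(strict=False),
    }


class ElasticArchive:
    """Minimal add-on: index every completed request/response pair."""

    def response(self, flow):
        # mitmproxy calls this hook once the full response has arrived.
        data = json.dumps(flow_to_doc(flow)).encode()
        req = urllib.request.Request(
            f"{ES_URL}/{INDEX}/_doc",
            data=data,
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        urllib.request.urlopen(req)


addons = [ElasticArchive()]
```

A script like this would be loaded with `mitmproxy -s elastic_archive.py` (filename hypothetical).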
Who is ElasticArchive For?
While anyone can use it, I created it with the following use cases in mind.
Bug Bounty Hunters
If you’re a bug bounty hunter, you can use this tool to record every request and response on every website you visit. This is how I use it. Only traffic from my testing browser runs through elasticArchive; everything else in my day-to-day browser bypasses the proxy and goes straight to the web. This limits the logs in Elasticsearch to traffic that is in scope for testing, or that was generated while testing in-scope websites and applications.
Why Not Just Use Burp Suite Search?
Search is only available in Burp Suite Pro. Even though I’m a Pro user and search is available to me, I still wanted a tool that collates all of my testing logs in one searchable place. Like many of you, I create an individual Burp project file for each company or program that I’m testing against, which means any search is limited to that one project.
If I find a vulnerability in a library in use on one website or web application, I can search all of my old testing logs across all companies and programs for the same library and possibly the same vulnerability. I can do this by searching for input parameters, cookies, headers, or anything else within a request or response.
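As a sketch of that kind of search, assuming full request and response bodies are indexed under fields such as `request_body` and `response_body` (adjust to your own mapping), an Elasticsearch Query DSL body for finding a vulnerable library across every logged site might look like:

```python
import json


def library_search_query(term: str) -> dict:
    """Build a full-text Query DSL body matching `term` anywhere in the
    logged traffic. Field names are illustrative, not elasticArchive's
    actual mapping."""
    return {
        "query": {
            "multi_match": {
                "query": term,
                "fields": ["url", "request_body", "response_body"],
            }
        }
    }


# Sent as the JSON body of POST /web-traffic/_search (index name assumed):
body = json.dumps(library_search_query("jquery-1.12.4"))
```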
Generating a List of All Possible Input URLs
If I discover a new variant of a vulnerability that I’ve never seen before, I can use elasticArchive to generate a list of all URLs on all sites that accept some form of input. With this list, I can script the same payload against every in-scope site across all companies and programs.
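To illustrate how such a list could be built client-side (the `method` and `url` field names are assumptions about how each flow was indexed), you could pull the logged documents out of Elasticsearch and keep only the endpoints that take input:

```python
from urllib.parse import urlsplit

# Methods whose request bodies usually carry user input.
INPUT_METHODS = {"POST", "PUT", "PATCH"}


def input_urls(docs):
    """Given logged traffic documents (dicts with 'method' and 'url'
    keys), return a sorted, de-duplicated list of URLs that accept
    some form of input: any URL with a query string, or any target
    of a request method that carries a body."""
    urls = set()
    for doc in docs:
        parts = urlsplit(doc["url"])
        base = f"{parts.scheme}://{parts.netloc}{parts.path}"
        if parts.query or doc["method"] in INPUT_METHODS:
            urls.add(base)
    return sorted(urls)
```

In practice `docs` would come from paging through the index with the Elasticsearch search API; the resulting list can then be fed straight into a payload-spraying script.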
Red Teams, OSINT, and Researchers
There are commercial tools used by red teams, OSINT analysts, researchers, and law enforcement that capture and provide searchable logs of all of your web traffic. One example is the excellent Hunchly. While elasticArchive doesn’t come anywhere near the feature set of commercial offerings, it will let you log and search everything you do online. If that is your only requirement and you don’t have the budget for a commercial tool, elasticArchive may work for you.
How to Run ElasticArchive
- Spin up an Elasticsearch Docker container or cloud instance
- Spin up a Kibana container or cloud instance
- Run an elasticArchive container in your local Docker service and point it at the Elasticsearch endpoint
- Configure your browser or Burp Suite to use elasticArchive as its upstream proxy
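The first three steps could be wired together with a docker-compose file along these lines (a sketch only: the image tags reflect mid-2020 releases, and the elasticArchive service details, including the environment variable name, are guesses; see the project documentation for the real configuration):

```yaml
version: "3"
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.8.0
    environment:
      - discovery.type=single-node   # single-node dev setup
    ports:
      - "9200:9200"
  kibana:
    image: docker.elastic.co/kibana/kibana:7.8.0
    ports:
      - "5601:5601"
  elasticarchive:
    build: .                         # Dockerfile from the elasticArchive repo
    environment:
      - ELASTICSEARCH_URL=http://elasticsearch:9200   # hypothetical variable name
    ports:
      - "8080:8080"                  # proxy listener for your browser/Burp
```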
For the technical details see the elasticArchive documentation.
How to Search All of Your Web Activity
There are two ways to search your browser activity:
- Using the Elasticsearch Query DSL against the Elasticsearch API
- Using the Kibana Query Language (KQL) in the Kibana dashboard
My preference is the Kibana Query Language in the web dashboard, as it is easier to iterate on a search until you find exactly what you’re looking for. I’d use the Elasticsearch query API if I wanted to generate automated reports or build a custom web interface.
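To illustrate the difference, here is a hypothetical Kibana search-bar query next to a roughly equivalent Query DSL body for the `_search` API (the `response_body` field name is an assumption about your index mapping):

```python
# Typed into the Kibana search bar (KQL):
kql = 'response_body : "csrf_token"'

# Roughly equivalent Query DSL body, sent as JSON to the _search API:
dsl = {"query": {"match_phrase": {"response_body": "csrf_token"}}}
```

The KQL form is quicker to tweak interactively; the DSL form is what you would embed in a script or reporting tool.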
Where to Find ElasticArchive
I’ve published elasticArchive on GitHub: https://github.com/craighays/elasticArchive
Originally published at https://craighays.com on June 8, 2020.