Alternative Data Acquisition

Not all data comes quietly into your database.

📈 Finance and Accounting

There are experienced professionals manually entering financial transactions into Excel from pieces of paper. This is not okay. It's a waste of expertise and easy to screw up. I can help you streamline that.

🧪 Alternative Data

Is there data on a website, spreadsheet, PDF, or streaming video that you regularly need updated in your database? I can set that up as an automatic pipeline.


These are some examples of relevant projects I was responsible for.

USDA Market Data Report Parser

A snippet of an E-mail notification I created that shows the USDA afternoon beef cutout for a particular day with graphs
  • Project developed for a private firm
  • Collects data from USDA reports: human-friendly, human-typed text reports that are challenging to parse (the data shown above comes from these reports)
  • Automated execution ensures user's database is always up to date
  • Immediate e-mail notifications summarize new report contents with charts and statistics

Charity Job Hub

Multiple Sites, Unified Filtering

A list of jobs from the Charity Job Hub, showing two pinned jobs.
  • Collected job postings from the largest charity and non-profit job posting sites and harmonized the job postings into a single database of listings
  • Provided a filtering interface to users to select postings based on the relevance to their careers
  • Supported both positive ("must have one of ...") and negative ("must not have any of ...") filters on all important columns
  • Helped users keep track of relevant postings with a calendar of application deadlines and in-interface marking of "applied" and "interesting" jobs
  • Updated daily and deleted expired postings (which many sites leave up for months)
A graph of new jobs over time from the Charity Job Hub admin interface
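The positive and negative filters described above can be sketched roughly like this. This is illustrative Python only; the column names and sample postings are hypothetical, not the production schema:

```python
# Sketch of "must have one of ..." / "must not have any of ..." filtering.
# Field names and data are made up for illustration.

def matches(posting, must_have=None, must_not_have=None):
    """Return True if the posting passes both filter types.

    must_have:     {field: set of acceptable values}  ("must have one of ...")
    must_not_have: {field: set of excluded values}    ("must not have any of ...")
    """
    for field, allowed in (must_have or {}).items():
        if posting.get(field) not in allowed:
            return False
    for field, excluded in (must_not_have or {}).items():
        if posting.get(field) in excluded:
            return False
    return True

jobs = [
    {"title": "Data Analyst", "region": "Remote", "sector": "Environment"},
    {"title": "Fundraiser", "region": "Toronto", "sector": "Health"},
]

# Positive filter: only remote postings survive.
remote_only = [j for j in jobs if matches(j, must_have={"region": {"Remote"}})]
```

The real interface applied these filters per-column against a unified database, but the core selection logic is this simple.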

Reach out for help with data acquisition!

Professional Backtesting

Do you think you have a profitable trading idea, but want it independently verified? I have spent thousands of hours avoiding being fooled by randomness. Go beyond MetaTrader, MultiCharts, and TradingView's built-in backtesting and get an independent report on the performance of your strategy. You'll get a devil's advocate to rein you in before you lever up.

I'll give you an understandable analysis of your strategy from a practitioner's perspective, not a statistician's report. I worked for a futures trading firm for 3 years and traded my own money and quantitative strategies in the futures market. I have first-hand experience of the highs and lows of leverage.
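As a toy illustration of being fooled by randomness (my own sketch, not taken from any client report): a strategy with zero edge, flipping a coin each day, can still print an equity curve that looks like skill over a one-year backtest. All parameters below are arbitrary.

```python
# A coin-flip "strategy" with zero edge: +1 or -1 P&L per day.
# Over many seeds, some runs still trend up convincingly.
import random

def random_equity_curve(days=252, seed=7):
    rng = random.Random(seed)
    equity = [100.0]
    for _ in range(days):
        pnl = rng.choice([-1.0, 1.0])  # no edge whatsoever
        equity.append(equity[-1] + pnl)
    return equity

curve = random_equity_curve()

# Scan 100 seeds: the luckiest curve finishes well above its start,
# despite the strategy having no predictive power at all.
best_final = max(random_equity_curve(seed=s)[-1] for s in range(100))
```

Picking the best-looking curve after the fact is exactly the kind of selection bias a proper backtest report has to guard against.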

A snippet from a backtesting report showing that the equity curve of a random strategy can have a shape very similar to that of a seemingly profitable one

Reach out for help with your quantitative strategy!

Proprietary Trading Infrastructure

I created the data acquisition, analysis, and visualization system for a family office trading livestock futures.

It is a multi-threaded scraping system built around a regular-expression meta-language that I created to aid the parsing of USDA reports. The system also collects data from JSON, XML, XLS, and HTML sources. An FTP connection to the broker enables cross-referencing futures data with account and position details. Automated e-mails notify traders of liquid spreads and contracts near expiry. A self-hosted (AWS) Tableau Server that I installed and configured integrates all of the above data. I know a little something about roll yield and basis, and enough about options to handle your data.
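The actual parser and its meta-language are proprietary, but the underlying headache can be sketched in a few lines of plain Python with the standard re module. Human-typed reports drift in spacing and punctuation from day to day, so patterns have to be tolerant. The report lines and numbers below are made up for illustration:

```python
# Two hypothetical lines from a human-typed cutout report: same fields,
# inconsistent whitespace. A tolerant pattern handles both.
import re

line_a = "Choice Cuts   612.07 loads   24,482,817 pounds"
line_b = "Choice Cuts 598.11  loads  23,924,400   pounds"

# \s+ absorbs variable spacing; [\d,]+ absorbs thousands separators.
PATTERN = re.compile(
    r"Choice Cuts\s+(?P<loads>[\d.]+)\s+loads\s+(?P<pounds>[\d,]+)\s+pounds"
)

def parse(line):
    """Return (loads, pounds) from a report line, or None if it doesn't match."""
    m = PATTERN.search(line)
    if m is None:
        return None
    return float(m.group("loads")), int(m.group("pounds").replace(",", ""))
```

Multiply this by dozens of report sections, each with its own quirks, and the case for a meta-language that generates such patterns becomes clear.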

I've spent countless hours making custom indicators and automated trading strategies, as well as building trading simulators and macro analytics systems. I haven't found a money machine yet, but I know how to avoid a false sense of security in backtested data.

If you have similar needs, reach out!


Python is the third most loved programming language according to the Stack Overflow 2020 survey. It's the first language for many new developers, and for good reason. I switched from Perl to Python as my primary development language in 2011 because I wanted to leverage Python's modern code patterns and libraries. It stuck - for nearly 10 years. I developed many interesting tools, toys, and applications quickly and had a great time doing it.

Today, I would say that Python's primary advantage is development speed. You can get your software feature complete quickly and it'll work. The interpreter stays out of your way. You want to write a function that can return a string, or an integer, depending on how you call it? Sure! You want to multiply a string by an integer? Sure, it'll do what you probably expect. The dynamically typed interpreter and rich libraries make your job easy - at the start.
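For instance, a contrived sketch of exactly the flexibility described above:

```python
# A function whose return type depends on its argument - convenient to
# write, but every caller now has to handle two possible types.
def parse_port(value):
    return int(value) if value.isdigit() else value

port = parse_port("8080")   # an int
name = parse_port("https")  # a str

# Multiplying a string by an integer repeats it - handy for quick banners.
banner = "-" * 10
```

Nothing here is wrong, and Python runs it all without complaint; the cost only shows up later, when code like parse_port is buried three layers deep in a large code base.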

I switched from Python to Rust as my primary language after building a large code base (~30,000 lines). Python got out of my way while I was writing it, but in hindsight I wish it had slowed me down and forced me to write clearer code with better interface guarantees. I started adding type hints, but they are not enforced and did not address the challenge of refactoring. Static analysis tools exist that can enforce compliance, but I (possibly irrationally) do not trust them to retrofit guarantees onto a language designed to stay out of your way.
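A minimal contrived example of the unenforced-hints point:

```python
# Type hints document intent, but the interpreter does not check them.
def add_days(start: int, days: int) -> int:
    return start + days

# Both annotations say int; passing strings runs anyway, and "+" silently
# becomes concatenation. A static checker like mypy would flag this call,
# but plain Python will not.
result = add_days("2021-", "01")
```

The program keeps running with the wrong types flowing through it, which is precisely what makes large refactors in hinted-but-unchecked Python nerve-wracking.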

I still use Python for data science in Jupyter notebooks and quick prototypes. My more serious backend projects are all Rust now. I will admit, systems programming and the nitty-gritty detail of memory has always been attractive to me; C was my first language.

If you want help with a Python project from someone who appreciates how they can get out of hand, reach out!


In 2020 (during the COVID-19 lockdown) I stopped fantasizing about learning the Rust programming language and started building. I've always been a systems programmer at heart, and as an enthusiast for the details of security vulnerabilities and reverse engineering I have always had safety on my mind when programming. The core focus of Rust is safety. You get the low-level power of C with modern code patterns. You get fearless concurrency as easily as .into_par_iter() and the confidence that if your code compiles, you didn't screw up. Well, mostly.

Rust has a steep learning curve. The first stage is easy: you battle it out with the borrow checker, and the helpful compiler gives extremely direct and useful suggestions on how to make your code work. There is an entire world waiting beyond that stage, though: generics, traits, and some poorly documented crates. Because Rust is a young language, there are many libraries available for typical tasks, but it's not often clear which you should use. Many are abandoned, many are poorly documented, and few are consensus picks. Sometimes there is no analogue of a library you relied on in Python, and you have to build it yourself, as I did with the fbm crate.

With that said, this is my core language for the foreseeable future. The focus on safety forces me to write good code the first time instead of refactoring two years from now. The early development stage for my new projects is slower, but the total labor cost of development is actually lower due to the higher standards my code is held to. Libraries are improving, and I am making an active effort to contribute to that. The code patterns available to me in Rust are delightful: match statements and patterns are beautiful. The functional programming influence encourages very elegant code.

Rust forces humans to write good code. I humbly accept these handcuffs, and am having a great time developing higher quality applications as a result.

Reach out if you want a reliable backend interface built with Rust!


I've liberated complicated data visualizations that relied on multi-worksheet calculations and Visual Basic scripts into pure Tableau implementations. I've installed Tableau Server on a dedicated AWS virtual machine with its data stored in an adjacent AWS RDS instance for efficiency. I've migrated Tableau Server from Linux to Windows and back without losing visualizations. I have developed dozens of Tableau dashboards, simple and complicated, using neat Tableau tricks and a great deal of custom SQL to get the data structured appropriately.

Reach out if you'd like help getting your data to dance for you in Tableau.
