Not all data comes quietly into your database.
📈 Finance and Accounting
There are certified, experienced professionals manually entering financial transactions into Excel off pieces of paper. This is not okay: it's a waste of expertise and easy to screw up. I can help you streamline that.
🧪 Alternative Data
Is there data on a website, in a spreadsheet, a PDF, or even a streaming video that you regularly need pulled into your database? I can set that up as an automated pipeline.
Here are some examples of relevant projects I've been responsible for.
USDA Market Data Report Parser
- Private project developed for a private firm
- Collects data from USDA reports: human-friendly, human-typed text reports that are challenging to parse (the data above comes from this system)
- Automated execution ensures user's database is always up to date
- Immediate e-mail notifications summarize new report contents with charts and statistics
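To give a feel for the kind of parsing involved, here is a minimal sketch of extracting price rows from a human-typed text report with regular expressions. The report format, field names, and numbers below are invented for illustration; the production parser handles far messier input.

```python
import re

# Hypothetical line shape from a human-typed market report, e.g.:
#   "CHOICE STEERS   1200-1400 lbs   185.00-187.50"
# (real reports vary far more; this is only a sketch)
LINE_RE = re.compile(
    r"(?P<grade>[A-Z ]+?)\s+"
    r"(?P<lo_wt>\d{3,4})-(?P<hi_wt>\d{3,4})\s*lbs\s+"
    r"(?P<lo_px>\d+\.\d{2})-(?P<hi_px>\d+\.\d{2})"
)

def parse_report(text: str) -> list[dict]:
    """Pull price rows out of a plain-text report, skipping narrative lines."""
    rows = []
    for line in text.splitlines():
        m = LINE_RE.search(line)
        if m:
            rows.append({
                "grade": m["grade"].strip(),
                "weight_lbs": (int(m["lo_wt"]), int(m["hi_wt"])),
                "price": (float(m["lo_px"]), float(m["hi_px"])),
            })
    return rows

sample = """\
NATIONAL DAILY CATTLE SUMMARY
CHOICE STEERS   1200-1400 lbs   185.00-187.50
SELECT HEIFERS  1100-1300 lbs   178.25-180.00
Comments: trade moderate, demand good.
"""
```

Narrative lines like the header and comments simply fail the pattern and are skipped, which is what makes human-typed reports tractable at all.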
Charity Job Hub
Multiple Sites, Unified Filtering
- Collected postings from the largest charity and non-profit job boards and harmonized them into a single database of listings
- Provided a filtering interface so users could select postings relevant to their careers
- Supported both positive ("must have one of ...") and negative ("must not have any of ...") filters on all important columns
- Helped users keep track of relevant postings with a calendar of application deadlines and in-interface marking of "applied" and "interesting" jobs
- Updated daily and deleted expired postings (which many sites leave up for months)
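The positive/negative filter idea is simple to sketch. The column names and job data below are hypothetical, not the site's actual schema:

```python
# Sketch of combined positive/negative filtering over posting columns.
def matches(posting: dict, must_have: dict, must_not_have: dict) -> bool:
    """Positive filter: the column's value must be in the allowed set.
    Negative filter: the column's value must not be in the banned set."""
    for col, allowed in must_have.items():
        if posting.get(col) not in allowed:
            return False
    for col, banned in must_not_have.items():
        if posting.get(col) in banned:
            return False
    return True

jobs = [
    {"title": "Grant Writer", "sector": "health", "location": "Remote"},
    {"title": "Fundraiser", "sector": "arts", "location": "NYC"},
]
hits = [j for j in jobs if matches(
    j,
    must_have={"location": {"Remote"}},   # "must have one of ..."
    must_not_have={"sector": {"arts"}},   # "must not have any of ..."
)]
```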
Do you think you have a profitable trading idea, but want it independently verified? I have spent thousands of hours avoiding being fooled by randomness. Go beyond the built-in backtesting in MetaTrader, MultiCharts, and TradingView and get an independent report on your strategy's performance. You'll get a devil's advocate to rein you in before you lever up: an understandable analysis of your strategy from the perspective of a practitioner, not a statistician's report.
I created the data acquisition, analysis, and visualization system for a family office trading livestock futures.
It is a multi-threaded scraping system built around a regular-expression meta-language I created to aid the parsing of USDA reports. The system also collects data via JSON, XML, XLS, and HTML scraping. An FTP connection to the broker enables cross-referencing futures data with account and position details. Automated emails notify traders of liquid spreads and contracts near expiry. A self-hosted Tableau Server (on AWS) that I installed and configured integrates all of the above data. I know a little something about roll yield and basis, and enough about options to handle your data.
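The core idea behind such a meta-language can be sketched in a few lines: short placeholders expand into the verbose regex fragments that text reports need, so parsing rules stay readable. The tokens, templates, and field names below are invented for illustration and are not the actual meta-language:

```python
import re

# Toy placeholder-to-regex table; invented for illustration.
TOKENS = {
    "{num}":  r"\d{1,3}(?:,\d{3})*(?:\.\d+)?",   # e.g. 2,350 or 185.5
    "{word}": r"[A-Za-z]+",
}

def compile_rule(template: str, fields: list[str]) -> re.Pattern:
    """Expand each placeholder into a named capture group, left to right,
    escaping everything else so it matches literally."""
    token_re = re.compile("|".join(re.escape(t) for t in TOKENS))
    parts, last, names = [], 0, iter(fields)
    for m in token_re.finditer(template):
        parts.append(re.escape(template[last:m.start()]))
        parts.append(f"(?P<{next(names)}>{TOKENS[m.group()]})")
        last = m.end()
    parts.append(re.escape(template[last:]))
    return re.compile("".join(parts))

# A parsing rule now reads like the report line it targets:
rule = compile_rule("Receipts: {num} head", ["receipts"])
m = rule.search("Receipts: 2,350 head   Week Ago: 2,100")
```

The payoff is maintainability: when a report's wording shifts, you edit a template that looks like the report rather than a wall of raw regex.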
I've spent countless hours making custom indicators and automated trading strategies, as well as building trading simulators and macro analytics systems. I haven't found a money machine yet, but I know how to avoid a false sense of security in backtested data.
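Here's a small, self-contained demonstration of that false sense of security, using purely synthetic data (no real market data, no real strategies): pick the best of many strategies that have no edge at all, and its in-sample Sharpe ratio still looks like skill.

```python
import random
import statistics

random.seed(0)  # deterministic for the demo

def daily_sharpe(returns):
    return statistics.mean(returns) / statistics.pstdev(returns)

# 200 coin-flip "strategies" with zero true edge, one trading year each
strategies = [[random.gauss(0.0, 0.01) for _ in range(252)]
              for _ in range(200)]

# Cherry-picking the best one makes pure randomness look like skill:
# the annualized in-sample Sharpe is typically well above 1,
# even though every strategy's expected Sharpe on fresh data is zero.
best = max(strategies, key=daily_sharpe)
in_sample = daily_sharpe(best) * 252 ** 0.5   # annualized
```

The more strategies, parameters, or indicator tweaks you try, the worse this selection bias gets, which is why an independent, skeptical evaluation is worth having before real money is at stake.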
If you have similar needs, reach out!
- Code Example: Extracting Historical Data from Interactive Brokers API
- Complement to Standard Move Risk Management: Volatility Bands
- Method to Sync MultiCharts Database + Studies Across Computers
- Function + Indicator for Risk Management: Standard Move
- Interactive Brokers Order Types and Algo Overview (For Ordinary Folks)