Experience
Building systems that hold up in real environments.
My experience spans analytics, software, data workflows, forecasting, reporting, and machine learning, across roles where the work needed to be practical, reliable, and tied to real outcomes.
Career Timeline
Roles, responsibilities, and measurable contributions.
A closer view of the environments I have worked in, the kinds of systems I built, and how my role evolved across analytics, operations, reporting, and technical delivery.
At Adero Law, I design and engineer forensic analytics systems used in large-scale employment and wage compliance investigations. My work sits at the intersection of data engineering, legal modelling, and financial analysis, translating complex employment legislation and historical workforce data into structured analytical frameworks that can withstand legal scrutiny.
I build reproducible analytical pipelines that integrate payroll systems, timesheets, rosters, employment contracts, and leave records to reconstruct historical employment conditions and evaluate compliance with Australian industrial awards.
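A minimal sketch of the kind of cross-system integration step such a pipeline starts with: merging payroll and roster extracts into one analysis-ready frame and surfacing records that exist in one system but not the other. All column names and values here are hypothetical, not drawn from any real matter.

```python
# Illustrative sketch: joining payroll and roster data on employee and
# pay period. Column names and figures are hypothetical placeholders.
import pandas as pd

payroll = pd.DataFrame({
    "employee_id": [101, 101, 102],
    "period_start": pd.to_datetime(["2021-01-04", "2021-01-18", "2021-01-04"]),
    "gross_paid": [1520.00, 1480.50, 1610.75],
})

roster = pd.DataFrame({
    "employee_id": [101, 101, 102],
    "period_start": pd.to_datetime(["2021-01-04", "2021-01-18", "2021-01-04"]),
    "rostered_hours": [38.0, 36.5, 40.0],
})

# An outer merge with an indicator column flags records present in only
# one source -- a basic integrity check before any entitlement modelling.
merged = payroll.merge(
    roster, on=["employee_id", "period_start"], how="outer", indicator=True
)
unmatched = merged[merged["_merge"] != "both"]
print(len(unmatched))  # → 0 here: every pay period has a matching roster entry
```

Carrying the `_merge` indicator through to reporting keeps the audit trail explicit: every downstream figure can be traced back to a fully matched source record.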
Key Contributions
- Designed and implemented Python-based forensic analytics frameworks capable of analysing millions of payroll, roster, and timesheet records across multi-year employment histories.
- Built programmatic models implementing 20+ award clauses, penalty rules, overtime structures, and entitlement conditions, translating complex legal language into deterministic computational logic.
- Engineered end-to-end analytical pipelines that combine payroll, HR, roster, and leave datasets into unified, audit-ready analytical environments.
- Developed automated entitlement calculation engines capable of reconstructing historical employment conditions across hundreds of employees and thousands of pay periods.
- Built reconciliation systems comparing actual payroll payments against award-based entitlements, enabling identification of potential underpayments and compliance discrepancies.
- Implemented multi-layer data validation frameworks, including cross-dataset reconciliation, anomaly detection, and integrity checks, significantly improving analytical reliability.
- Created structured reporting systems and reproducible analytical outputs used by lawyers and analysts in case preparation and legal investigations.
- Engineered automation workflows that converted previously manual financial analysis processes into fully reproducible, scalable Python pipelines.
- Collaborated closely with legal teams to validate modelling assumptions, interpret analytical outputs, and explain complex findings in a defensible, evidence-ready format.
- Maintained extensive documentation and version-controlled analytical workflows to ensure traceability, transparency, and reproducibility of analytical results.
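To illustrate the reconciliation idea described above, here is a deliberately simplified sketch assuming a single hypothetical rule: hours beyond 38 per week are paid at 1.5x the base rate. The real frameworks implement many interacting clauses; the names, rates, and figures below are illustrative only.

```python
# Minimal reconciliation sketch: compare actual pay against a modelled
# entitlement under one hypothetical overtime rule. All values are
# illustrative placeholders, not real award parameters.
from dataclasses import dataclass

@dataclass
class PayPeriod:
    employee_id: int
    hours_worked: float
    base_rate: float
    amount_paid: float

ORDINARY_HOURS = 38.0        # hypothetical weekly ordinary-hours cap
OVERTIME_MULTIPLIER = 1.5    # hypothetical penalty multiplier

def entitlement(p: PayPeriod) -> float:
    """Deterministic entitlement under the illustrative overtime rule."""
    ordinary = min(p.hours_worked, ORDINARY_HOURS)
    overtime = max(p.hours_worked - ORDINARY_HOURS, 0.0)
    return round(ordinary * p.base_rate
                 + overtime * p.base_rate * OVERTIME_MULTIPLIER, 2)

def reconcile(periods):
    """Flag periods where actual pay falls short of the modelled entitlement."""
    return [
        {"employee_id": p.employee_id,
         "entitled": entitlement(p),
         "paid": p.amount_paid,
         "shortfall": round(entitlement(p) - p.amount_paid, 2)}
        for p in periods
        if p.amount_paid < entitlement(p)
    ]

periods = [
    PayPeriod(101, 42.0, 30.0, 1260.0),  # 4 overtime hours paid at flat rate
    PayPeriod(102, 38.0, 30.0, 1140.0),  # paid exactly as entitled
]
print(reconcile(periods))
# → [{'employee_id': 101, 'entitled': 1320.0, 'paid': 1260.0, 'shortfall': 60.0}]
```

Keeping the entitlement function pure and deterministic is what makes outputs like these defensible: the same inputs always yield the same result, and each flagged shortfall can be decomposed clause by clause.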
This work requires a combination of data engineering, analytical modelling, and regulatory interpretation, where analytical outputs must be transparent, explainable, and defensible under external scrutiny.