What are you doing, step size: fast and accurate computation of marginal effects in parallel
Abstract
Numerical derivatives are fundamental in gradient-based optimisation algorithms used in statistics, machine learning, and scientific computing. We develop improved numerical differentiation algorithms and a new open-source R package ‘pnd’ (Parallel Numerical Derivatives) containing their fast, accurate, parallelised implementations. The talk is organised as follows:
- We begin by providing practical methods for selecting optimal step sizes in numerical derivatives by balancing truncation and machine-rounding errors. We derive closed-form expressions for arbitrary differencing orders and accuracy levels.
- Next, we explore four popular data-driven step-size-selection algorithms – previously unavailable in open-source software – and show how our enhancements improve their speed and stability.
- We demonstrate how parallelised versions of these algorithms substantially reduce run-time on multi-core systems.
- Lastly, we showcase the pnd package, designed as a drop-in replacement for the popular numDeriv package. Its functions are compatible with any gradient-based optimiser or solver. It offers the user control over accuracy–speed trade-offs and provides suggestions on multi-core usage to minimise overhead.
- We show how to harness the power of modern multi-core machines in practice for fast gradient and Jacobian calculation of computationally intensive functions.
- Benchmarks against multiple software packages highlight pnd’s accuracy and robustness, with visualisations emphasising the importance of proper step-size selection.
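The step-size trade-off in the first point can be illustrated with a short sketch (in Python for brevity; the talk itself concerns R). For a two-point central difference, the truncation error shrinks like h² while the machine-rounding error grows like ε/h, so the total error is minimised near h ∝ ε^(1/3); the exact constant depends on higher derivatives of the function, which this sketch ignores.

```python
import math

def central_diff(f, x, h):
    """Two-point central difference approximation of f'(x).

    Truncation error ~ (h**2 / 6) * f'''(x); rounding error ~ eps * |f(x)| / h.
    """
    return (f(x + h) - f(x - h)) / (2 * h)

eps = 2.0 ** -52          # double-precision machine epsilon
x = 1.0
true_deriv = math.exp(x)  # d/dx exp(x) = exp(x)

# Rule of thumb: h ~ eps**(1/3) roughly balances the two error sources.
h_rule = eps ** (1 / 3)
for h in (1e-2, h_rule, 1e-10):
    err = abs(central_diff(math.exp, x, h) - true_deriv)
    print(f"h = {h:.3e}  error = {err:.3e}")
```

Running this shows the error is largest for both the too-big step (truncation dominates) and the too-small step (rounding dominates), with the ε^(1/3)-scaled step landing near the minimum; the data-driven algorithms in the talk refine this crude rule.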
Language
English
This is a free seminar. Registration is mandatory.