Creates a gradient function from a cost function using finite differences.

This is a convenience wrapper around `finiteDiffGradient` that returns a gradient function suitable for use with optimization algorithms such as `gradientDescent`.

**Parameters**

- `costFn` — The cost function to differentiate
- `options` (optional, `NumericalDifferentiationOptions`) — Optional numerical differentiation settings

**Returns**

A gradient function that can be passed to optimization algorithms.

**Example**

```js
import { gradientDescent, createFiniteDiffGradient } from 'numopt-js';

// Define your cost function
const costFn = (params) => Math.pow(params[0] - 3, 2) + Math.pow(params[1] - 2, 2);

// Create a gradient function (no need to worry about parameter order!)
const gradientFn = createFiniteDiffGradient(costFn);

// Use it with an optimizer
const result = gradientDescent(
  new Float64Array([0, 0]),
  costFn,
  gradientFn,
  { maxIterations: 100, tolerance: 1e-6 }
);
```
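For intuition, the kind of finite-difference gradient this helper produces can be sketched as below. This is a standalone illustration, not the library's actual implementation: the central-difference scheme and the fixed step size `h = 1e-6` are assumptions, and `finiteDiffGradient` may use a different scheme or an adaptive step.

```js
// Sketch of a finite-difference gradient, independent of numopt-js.
// ASSUMPTION: central differences with a fixed step h; the real
// finiteDiffGradient may differ (e.g. forward differences, adaptive h).
function sketchFiniteDiffGradient(costFn, h = 1e-6) {
  return (params) => {
    const grad = new Float64Array(params.length);
    const x = Float64Array.from(params); // scratch copy we perturb in place
    for (let i = 0; i < params.length; i++) {
      x[i] = params[i] + h;
      const fPlus = costFn(x);
      x[i] = params[i] - h;
      const fMinus = costFn(x);
      x[i] = params[i];                     // restore before next coordinate
      grad[i] = (fPlus - fMinus) / (2 * h); // central-difference quotient
    }
    return grad;
  };
}
```

For the cost function from the example above, the analytic gradient at `[0, 0]` is `[-6, -4]`, and this sketch recovers it to within floating-point error.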