axtreme.utils.gradient¶
Helper for gradient assessment.
NOTE: currently just used as a helper in tests. Could move there if we do not consider it useful to users.
Functions
is_smooth_1d(x, y[, d1_threshold, d2_threshold]) | Helper to warn if a 1d function is likely not smooth (twice differentiable).
- axtreme.utils.gradient.is_smooth_1d(x: Tensor, y: Tensor, d1_threshold: float = 3.0, d2_threshold: float = 150, *, plot: bool = False, test: bool = True) → None | Figure ¶
Helper to warn if a 1d function is likely not smooth (twice differentiable).
- Parameters:
x – (n,) points representing the x (input) values of a function
y – (n,) points representing the y (output) values of a function
d1_threshold – Maximum step size allowed in the 1st derivative for the function to be considered smooth.
d2_threshold – Maximum step size allowed in the 2nd derivative for the function to be considered smooth.
plot – If true, will plot the first and second derivative functions.
test – If true, assert statements will be run.
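For example, a minimal call using the documented defaults (the test function here is illustrative; per the signature, a Figure can be returned when plotting is requested):

```python
import torch

from axtreme.utils.gradient import is_smooth_1d

x = torch.linspace(0, 1, 1_000)
y = torch.sin(2 * torch.pi * x)  # a smooth (twice differentiable) function

# Runs the assert-based checks with the default thresholds.
is_smooth_1d(x, y)

# Inspect the first and second derivative estimates without asserting.
fig = is_smooth_1d(x, y, plot=True, test=False)
```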
- Details
- Smoothness: defined up to K, the Kth derivative you can take that gives a continuous function over the domain.
C_0: set of continuous functions
- C_1 (once differentiable):
C_0 can be differentiated
AND the resulting function is continuous (no steps or holes)
…
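For intuition: y = |x| is C_0 but not C_1, because it is continuous yet its derivative jumps at x = 0. A quick illustration using torch.gradient (illustrative only, not part of this module):

```python
import torch

x = torch.linspace(-1, 1, 1_001)
y = x.abs()  # continuous (C_0), but not once differentiable (C_1)

dy = torch.gradient(y, spacing=(x,))[0]  # finite-difference estimate of f'(x)
# f' jumps from roughly -1 to +1 around x = 0, so it is not continuous.
print(dy.min().item(), dy.max().item())
```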
- We want C_2 (twice differentiable) for optimisation. As such, here we test (see the sketch after this list):
- if f’(x) is continuous:
Check that there are no steps, i.e. no big changes between consecutive points.
- Knowing an appropriate step threshold is hard; a gradient is easier to reason about.
An easy way to do this is to check that a maximum slope is not exceeded (the slope of f’ is the 2nd derivative).
- if f’’(x) is continuous:
Check that there are no steps, i.e. no big changes between consecutive points.
Again, an easy way to do this is to check that a maximum slope is not exceeded.
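A minimal sketch of the kind of check described above. This is not the actual implementation: the function name is hypothetical, and exactly how the thresholds are applied is an assumption based on the parameter descriptions ("maximum step size allowed"); consult the source for the real behaviour.

```python
import torch

def sketch_is_smooth_1d(
    x: torch.Tensor,
    y: torch.Tensor,
    d1_threshold: float = 3.0,
    d2_threshold: float = 150.0,
) -> None:
    """Illustrative only: flag likely discontinuities in f' and f''."""
    d1 = torch.gradient(y, spacing=(x,))[0]   # finite-difference estimate of f'(x)
    d2 = torch.gradient(d1, spacing=(x,))[0]  # finite-difference estimate of f''(x)
    # A discontinuity shows up as a large jump between consecutive estimates.
    assert d1.diff().abs().max() < d1_threshold, "f' appears to have a step: not C_1"
    assert d2.diff().abs().max() < d2_threshold, "f'' appears to have a step: not C_2"
```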
NOTE: The thresholds have been set heuristically. It is recommended to plot your function and establish smoothness visually, then use this function for regression testing once suitable values have been determined.
NOTE: The smaller the step size, the better the estimates of the gradients (1st and 2nd). See `y = sin(50 * x)` with `torch.linspace(0, 1, 100)` and `torch.linspace(0, 1, 1000)`, demonstrated below.
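To see the effect of step size, compare the finite-difference estimate of the derivative of `y = sin(50 * x)` on the two grids mentioned above:

```python
import torch

for n in (100, 1_000):
    x = torch.linspace(0, 1, n)
    y = torch.sin(50 * x)
    d1 = torch.gradient(y, spacing=(x,))[0]
    # The true maximum of f'(x) = 50*cos(50x) is 50; the coarse grid underestimates it.
    print(f"n={n}: max |f'| estimate = {d1.abs().max().item():.1f}")
```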