# Gradient Vector

The gradient vector of $f(x,y)$ is given by $\nabla f = \left(\frac{\partial f}{\partial x}, \frac{\partial f}{\partial y}\right)$

The gradient vector points in the direction of steepest ascent.

### Examples

$f(x,y) = x + y \rightarrow \frac{\partial f}{\partial x} = 1, \quad \frac{\partial f}{\partial y} = 1$
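The partial derivatives can be sanity-checked numerically. Below is a minimal sketch using central finite differences; the function names are my own, not from any library:

```python
# Numerical check of the gradient of f(x, y) = x + y.
# Central differences approximate each partial derivative.

def f(x, y):
    return x + y

def numerical_gradient(f, x, y, h=1e-5):
    """Approximate (df/dx, df/dy) via central differences."""
    dfdx = (f(x + h, y) - f(x - h, y)) / (2 * h)
    dfdy = (f(x, y + h) - f(x, y - h)) / (2 * h)
    return dfdx, dfdy

print(numerical_gradient(f, 3.0, 4.0))  # both partials are 1 everywhere
```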

The gradient of the $\max$ function took me a bit to understand; it is used to compute the gradient of ReLU and the SVM multiclass loss. Think about how the function changes as you change a particular input: if $x$ is less than $y$, then a small change in $x$ has no effect on the output, which stays fixed at $y$, so the partial derivative with respect to $x$ is zero. $f(x,y) = \max(x,y) \rightarrow \frac{\partial f}{\partial x} = \mathbb{1}(x \geq y), \quad \frac{\partial f}{\partial y} = \mathbb{1}(y \geq x)$
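The indicator rule above can be sketched in a few lines and compared against a numerical check. This is my own illustration (the helper names are hypothetical), showing how the gradient "routes" entirely to the larger input, which is exactly the behavior ReLU relies on during backpropagation:

```python
# Gradient of f(x, y) = max(x, y): the whole gradient flows to
# whichever input is larger, mirroring ReLU backprop.

def max_grad(x, y):
    """Return (df/dx, df/dy) for f = max(x, y) via the indicator rule."""
    return (1.0 if x >= y else 0.0, 1.0 if y >= x else 0.0)

def numerical_grad(x, y, h=1e-5):
    """Central-difference check, valid away from the tie x == y."""
    f = lambda a, b: max(a, b)
    return ((f(x + h, y) - f(x - h, y)) / (2 * h),
            (f(x, y + h) - f(x, y - h)) / (2 * h))

print(max_grad(2.0, 5.0))        # (0.0, 1.0): only y affects the output
print(numerical_grad(2.0, 5.0))  # agrees with the indicator rule
```

Note that at the tie point $x = y$ the function has a kink, so the numerical check is only reliable away from it.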