Cost Functions and Objectives
This page details the functions related to building and evaluating cost functions and objectives.
Cost Functions
TrajectoryOptimization.CostFunction — Type
Abstract type that represents a scalar-valued function that accepts a state and control at a single knot point.
TrajectoryOptimization.QuadraticCostFunction — Type
An abstract type that represents any CostFunction of the form
\[\frac{1}{2} x^T Q x + \frac{1}{2} u^T R u + u^T H x + q^T x + r^T u + c\]
These types all support the is_diag, is_blockdiag, and invert! methods documented below, as well as standard addition.
TrajectoryOptimization.DiagonalCost — Type
DiagonalCost{n,m,T}
Cost function of the form
\[\frac{1}{2} x^T Q x + \frac{1}{2} u^T R u + q^T x + r^T u + c\]
where $Q$ and $R$ are positive semi-definite and positive definite diagonal matrices, respectively, and $x$ is $n$-dimensional and $u$ is $m$-dimensional.
Constructors
DiagonalCost(Qd, Rd, q, r, c; kwargs...)
DiagonalCost(Q, R, q, r, c; kwargs...)
DiagonalCost(Qd, Rd; [q, r, c, kwargs...])
DiagonalCost(Q, R; [q, r, c, kwargs...])
where Qd and Rd are the diagonal vectors, and Q and R are matrices. Any optional or omitted values will be set to zero(s). The keyword arguments are
terminal - a Bool specifying whether the cost function is a terminal cost.
checks - a Bool specifying whether Q and R will be checked for the required definiteness.
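A minimal sketch of how these constructors might be used; the dimensions and weights here are illustrative, not taken from the package docs:
using LinearAlgebra, StaticArrays
using TrajectoryOptimization
Qd = @SVector ones(4)          # diagonal of Q (n = 4)
Rd = @SVector fill(0.1, 2)     # diagonal of R (m = 2)
dcost = DiagonalCost(Qd, Rd)   # q, r, and c default to zero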
TrajectoryOptimization.QuadraticCost — Type
QuadraticCost{n,m,T,TQ,TR}
Cost function of the form
\[\frac{1}{2} x^T Q x + \frac{1}{2} u^T R u + u^T H x + q^T x + r^T u + c\]
where $R$ must be positive definite, and $Q$ and $Q_f$ must be positive semidefinite. The type parameters TQ and TR specify the types of $Q$ and $R$.
Constructor
QuadraticCost(Q, R, H, q, r, c; kwargs...)
QuadraticCost(Q, R; H, q, r, c, kwargs...)
Any optional or omitted values will be set to zero(s). The keyword arguments are
terminal - a Bool specifying whether the cost function is a terminal cost.
checks - a Bool specifying whether Q and R will be checked for the required definiteness.
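As a sketch, a QuadraticCost with an explicit state-control coupling term H might be built as follows (all values illustrative):
using LinearAlgebra, StaticArrays
using TrajectoryOptimization
n, m = 3, 2
Q = Matrix(1.0I, n, n)    # must be positive semidefinite
R = Matrix(0.1I, m, m)    # must be positive definite
H = zeros(m, n)           # coupling term in u'Hx, so H is m-by-n
q = zeros(n); r = zeros(m); c = 0.0
qcost = QuadraticCost(Q, R, H, q, r, c)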
TrajectoryOptimization.LQRCost — Function
LQRCost(Q, R, xf, [uf; kwargs...])
Convenience constructor for a QuadraticCostFunction of the form:
\[\frac{1}{2} (x-x_f)^T Q (x-x_f) + \frac{1}{2} (u-u_f)^T R (u-u_f)\]
If $Q$ and $R$ are diagonal, the output will be a DiagonalCost; otherwise it will be a QuadraticCost.
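For example, a sketch of penalizing deviation from a goal state xf (dimensions and weights illustrative):
using LinearAlgebra, StaticArrays
using TrajectoryOptimization
n, m = 4, 2
Q  = Diagonal(@SVector ones(n))
R  = Diagonal(@SVector fill(0.1, m))
xf = @SVector [1.0, 2.0, 0.0, 0.0]   # goal state
lqrcost = LQRCost(Q, R, xf)          # Q and R are diagonal, so this is a DiagonalCost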
TrajectoryOptimization.is_diag — Function
is_diag(::QuadraticCostFunction)
Determines if the Hessian of a quadratic cost function is strictly diagonal.
TrajectoryOptimization.is_blockdiag — Function
is_blockdiag(::QuadraticCostFunction)
Determines if the Hessian of a quadratic cost function is block diagonal (i.e. $\|H\| = 0$).
TrajectoryOptimization.invert! — Function
invert!(Ginv, cost::QuadraticCostFunction)
Invert the Hessian of the cost function, storing the result in Ginv. Performs the inversion efficiently, depending on the structure of the Hessian (diagonal or block diagonal).
Adding Cost Functions
Right now, TrajectoryOptimization supports addition of QuadraticCosts, but extensions to general cost function addition should be straightforward, as long as the cost functions all have the same state and control dimensions.
Adding quadratic cost functions:
using LinearAlgebra, StaticArrays
using TrajectoryOptimization
n,m = 5,6   # dimensions matching the diagonal vectors below
Q1 = Diagonal(@SVector [1.0, 1.0, 1.0, 1.0, 0.0])
R1 = Diagonal(@SVector [1.0, 0.0, 0.0, 0.0, 0.0, 0.0])
Q2 = Diagonal(@SVector [1.0, 1.0, 1.0, 1.0, 2.0])
R2 = Diagonal(@SVector [0.0, 1.0, 1.0, 1.0, 1.0, 1.0])
cost1 = QuadraticCost(Q1, R1, checks=false)   # R1 is only positive semidefinite,
cost2 = QuadraticCost(Q2, R2, checks=false)   # so skip the definiteness checks
cost3 = cost1 + cost2
# cost3 is equivalent to QuadraticCost(Q1+Q2, R1+R2)
Objectives
TrajectoryOptimization.Objective — Type
struct Objective{C} <: TrajectoryOptimization.AbstractObjective
Objective: stores stage cost(s) and terminal cost functions.
Constructors:
Objective(cost, N)
Objective(cost, cost_term, N)
Objective(costs::Vector{<:CostFunction}, cost_term)
Objective(costs::Vector{<:CostFunction})
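For instance, a sketch of building an objective from a stage cost and a terminal cost, assuming the illustrative qcost, Q, and R from the QuadraticCost sketch above:
N = 101                                        # number of knot points
qcost_term = QuadraticCost(10Q, R; terminal=true)
obj = Objective(qcost, qcost_term, N)          # stage cost at knot points 1:N-1, terminal cost at N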
TrajectoryOptimization.LQRObjective — Function
LQRObjective(Q, R, Qf, xf, N)
Create an objective of the form
\[(x_N - x_f)^T Q_f (x_N - x_f) + \sum_{k=0}^{N-1} \left[ (x_k-x_f)^T Q (x_k-x_f) + u_k^T R u_k \right]\]
where eltype(obj) <: DiagonalCost if Q, R, and Qf are of type Union{Diagonal{<:Any,<:StaticVector}, StaticVector}.
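A sketch of the corresponding usage, with illustrative dimensions and weights:
using LinearAlgebra, StaticArrays
using TrajectoryOptimization
n, m, N = 4, 2, 101
Q  = Diagonal(@SVector ones(n))
R  = Diagonal(@SVector fill(0.1, m))
Qf = Diagonal(@SVector fill(100.0, n))
xf = @SVector zeros(n)
lqr_obj = LQRObjective(Q, R, Qf, xf, N)   # eltype(lqr_obj) <: DiagonalCost here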
TrajectoryOptimization.get_J — Function
Get the vector of costs at each knot point. sum(get_J(obj)) is equal to the total cost.
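For example, continuing the lqr_obj sketch above:
J = get_J(lqr_obj)   # vector of length N holding the cost at each knot point
sum(J)               # total cost from the most recent evaluation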
TrajectoryOptimization.dgrad — Function
dgrad(E::QuadraticExpansion, dZ::Traj)
Calculate the derivative of the cost in the direction of dZ, where E is the current quadratic expansion of the cost.
TrajectoryOptimization.dhess — Function
dhess(E::QuadraticExpansion, dZ::Traj)
Calculate the scalar $\frac{1}{2} dZ^T G \, dZ$, where $G$ is the Hessian of the cost.
TrajectoryOptimization.norm_grad — Function
norm_grad(E::QuadraticExpansion, p=2)
Compute the p-norm of the cost gradient.
Evaluating the Cost
TrajectoryOptimization.cost — Function
cost(obj::Objective, Z::Traj)
cost(obj::Objective, dyn_con::DynamicsConstraint{Q}, Z::Traj)
Evaluate the cost for a trajectory. If a dynamics constraint is given, use the appropriate integration rule, if defined.
cost(::Problem)
Compute the cost for the current trajectory.
TrajectoryOptimization.stage_cost — Function
stage_cost(costfun::CostFunction, x, u)
stage_cost(costfun::CostFunction, x)
Calculate the scalar cost using costfun, given state x and control u. If only the state is provided, it is assumed to be a terminal cost.
stage_cost(cost::CostFunction, z::AbstractKnotPoint)
Evaluate the cost at a knot point, automatically handling the terminal knot point and multiplying by dt as necessary.
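A short sketch of both call forms, reusing the illustrative dcost and its dimensions from above:
x = @SVector zeros(4)
u = @SVector zeros(2)
stage_cost(dcost, x, u)   # stage cost at (x, u)
stage_cost(dcost, x)      # state only, interpreted as a terminal cost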
TrajectoryOptimization.gradient! — Function
gradient!(E::QuadraticCostFunction, costfun::CostFunction, z::AbstractKnotPoint, [cache])
Evaluate the gradient of the cost function costfun at state x and control u, storing the result in E.q and E.r. Return true if the gradient is constant, and false otherwise.
If is_terminal(z) is true, it will only calculate the gradient with respect to the terminal state.
The optional cache argument provides a way to pass in extra memory to facilitate computation of the cost expansion. It is a vector of length 4 with the entries [grad, hess, grad_term, hess_term], where grad and hess are the caches for gradients and Hessians, respectively, and the []_term entries are the caches for the terminal cost function.
TrajectoryOptimization.hessian! — Function
hessian!(E, costfun::CostFunction, z::AbstractKnotPoint, [cache])
Evaluate the Hessian of the cost function costfun at knot point z, storing the result in E.Q, E.R, and E.H. Return true if the Hessian is constant, and false otherwise.
If is_terminal(z) is true, it will only calculate the Hessian with respect to the terminal state.
The optional cache argument provides a way to pass in extra memory to facilitate computation of the cost expansion. It is a vector of length 4 with the entries [grad, hess, grad_term, hess_term], where grad and hess are the caches for gradients and Hessians, respectively, and the []_term entries are the caches for the terminal cost function.
TrajectoryOptimization.cost_gradient! — Function
cost_gradient!(E::Objective, obj::Objective, Z, init)
Evaluate the cost gradient along the entire trajectory Z, storing the result in E.
If init == true, all gradients will be evaluated, even if they are constant.
TrajectoryOptimization.cost_hessian! — Function
cost_hessian!(E::Objective, obj::Objective, Z, init)
Evaluate the cost Hessian along the entire trajectory Z, storing the result in E.
If init == true, all Hessians will be evaluated, even if they are constant. If false, only the non-constant Hessians will be evaluated.
TrajectoryOptimization.cost_expansion! — Function
cost_expansion!(E::Objective, obj::Objective, Z, [init, rezero])
Evaluate the 2nd-order Taylor expansion of the objective obj along the trajectory Z, storing the result in E.
If init == false, the expansions will only be evaluated if they are not constant.
If rezero == true, all expansions will be zeroed out before evaluating the expansion.