Matches in DBpedia 2015-10 for { ?s ?p "The Hamilton–Jacobi–Bellman (HJB) equation is a partial differential equation which is central to optimal control theory. The solution of the HJB equation is the 'value function' which gives the minimum cost for a given dynamical system with an associated cost function.When solved locally, the HJB is a necessary condition, but when solved over the whole of state space, the HJB equation is a necessary and sufficient condition for an optimum."@en }
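The match pattern above can be written out as a full SPARQL query. This is a sketch, not the exact query the browser issued: the endpoint (`http://dbpedia.org/sparql`) and the `LIMIT` are assumptions, and the literal must match the stored value byte-for-byte, including the `@en` language tag.

```sparql
# Sketch: find every subject/predicate pair whose object is this exact literal.
# Endpoint and LIMIT are assumptions; the literal (including its missing space
# after "cost function.") must match the stored value exactly.
SELECT ?s ?p
WHERE {
  ?s ?p "The Hamilton–Jacobi–Bellman (HJB) equation is a partial differential equation which is central to optimal control theory. The solution of the HJB equation is the 'value function' which gives the minimum cost for a given dynamical system with an associated cost function.When solved locally, the HJB is a necessary condition, but when solved over the whole of state space, the HJB equation is a necessary and sufficient condition for an optimum."@en
}
LIMIT 100
```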
Showing triples 1 to 1 of 1, with 100 triples per page.
- Hamilton–Jacobi–Bellman_equation comment "The Hamilton–Jacobi–Bellman (HJB) equation is a partial differential equation which is central to optimal control theory. The solution of the HJB equation is the 'value function' which gives the minimum cost for a given dynamical system with an associated cost function.When solved locally, the HJB is a necessary condition, but when solved over the whole of state space, the HJB equation is a necessary and sufficient condition for an optimum.".
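For context, the comment returned above describes the HJB equation in words. In a standard continuous-time formulation (notation assumed here, not taken from the page: dynamics $\dot{x} = f(x,u)$, running cost $C(x,u)$, terminal cost $D(x)$, value function $V(x,t)$), the equation reads:

```latex
% Standard HJB equation for a minimum-cost problem on [0, T]:
% dynamics \dot{x} = f(x,u), running cost C(x,u), terminal cost D(x),
% value function V(x,t).
\frac{\partial V}{\partial t}(x,t)
  + \min_{u}\Bigl\{ \nabla_x V(x,t) \cdot f(x,u) + C(x,u) \Bigr\} = 0,
\qquad V(x,T) = D(x).
```

Solving this PDE over the whole state space yields the value function mentioned in the comment; solving it only along a single trajectory gives just a necessary condition for optimality.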