Chapter 5 Robust Performance Tailoring with Tuning - SSL - MIT
is considered.<br />
<strong>4.2.1 Hardware-only Tuning</strong><br />
One approach to the problem is to perform a tuning optimization and replace performance<br />
predictions from the model with hardware data. In effect, a real-time<br />
optimization is conducted using the hardware and test data. At each iteration of<br />
a gradient-based optimization algorithm, the cost function f(x_k) is evaluated, the<br />
gradient of the objective is calculated and a new search direction and step-size are ob-<br />
tained. The use of a model in this process allows for quick computation and analytical<br />
gradient calculations.<br />
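The iteration described above — evaluate the cost f(x_k), compute the gradient, and take a step along the search direction — can be sketched as follows. This is a minimal illustration using a fixed step size and an analytical gradient, as a model-based tuning loop permits; the function names and the quadratic test cost are illustrative, not taken from the text.<br />

```python
import numpy as np

def tune(f, grad_f, x0, step=0.1, tol=1e-6, max_iter=100):
    """Basic gradient-descent loop: at each iterate x_k, evaluate the
    cost f(x_k), compute its gradient, and step along the descent
    direction. With a model, grad_f is analytical and cheap to call."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)                # analytical gradient from the model
        if np.linalg.norm(g) < tol:  # gradient small -> converged
            break
        x = x - step * g             # fixed step along -gradient
    return x

# Illustrative quadratic cost with minimum at (1, -2)
f = lambda x: (x[0] - 1)**2 + (x[1] + 2)**2
grad = lambda x: np.array([2*(x[0] - 1), 2*(x[1] + 2)])
x_star = tune(f, grad, [0.0, 0.0])
```

In practice the step size would come from a line search rather than being fixed, but the structure — one cost evaluation and one gradient per iteration — is what makes replacing the model with hardware tests so expensive.<br />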
Replacing the model simulation with actual test data is both computationally<br />
expensive and labor intensive as each function call requires an actual hardware test.<br />
In addition, analytical gradients are no longer available and finite-difference approximations<br />
must be used instead. There are two methods to compute finite-difference<br />
gradients, the forward-difference and central-difference equations:<br />
∂f(x)/∂x_i = [f(x + Δx e_i) − f(x)] / Δx    (4.3)<br />
∂f(x)/∂x_i = [f(x + Δx e_i) − f(x − Δx e_i)] / (2Δx)    (4.4)<br />
where i denotes an element of x, e_i is a unit vector with a 1 in the i-th location,<br />
and ∆x is a small change in the design parameter. The central difference equation<br />
(Equation 4.4) is more accurate, but requires an additional function evaluation at<br />
each step. These approximations are both sensitive to the size of ∆x, and large<br />
parameter changes may be outside the range of linear approximation. The need to<br />
use finite-difference gradient approximations adds to the time and cost burden of<br />
real-time tuning optimizations.<br />
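The forward- and central-difference formulas of Equations 4.3 and 4.4 can be written as a short NumPy sketch; the function names are illustrative. Note the evaluation counts, since in a hardware optimization each call to f is a physical test: a forward-difference gradient needs n + 1 evaluations of f, the central-difference gradient needs 2n.<br />

```python
import numpy as np

def forward_diff_grad(f, x, dx=1e-6):
    """Forward-difference gradient (Eq. 4.3): n extra evaluations,
    truncation error of order dx."""
    x = np.asarray(x, dtype=float)
    f0 = f(x)                        # one baseline evaluation, reused
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = 1.0                   # unit vector with 1 in the i-th slot
        g[i] = (f(x + dx * e) - f0) / dx
    return g

def central_diff_grad(f, x, dx=1e-6):
    """Central-difference gradient (Eq. 4.4): 2n evaluations,
    more accurate (truncation error of order dx**2)."""
    x = np.asarray(x, dtype=float)
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = 1.0
        g[i] = (f(x + dx * e) - f(x - dx * e)) / (2 * dx)
    return g

# Illustrative cost whose exact gradient at (2, 1) is [4, 3]
f = lambda x: x[0]**2 + 3*x[1]
g_fwd = forward_diff_grad(f, [2.0, 1.0])
g_cen = central_diff_grad(f, [2.0, 1.0])
```

Both approximations degrade if dx is too large (outside the range of linear approximation) or too small (round-off, or below the repeatability of the hardware measurement), which is the sensitivity the text describes.<br />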
A second consideration in hardware optimization is that it is not always possible<br />
to evaluate iterates that lie beyond the constraint boundaries. In the example of mass<br />
tuning considered in this chapter there are two constraints: a total mass constraint<br />