PDF (double-sided) - Physics Department, UCSB - University of ...
ages are based on this method, including Matlab's "fminsearch" function. The algorithm also handles noise reasonably well.

The major drawbacks of the simplex method are that its performance depends strongly on the choice of the initial simplex, which is not always easy to get right, and that it gets stuck fairly easily in local minima. Furthermore, the algorithm interleaves function evaluations with decision steps based on the results of those evaluations. This makes it impossible to pipeline the evaluations, except in the rare cases when all vertices need to be reevaluated. The result is a low and somewhat random experimental duty cycle and thus potential thermal drifts.

11.3.4 Particle Swarm Optimization

Relatively recently, an attempt to model social decision-making behavior led to the invention of a new class of Direct Search algorithms called "Particle Swarm Optimization" [Eberhart and Kennedy, 1995]. These algorithms are based on randomly placing "particles" throughout the parameter space. Each particle probes the fitness of the function at its position and keeps track of the position of the best fitness it has ever observed. The particles' positions are updated sequentially by simulating their motion through parameter space under forces that accelerate them randomly towards the current globally known best position and back towards the position of their personal best fitness. The particles are assigned a uniform inertial mass to favor exploration, and the space is given a viscosity to eventually damp the motion and cause convergence.

Particle Swarm Optimization is considered a class of algorithms, as there are many possible ways in which the algorithm can be implemented. Choices include whether the particles' knowledge of the global best position is limited to information from only a few neighboring particles, whether particles are added to or removed from the swarm dynamically, whether other points of attraction or repulsion are kept, and the exact values of the particles' mass and the space's viscosity. This flexibility makes it possible to customize the algorithm to yield optimal behavior for the circumstances of the given problem.

Particle Swarm Optimization has several major advantages. It is very robust against noisy data and against getting stuck in local minima (if the number of particles, their mass, and the viscosity are chosen well). It can be modified to allow pipelining with a 100% function-evaluation duty cycle. The modification requires a slight relaxation of the definition of the global best fitness, in that it must exclude the new values for points that are currently being evaluated. This restriction does not noticeably reduce the performance of the algorithm, though. Last, but not least, it is extremely easy to implement and has very little overhead. One disadvantage is the number of function evaluations that the algorithm requires to converge.
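The update rule described above can be sketched in a few lines of code. The following is a minimal illustration, not the implementation used in the experiment: it uses the simplest global-best topology with sequential updates, folds the inertial mass and viscosity into a single damping factor ("inertia" weight), and minimizes a trivial test function (the 2-D sphere function) chosen purely for illustration. All parameter values shown are conventional defaults, not values from this work.

```python
import random

def particle_swarm_minimize(f, bounds, n_particles=20, n_steps=200,
                            inertia=0.7, c_personal=1.5, c_global=1.5,
                            seed=0):
    """Minimal global-best PSO: minimize f over the box given by bounds."""
    rng = random.Random(seed)
    dim = len(bounds)
    # Random initial placement of the particles; velocities start at zero.
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]            # each particle's best position
    pbest_val = [f(p) for p in pos]        # ...and its best fitness so far
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]

    for _ in range(n_steps):
        for i in range(n_particles):       # sequential position updates
            for d in range(dim):
                # Random accelerations towards the personal and global best;
                # the inertia factor < 1 plays the role of the viscosity,
                # damping the motion so the swarm eventually converges.
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (inertia * vel[i][d]
                             + c_personal * r1 * (pbest[i][d] - pos[i][d])
                             + c_global * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Illustration: minimize the 2-D sphere function, whose minimum is 0 at (0, 0).
best, best_val = particle_swarm_minimize(lambda p: p[0]**2 + p[1]**2,
                                         [(-5.0, 5.0), (-5.0, 5.0)])
```

Note how little bookkeeping the algorithm needs, which is the "very little overhead" advantage mentioned above; the pipelined variant would additionally hold back in-flight evaluations from the global-best update.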