Question
James Wilkinson’s discovery that the results of this task are highly sensitive to small perturbations of a “perfidious” input was the “most traumatic experience [of his] career.” Polishing the results of this task when using forward or backward deflation minimizes the impact of accumulating errors. The basins of convergence for an algorithm for this task form a fractal, as shown by interpreting the algorithm as a meromorphic function and looking at its Julia set. A superlinear algorithm for this task has an order of convergence equal to the golden ratio. Ridders’s method for this task makes the false position method more robust. Bracketing methods for this task rely on the intermediate value theorem. An algorithm for this task subtracts the input function divided by its derivative at every iteration. For 10 points, the Newton–Raphson method performs what task of determining where a function crosses the x-axis? ■END■
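The clue about subtracting the input function divided by its derivative describes the Newton–Raphson iteration named in the giveaway. Below is a minimal Python sketch of that iteration; the function names, starting point, and tolerance are illustrative choices, not taken from the question itself.

```python
# Minimal sketch of the Newton-Raphson root-finding iteration:
# repeatedly update x <- x - f(x) / f'(x) until f(x) is near zero.
# (Illustrative only; tolerances and names are assumptions.)

def newton_raphson(f, f_prime, x0, tol=1e-12, max_iter=100):
    """Return an approximate root of f near x0."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:  # close enough to a zero crossing
            return x
        x -= fx / f_prime(x)  # the Newton-Raphson update step
    return x

# Example: the positive root of x^2 - 2, i.e. sqrt(2).
root = newton_raphson(lambda x: x * x - 2, lambda x: 2 * x, x0=1.0)
print(root)  # ~1.4142135623730951
```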
Buzzes
| Player | Team | Opponent | Buzz Position | Value |
|---|---|---|---|---|
| Matthew Siff | Yale B | WUSTL B | 121 | -5 |
| June Yin | WUSTL B | Yale B | 144 | 10 |