We learned, so long ago we've probably forgotten when, about the
Taylor Series.
\begin{equation}
f(x) = f(x_0) + (x-x_0) f^{'}(x_0) + (1/2)(x-x_0)^2 f^{''}(x_0) + \dots
\end{equation}
If the derivatives are large, this might not converge very quickly (if at all).
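As a quick concrete check (a small Python sketch of my own, nothing official), here are the partial sums for $f(x) = e^x$ about $x_0 = 1$, evaluated at $x = 1.5$; every derivative of $e^x$ at $x_0$ is just $e^{x_0}$.
```python
import math

# Partial sums of the Taylor series of exp() about x0 = 1, evaluated at x = 1.5.
# Every derivative of exp at x0 is just exp(x0).
x0, x = 1.0, 1.5
true = math.exp(x)
partial = 0.0
for n in range(6):
    partial += (x - x0) ** n / math.factorial(n) * math.exp(x0)
    print(n, partial, true - partial)
```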
If the function $f$ is reasonably well behaved, and positive, we can try looking at products instead. Never mind the complex logarithms for now.
\begin{eqnarray}
f(x) = e^{\log(f(x))} \\
g(x) \equiv \log(f(x)) \\
g(x) = g(x_0) + (x-x_0)g^{'}(x_0) + (1/2)(x-x_0)^2 g^{''}(x_0) + \dots \\
f(x) = f(x_0) e^{(x-x_0)g^{'}(x_0)} e^{(1/2)(x-x_0)^2 g^{''}(x_0)} \dots
\end{eqnarray}
Will this converge any faster? As a distance, use the difference between the true value and the approximation so far, divided by the true value (a signed relative error).
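In code, the whole construction is just a running product of exponential factors. Here is a minimal Python sketch (the helper names `taylor_product` and `error_frac` are made up for this post, not anything standard):
```python
import math

def taylor_product(f_x0, g_derivs, x, x0):
    """f(x0) * product over n of exp((x - x0)**n / n! * g^(n)(x0)),
    where g = log(f) and g_derivs = [g'(x0), g''(x0), ...]."""
    approx = f_x0
    for n, d in enumerate(g_derivs, start=1):
        approx *= math.exp((x - x0) ** n / math.factorial(n) * d)
    return approx

def error_frac(true_value, approx):
    """The 'distance' above: (true - approx) / true."""
    return (true_value - approx) / true_value
```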
Pick a couple of simple examples: $f(x)= e^{x}$ and $f(x) = x^2$.
The first one converges much faster with a "Taylor Product"; in fact, a single factor makes it exact:
\begin{eqnarray}
f(x) = e^x \\
g(x) = x \\
g(x) = x_0 + (x-x_0)\cdot 1 + 0 + 0 + \dots \\
f(x) = f(x_0) e^{(x-x_0)} \times 1 \times 1 \dots
\end{eqnarray}
The "distances" for the approximations are
\begin{eqnarray}
(f(x)-f(x_0))/f(x) = 1 - e^{-(x-x_0)} \\
(f(x)-f(x_0) e^{(x-x_0)})/f(x) = 0 \\
0 \\
\dots
\end{eqnarray}
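A quick numeric check of that (again just a sketch): the single factor already matches $e^x$ to machine precision.
```python
import math

# f(x) = exp(x): only g'(x0) = 1 is nonzero, so one factor is already exact.
x0, x = 1.0, 1.5
approx = math.exp(x0) * math.exp(x - x0)
print((math.exp(x) - approx) / math.exp(x))   # 0, up to rounding
```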
The second function example is, of course, much easier to approximate with a Taylor Series; you only need three terms for it to be exact.
\begin{equation}
f(x) = x_0^2 + (x-x_0)\times 2x_0 + (1/2) (x-x_0)^2 \times 2 = x^2
\end{equation}
But never mind that; let's use the "Taylor Product" anyway.
Here $g(x) = 2\log(x)$. If we let $x = x_0 + 1/2$ and $x_0 = 1$, the successive approximations look like this (the "scale term" in the table is the factor $e^{(x-x_0)^n g^{(n)}(x_0)/n!}$ contributed at each order):
order | $g^{(n)}(x)$ | $g^{(n)}$ at $x_0=1$ | scale term | f cumulative error frac |
0 | $2\log(x)$ | 0 | 1 | .555 |
1 | $2/x$ | 2 | 2.718 | -.208 |
2 | $-2/x^2$ | -2 | .7788 | .059 |
3 | $4/x^3$ | 4 | 1.0869 | -.023 |
4 | $-12/x^4$ | -12 | .9692 | .009 |
... | ... | ... | ... | ... |
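For the curious, here is a short Python sketch that should reproduce the table above, using the standard derivative formula $\frac{d^n}{dx^n}\, 2\log(x) = 2(-1)^{n-1}(n-1)!/x^n$ for $n \ge 1$:
```python
import math

# f(x) = x^2, g(x) = 2*log(x), x0 = 1, x = x0 + 1/2
x0, x = 1.0, 1.5
true = x ** 2
approx = x0 ** 2                                    # order-0 factor is f(x0)
print(0, 2 * math.log(x0), approx, (true - approx) / true)
for n in range(1, 5):
    g_n = 2 * (-1) ** (n - 1) * math.factorial(n - 1) / x0 ** n   # g^(n)(x0)
    scale = math.exp((x - x0) ** n / math.factorial(n) * g_n)     # scale term
    approx *= scale
    print(n, g_n, scale, (true - approx) / true)
```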
Suppose instead that $x=x_0 + 1/2$ but $x_0 = 1000$. It probably won't surprise you to see that it converges faster, using the given distance measure.
order | $g^{(n)}(x)$ | $g^{(n)}$ at $x_0=1000$ | scale term | f cumulative error frac |
0 | $2\log(x)$ | 13.8 | 1000000 | .001 |
1 | $2/x$ | .002 | 1.0010005 | $-2.5\times 10^{-7}$ |
2 | $-2/x^2$ | -.000002 | .999999 | $8.3\times 10^{-11}$ |
3 | $4/x^3$ | $4\times 10^{-9}$ | 1.00000 | $-3.1\times 10^{-14}$ |
... | ... | ... | ... | ... |
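Changing $x_0$ to 1000 (and $x$ to 1000.5) in the sketch above should reproduce these numbers. The exponents in the scale terms work out to $\pm 2/(n(2x_0)^n)$, so each extra order shrinks them by roughly a factor of $2x_0$: about 2000 here, versus 2 in the previous table.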
I don't know what this is actually called, and search engines turned up reams of irrelevancies.
On a related note, MathJax in Blogger doesn't understand
tabular mode.