Error accumulation / propagation



On 2013-10-08, Rupert Swarbrick <rswarbrick at gmail.com> wrote:

> I'm a bit confused about this bit. This is sine/cosine rather than their
> inverses. So at the edge of their output ranges, they have zero
> derivative. Since the trig functions are analytic with well behaved
> coefficients, I'm interested to know what can go weird?

Well, linear combinations are by far the easiest to handle, since they
amount to just rescaling and shifting. But anything other than a linear
combination changes the shape of the density too. E.g., suppose foo has
a Gaussian density (the easiest case to deal with); then the density of
sin(foo) is some asymmetrical shape with steep peaks at 1 and -1, where
the sine has zero derivative, and perhaps another peak somewhere in
between, depending on the mean and s.d. of foo. sin(foo) is
approximately Gaussian only when the mean of foo is several s.d. away
from the peaks and troughs of the sine function -- otherwise it's
"interesting".
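
If you want to see that shape, a quick Monte Carlo sketch is enough
(here assuming Python with numpy and matplotlib at hand; the mean and
s.d. below are just made-up values):

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)

# X is Gaussian; mu and sigma are arbitrary, chosen so that the likely
# range of X covers the sine's peak at pi/2 and reaches its trough at
# -pi/2.
mu, sigma = 1.0, 1.5
x = rng.normal(mu, sigma, size=200_000)
y = np.sin(x)

# Histogram of sin(X): note the pile-up near +1 (and a smaller one
# near -1), where the sine's derivative vanishes, and the asymmetry
# everywhere else.
plt.hist(y, bins=200, density=True)
plt.xlabel("sin(X)")
plt.ylabel("estimated density")
plt.show()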

> I agree with this and I'd be interested to read what people have done on
> the subject. Presumably cleverer than "Input noise with some
> distribution. Record output distribution" ! Where should I be looking
> for more information?

Well, in some simple cases it is possible to find the density exactly
via a change-of-variables method; e.g., see Papoulis, "Probability,
Random Variables, and Stochastic Processes", somewhere in the first
few chapters. But the resulting density might be hard to work with --
e.g., can you find its convolution with another density (to get the
density of a sum of variables)? Also, variables which are not
independent make the problem more complicated still.
If you're interested in this topic, feel free to drop me a line.
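
To make the change-of-variables idea concrete, here is a rough sketch
(again assuming Python, with numpy and scipy). The exp transformation
and the normal parameters are just made-up examples; for a non-monotone
function like sine you would have to sum the same expression over the
branches of the inverse.

import numpy as np
from scipy import stats
from scipy.integrate import quad

# Change of variables for a monotone transformation Y = g(X):
#   f_Y(y) = f_X(g_inverse(y)) * |d/dy g_inverse(y)|
# As an illustration take g = exp, so g_inverse = log and the Jacobian
# factor is 1/y; the Normal(0, 0.5) choice for X is arbitrary.
mu, sigma = 0.0, 0.5
f_X = stats.norm(mu, sigma).pdf

def f_Y(y):
    # exact density of Y = exp(X), valid for y > 0
    return f_X(np.log(y)) / y

# Sanity check: compare P(1 <= Y <= 2) from the exact density against
# a Monte Carlo estimate.
exact, _ = quad(f_Y, 1.0, 2.0)

rng = np.random.default_rng(0)
samples = np.exp(rng.normal(mu, sigma, size=200_000))
mc = np.mean((samples >= 1.0) & (samples <= 2.0))

print(exact, mc)   # these should agree to a couple of decimal places

For the density of a sum of independent variables, the analogous step
is a convolution of the two densities, which you can at least carry
out numerically on a grid when there is no closed form.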

best

Robert Dodier