
Re: back to the roots



I wrote "it *can* be a huge mistake". In my experience, it almost always is, because few people take the time and effort to perform sensitivity analyses (for example, using Monte Carlo simulation) to see how sensitive computed answers are to input uncertainties. Our big advantage is supposed to be that we can get rigorous bounds on the sensitivity of computations to input uncertainties, and do so more efficiently.
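The kind of sensitivity analysis mentioned above can be sketched as follows (a minimal Python illustration, not part of the original exchange; the model `f`, its uncertainty intervals, and the helper `monte_carlo_range` are made up for the example):

```python
import random

def f(x, y):
    # Hypothetical model combining two uncertain inputs.
    return x * x - 3.0 * y

def monte_carlo_range(func, intervals, n=10000, seed=42):
    """Estimate the spread of func's output by sampling each input
    uniformly within its uncertainty interval and tracking the
    smallest and largest values observed."""
    rng = random.Random(seed)
    lo, hi = float("inf"), float("-inf")
    for _ in range(n):
        args = [rng.uniform(a, b) for (a, b) in intervals]
        v = func(*args)
        lo, hi = min(lo, v), max(hi, v)
    return lo, hi

# Inputs known only to limited accuracy: 2.0 +/- 0.05 and 1.0 +/- 0.05.
intervals = [(1.95, 2.05), (0.95, 1.05)]
lo, hi = monte_carlo_range(f, intervals)
```

Note that the sampled spread can only *underestimate* the true output range, which is the point being made: a rigorous interval evaluation encloses that range from outside, with a guarantee, and without thousands of re-evaluations.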

On 7/1/13 5:24 AM, Vincent Lefevre wrote:
> On 2013-06-29 07:16:51 -0700, G. William (Bill) Walster wrote:
> > I agree that in situations, such as your example, when you are evaluating a
> > monotonic function over an interval and therefore can use endpoint
> > evaluation, extra precision can be useful.  However, surely, this is not the
> > general case.  It can be a huge mistake to use input numbers that are only
> > good to 4 or 5 decimal digits of accuracy and then interpret them as
> > infinitely precise, as Ulrich himself did in his note about the steam
> > turbine that blew up and killed 6 people.
> It isn't necessarily a mistake to regard such inputs as exact, as
> long as at the end, one remembers that they weren't. But wanting to
> get exact results from them (such as with the EDP) is a bad idea in
> general, since it takes resources (time and memory) and is not really
> useful.
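The endpoint evaluation referred to in the quoted text can be illustrated with a short Python sketch (not from the original thread; `monotone_range` and the choice of `exp` are hypothetical, and a rigorous interval implementation would additionally round the endpoint results outward):

```python
import math

def monotone_range(func, a, b):
    """For a function monotone on [a, b], its image over the interval
    is determined by the two endpoint values alone, so no extra
    sampling or subdivision of the interval is needed."""
    fa, fb = func(a), func(b)
    # Works for both increasing and decreasing functions.
    return (fa, fb) if fa <= fb else (fb, fa)

# exp is increasing, so its image over [0.0, 1.0] is [1, e].
lo, hi = monotone_range(math.exp, 0.0, 1.0)
```

This is exactly the situation where carrying extra precision through the endpoint evaluations pays off; the disagreement in the thread is about how typical that situation is.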