The relevant issue was Re: Let's not BE NP-hard, shall we...?
> Date: Fri, 05 Aug 2011 10:05:47 -0400
> From: Ralph Baker Kearfott <rbk@xxxxxxxxxxxx>
> To: "Kreinovich, Vladik" <vladik@xxxxxxxx>
> CC: Dan Zuras Intervals <intervals08@xxxxxxxxxxxxxx>,
> "stds-1788@xxxxxxxxxxxxxxxxx" <stds-1788@xxxxxxxxxxxxxxxxx>
> Subject: Re: Let's not BE NP-hard, shall we...?
>
> The question here, however, is the relevance of this to our deliberations.
Indeed.
> In particular, "equivalent transformations," if I understand your idea
> correctly, deal with rearranging expressions according to the usual rules
> for real-valued expressions. On the other hand, isn't the potential
> determinism we are considering here dealing with the result of a single
> unary or binary operation (i.e. one of the four elementary operations,
> a list of designated standard functions, or basic interval set operations
> such as union or intersection)?
No. We can define the basic required functions (& even
the optional ones) to be deterministic just as easily as
754 did.
What everyone is concerned with here is the evaluation
of expressions. And part of the misunderstanding
floating about in the discussion is that everyone is
considering their own class of potential rearrangements
or optimizations or indeterminate parallel behavior.
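A three-line example shows how little it takes for those
classes to matter. (The C fragment below is mine, purely
for illustration.) Every operation in it is correctly
rounded & perfectly deterministic, & yet:

    #include <stdio.h>

    int main(void) {
        double a = 1e16, b = -1e16, c = 1.0;
        /* Re-associating a three-term sum already changes
           the answer: any freedom granted to a compiler or
           a parallel reduction makes the *expression*
           indeterminate even though each *operation* is
           exact to the last bit.                          */
        printf("%g %g\n", (a + b) + c, a + (b + c)); /* 1 0 */
        return 0;
    }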
BTW, part of Prof Demmel's (& others') work on parallel
sums & dot products demonstrates that there are methods
of correctly rounding (& hence deterministically
computing) the results in most cases (for a bounded
number of operands) even in the face of on-the-fly
indeterministic parallelism. The cost is high but only
by constant factors. (Not unlike intervals themselves.
:-) The trick involves partial sorting & knowing how
much extra precision is needed for that number of
operands to ensure a correct result. (Prof Kulisch's
superaccumulator work of many years ago implied
something similar but he did not carry it out in this
direction.)
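To give the flavor of the trick, here is a minimal
one-bin sketch in C. (The name reproducible_sum & the
single-bin simplification are mine, not Prof Demmel's;
his method uses several bins to recover nearly all the
accuracy.) Each operand is pre-rounded to a common
quantum q chosen so that every partial sum of such
multiples of q is exact, hence the result is
bit-identical under any ordering or parallel schedule:

    #include <math.h>

    /* Sum n doubles with a result independent of the
       order of the additions.  Underflow corner cases
       are ignored in this sketch.                     */
    double reproducible_sum(const double *x, int n)
    {
        double m = 0.0;
        for (int i = 0; i < n; i++)    /* max() is order- */
            m = fmax(m, fabs(x[i]));   /* independent     */
        if (n == 0 || m == 0.0) return 0.0;

        int e;
        frexp((double)n * m, &e);      /* n*m < 2^e, so   */
        double q = ldexp(1.0, e - 52); /* n*m/q < 2^52:   */
                                       /* one headroom bit*/
        double s = 0.0;                /* each term is a  */
        for (int i = 0; i < n; i++)    /* multiple of q & */
            s += q * nearbyint(x[i] / q); /* every partial*/
        return s;                      /* sum stays below */
    }                                  /* 2^53*q: exact   */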
I mention this because, although an efficient
deterministic parallel algorithm does not yet exist,
this work suggests one may exist in the very near
future.
Not for optimal enclosures. As Nick & others pointed
out, those are forever NP-hard. No, algorithms may soon
exist for some particular definable, feasible, but not
optimal evaluation.
> Isn't determinism as the result of rearrangement of expressions best left
> to people (or standardization committees) dealing with compilers?
On the one hand, yes, the compiler people will take on
the task of rearranging expressions as they see fit.
On the other hand, speed rather than accuracy or even
determinism is overwhelmingly their goal.
> Given a particular order in which operations are done, the results of the
> operations (and corresponding decorations) can be well-defined.
>
>
> Best regards,
>
> Baker
>
Just so.
And this was the form of deterministic evaluation I was
proposing in my recent failed discussion in this forum.
In particular, I suggested that the obvious order of
operations was the one found in the source code itself,
with no rearrangements or optimizations at all.
I called this "blind evaluation". After having read
Vladik's new paper, I see he calls this feasible method
a "straightforward interval computation".
But the discussion got sidetracked into issues of
indeterminism due to parallelism, unspecified
associations, & the NP-hard nature of optimal enclosures.
None of these things touches on the question of whether
there exists a feasible evaluation that everyone can
understand & agree on. I suggest "straightforward
interval computation" is that standard form.
Now that's not quite true. In languages like C & Fortran
there is a tradition (specification, unspecification,
laziness, whatever) of unneeded latitude in matters of
association & the passing of parameters. Don't get me
started on side effects.
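For instance (a contrived C fragment of my own):

    int g(void), h(void); /* suppose both have side effects */
    int latitude(void) {
        return g() + h(); /* C leaves it unspecified whether
                             g() or h() is called first     */
    }

The arithmetic is fully specified; the order in which
the operands are evaluated is not.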
We can try to do something about that but we can only go
so far to aid those determined to shoot themselves in the
foot.
Yours,
Dan