Re: requiring hardware is futile
Arnold Neumaier <Arnold.Neumaier@xxxxxxxxxxxx>
wrote on 19/11/2008 09:01:42 AM:
> Bob Davis schrieb:
> >
> > IA, to be successful, will have to be a living implementation based
> > on the Standard and capable of surviving changing technology and
> > changing needs. If you required a specific hardware implementation,
> > it would: 1) become slowly obsolete based on new architectures and
> > processors; and 2) be extremely difficult to get the approval of the
> > larger group of experts that will serve on the sponsor ballot,
> > including, most likely, representatives of the various hardware,
> > silicon, and software implementors as well as other academicians,
> > along with many members of this group.
>
> My next version of the proposal (to come end of this week or early
> next week) will contain (only) the following about hardware support:
>
> Suggestions for hardware implementation:
> A proposed minimal useful set of hardware operations, which would
> significantly speed up existing applications in constraint programming,
> global optimization, and computational geometry, is the following set
> of operations defining core interval arithmetic.
> - forward and reverse interval operations for
> plus, minus, times, divide, sqr
> - mixed operations with one interval argument for
> plus, minus, times, divide
> - forward sqrt, exp, log
> - linearInt, linearExt, shiftedDivision
>   [new proposed operations essential for computational geometry]
> - mid, rad
> - division with gaps
> - optimal enclosure of inner product
> These operations might be singled out to represent level 1 of
> the standard; the remaining would represent level 2.
> An appendix might suggest explicit model algorithms for these
> operations.
>
> Could you please comment on this?
>
>
> Arnold Neumaier
A standard development team should start
with the requirements (typically varied and conflicting), imagine ideal
solutions, then creatively compromise to something practical. Splitting
the standard into core and one or more advanced parts can be useful.
Aside from the standard, useful products
are a Rationale (which should include requirements both addressed and sacrificed),
usage suggestions/examples, and implementation suggestions. The standard
should require that a processor accomplish the required result, not that
it use a particular design, not that it must be CISC or RISC, not that
it must use hardware instead of software or a human with pen and paper.
Alternatives and optimizations that do not change the result should
of course be allowed. Those that slightly change results (whether
for the better or for the worse) can be allowed under user control and
do not need to be part of the standard.
Interval Arithmetic is clearly important
for some applications, but I'm skeptical that mainstream processors will
make major changes for it. The standard will see greater acceptance
if it is reasonably implementable with no changes to existing CPUs, languages
and compilers, and better implementable with just minor changes. I
think that's doable.
For example, an Interval Add is conceptually
one floating-point add rounded down and one rounded up, plus adjustments.
How much of that can share hardware with a complex add or with two
independent floating-point adds must be an implementation decision, but the
standard will affect the practicality. Fundamental decisions in the
drafts require the adjustments after (and on some systems before
and between) the adds. With existing systems the adjustments could increase
the cost by up to an order of magnitude, so which adjustments are really
needed, and which are not, deserves some thought.
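For concreteness, here is one way such an adjusted add might be sketched in software when the FPU rounding mode is not accessible. The function name and the one-ulp outward widening via math.nextafter are my own illustration (a common conservative fallback), not anything from the drafts:

```python
import math

def interval_add(xlo, xhi, ylo, yhi):
    """Add intervals [xlo, xhi] + [ylo, yhi].

    Without directed rounding, the round-to-nearest sums are widened
    outward by one ulp each (the "adjustment"), which guarantees
    enclosure at the cost of a slightly wider result.
    """
    lo = math.nextafter(xlo + ylo, -math.inf)  # push lower bound down
    hi = math.nextafter(xhi + yhi, math.inf)   # push upper bound up
    return lo, hi

lo, hi = interval_add(1.0, 2.0, 0.5, 0.75)
# the exact sum [1.5, 2.75] lies strictly inside the returned interval
assert lo < 1.5 and hi > 2.75
```

A hardware or rounding-mode-based implementation would avoid the one-ulp slack, but switching rounding modes on many existing CPUs stalls the pipeline, which is exactly the cost question raised above.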
Another example is that compiler optimizers
already know that for float types the combination X+Y-X is not
the same as Y, but in a fast non-strict mode it can be approximated by Y
(possibly losing an exception or two, possibly giving a different answer
by avoiding precision loss). Similar issues apply to Interval operations
and to the floating-point adds etc. inside them. Do optimizers have to
take care to suppress existing transformations? New opportunities exist
(e.g., in fast non-strict mode, replacing Interval X+Y-X with Y, which
avoids precision loss), but can existing optimizers recognize them?
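The X+Y-X point can be made concrete: in interval arithmetic the uncertainty in X enters the expression twice, so the computed result encloses Y but is strictly wider, and replacing it with Y would both speed things up and tighten the enclosure. The iadd/isub helpers below are naive illustrative versions (exact arithmetic, no outward rounding), not standard operations:

```python
def iadd(x, y):
    """Naive interval add: [a,b] + [c,d] = [a+c, b+d]."""
    return (x[0] + y[0], x[1] + y[1])

def isub(x, y):
    """Naive interval subtract: [a,b] - [c,d] = [a-d, b-c]."""
    return (x[0] - y[1], x[1] - y[0])

X = (0.0, 1.0)
Y = (2.0, 3.0)
Z = isub(iadd(X, Y), X)   # interval form of X + Y - X
# Z encloses Y but is wider: X's width of 1 enters twice
assert Z == (1.0, 4.0)
assert Z[0] <= Y[0] and Y[1] <= Z[1]
```

So for intervals the "non-strict" rewrite is actually a precision improvement, the opposite of the float case, which is why optimizer rules may need rethinking.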
Adding Intervals as built-in types to
multiple existing (or new) languages is expensive and slow, and will delay
availability and acceptance. In extensible languages an obvious alternative
is to define them as classes, leading to questions like whether that's
semantically practical, whether it's syntactically acceptable (I think
that depends on the language), whether and which minor compiler tweaks
would help, and how any Interval-enhanced hardware can be used. It
also emphasizes the importance of some of the issues above. One of
the benefits is that people could be using the new standard years sooner.
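As a sketch of the class-based alternative, here is what a minimal Interval class with operator overloading might look like in a language that supports it. Everything here is my own illustration (the class name, the one-ulp outward widening via math.nextafter standing in for directed rounding); a real library would need far more care:

```python
import math

class Interval:
    """Minimal interval class sketch; outward rounding by one ulp
    stands in for directed rounding, which plain Python lacks."""

    def __init__(self, lo, hi):
        assert lo <= hi
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(math.nextafter(self.lo + other.lo, -math.inf),
                        math.nextafter(self.hi + other.hi, math.inf))

    def __mul__(self, other):
        # sign cases handled the brute-force way: min/max of the
        # four endpoint products, then widened outward
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(math.nextafter(min(p), -math.inf),
                        math.nextafter(max(p), math.inf))

    def __contains__(self, x):
        return self.lo <= x <= self.hi

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

a = Interval(1.0, 2.0)
b = Interval(-1.0, 3.0)
assert 0.0 in (a + b)    # exact sum range is [0.0, 5.0]
assert -2.0 in (a * b)   # exact product range is [-2.0, 6.0]
```

Whether a compiler can see through such a class well enough to use any Interval-enhanced hardware, or to apply the optimizations discussed above, is exactly the kind of question the class route raises.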
I assume we're still at the "imagine
ideal solutions" stage, so some of these issues don't need much thought
yet but will later.
As an IA newbie, I'd appreciate (off
list is ok) pointers to descriptions (preferably online) of typical and
non-typical IA application usage and needs. Some design decisions
are not what I expected and I would like to understand their reasons. Even
with that, I promise to at some point ask enough more ignorant questions
and offer enough more ignorant suggestions to get you all all riled up,
if the above wasn't enough to do that. 8<)
- Ian Toronto
IBM Lab 8200 Warden D2-445 905-413-3411