
Re: Please listen to Ulrich here...



On 2013-08-20 08:43:49 -0700, G. William (Bill) Walster wrote:
> Finally, my concern has implications for any proposed interval standard.  As
> we did at Sun, I believe that default I/O conventions should be that when a
> small number of digits are input to an interval, no more accuracy should be
> assumed than is actually supplied by input digits.  Therefore, 0.100 should
> be interpreted either as [0.099, 0.101] or as [0.0995, 0.1005].  Interval I/O
> should require extra effort to input the number 0.1 as a degenerate
> interval, for example, by requiring [0.1], [0.1, 0.1], or
> 0.1000000000000000000000...
> to be input.

Actually, you don't know. The writer may also have meant [0.095,0.105]
(if the trailing zero is not significant, i.e. 0.1 known only to the
second decimal place), in which case both [0.099,0.101] and
[0.0995,0.1005] would be wrong. For this reason, there should be no
default interpretation for numbers like 0.100.

-- 
Vincent Lefèvre <vincent@xxxxxxxxxx> - Web: <http://www.vinc17.net/>
100% accessible validated (X)HTML - Blog: <http://www.vinc17.net/blog/>
Work: CR INRIA - computer arithmetic / AriC project (LIP, ENS-Lyon)