As someone who has plumbed the depths of bigint optimization across multiple ISAs for elliptic curve cryptography, yes, it is indeed very awkward.

But in the context of your analysis, wouldn’t decoupling arithmetic entirely from machine word size be the purest ADT?

Actually, “continuous time” seems like unfortunate language. The issue is whether time is the *continuum*, i.e., the real line. “Continuous” should be reserved for functions between topological spaces. I could be talking out of the wrong orifice, but it seems to me that the only functions from the reals to a discrete space like {0,1} that can be defined constructively are precisely the constant functions. (But if we took time to be Baire space, which is homeomorphic to the irrationals, there are lots of non-constant continuous functions that can be defined constructively.)

Oh, on the topic of grep: I think that in one respect it is *not* a (total) function on infinite streams. After all, “grep a input” does not have such a value when the argument does not contain infinitely many ‘a’s.
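To make the partiality concrete, here is a small Python sketch using lazy generators (the names `matches`, `alternating`, and `one_a_then_bs` are purely illustrative): demanding a finite prefix succeeds exactly when matches keep appearing.

```python
from itertools import islice

def matches(x, stream):
    """Lazily yield the elements of a (possibly infinite) stream equal to x."""
    return (c for c in stream if c == x)

def alternating():
    """Infinite stream 'a', 'b', 'a', 'b', ..."""
    while True:
        yield 'a'
        yield 'b'

# Productive: infinitely many 'a's, so any finite prefix is computable.
print(list(islice(matches('a', alternating()), 3)))  # ['a', 'a', 'a']

def one_a_then_bs():
    """Only one 'a' ever appears."""
    yield 'a'
    while True:
        yield 'b'

# Not productive: asking for a second match would loop forever,
# so `matches` (like grep) is only a *partial* function on this input.
# list(islice(matches('a', one_a_then_bs()), 2))  # would never terminate
```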

Thanks for raising this issue about sinc. Perhaps the infinite extent is not necessarily a problem. Suppose we want to extract only finite information/precision *in the end* (e.g., 44K samples per second and 32 bits per channel per sample), while (in theory) computing the exactly correct signal in the middle (during composition). What I really mean is that the extracted approximation must agree with the exact value to the precision of the approximation. Now the trick is to implement the composition steps finitely (and better yet, efficiently) *and* correctly.

Although sinc has infinite extent, I’d guess that its influence on the finite information eventually extracted is (provably) finite. So, with some cleverness, an implementation could probably get away with carrying forward only finite information.
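To make that hunch slightly more concrete, here is a rough Python sketch (the `window` cutoff and the test signal are my own assumptions, not a real audio implementation): truncating the sinc sum to a finite window gives an approximation whose error shrinks, if slowly, as the window grows, so only finite information need be carried forward for any target precision.

```python
import math

def sinc(t):
    """Normalized sinc: sin(pi*t)/(pi*t), with sinc(0) = 1."""
    if t == 0.0:
        return 1.0
    return math.sin(math.pi * t) / (math.pi * t)

def reconstruct(sample, t, window):
    """Truncated Shannon reconstruction at time t from integer-indexed
    samples, keeping only samples within `window` of t (assumed cutoff)."""
    center = round(t)
    return sum(sample(n) * sinc(t - n)
               for n in range(center - window, center + window + 1))

# A band-limited test signal sampled at rate 1 (frequency well under Nyquist).
f = 0.1
def sample_x(n):
    return math.cos(2 * math.pi * f * n)

exact = math.cos(2 * math.pi * f * 0.5)   # true value between samples
approx = reconstruct(sample_x, 0.5, 500)  # finite window, finite work
print(abs(approx - exact))                # small; shrinks as the window grows
```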

What do you think?

I’ve been thinking about approximation and denotational design a fair bit lately, exploring two different paths. One is to work with *exact* values and then allow any finite amount of information to be extracted. We play this game as a matter of course in lazy functional programming, e.g., with infinite lists & trees. Additional examples are *images* and *surfaces*, defined over infinite & continuous space, and *behaviors*, defined over infinite & continuous time. Similarly, I recently realized a simple way to perform exact numeric integration.
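A minimal sketch of this first path, assuming a monotone increasing integrand so that left and right Riemann sums give exact rational brackets (the names `integral_bounds` and `extract` are illustrative, not a claim about any particular library): the exact integral plays the role of the infinite value, and we pull finite precision from it on demand.

```python
from fractions import Fraction

def integral_bounds(f, a, b):
    """Infinite stream of exact rational (lower, upper) brackets of the
    integral of a monotone increasing f over [a, b], halving the step
    at each refinement.  The exact integral is the ideal value; any
    finite precision can be extracted from the stream on demand."""
    n = 1
    while True:
        h = (b - a) / n
        lo = sum(f(a + i * h) for i in range(n)) * h        # left sums: lower
        hi = sum(f(a + (i + 1) * h) for i in range(n)) * h  # right sums: upper
        yield lo, hi
        n *= 2

def extract(bounds, eps):
    """Pull refinements until the bracket is narrower than eps."""
    for lo, hi in bounds:
        if hi - lo < eps:
            return lo, hi

def square(x):
    return x * x

lo, hi = extract(integral_bounds(square, Fraction(0), Fraction(1)),
                 Fraction(1, 1000))
# The exact integral of x^2 over [0, 1] is 1/3, and lo <= 1/3 <= hi exactly.
```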

Another path is to compute only *inexactly* and use the ideal denotational semantics in order to quantify precisely the distance between ideal and implementation. With this path, we can still speak precisely about the *accuracy* of our implementations, and about choices that improve or degrade accuracy (no hand-waving needed).
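One toy way to make this second path concrete (purely my own illustration, using Python’s exact rationals as the “ideal” semantics and floats as the implementation): because the ideal meaning is available, the implementation’s inaccuracy becomes an exactly computable quantity rather than a vague worry.

```python
from fractions import Fraction

def ideal_sum(xs):
    """Ideal denotation: exact rational accumulation (no rounding)."""
    total = Fraction(0)
    for x in xs:
        total += Fraction(x)  # Fraction(float) is that float's exact value
    return total

def impl_sum(xs):
    """Implementation: ordinary float accumulation, rounding at each step."""
    total = 0.0
    for x in xs:
        total += x
    return total

xs = [0.1] * 1000
error = abs(Fraction(impl_sum(xs)) - ideal_sum(xs))
# `error` is the exact distance between implementation and ideal meaning:
# nonzero (floats round), but precisely quantifiable -- no hand-waving.
```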

You do indeed deal with approximations in some of your semantic designs, for instance sampling continuous time to make it discrete. I suppose that is one example of what I was looking for.

Still, given a denotational semantics, it might not be obvious how and where to introduce the required approximations to derive an effectively computable implementation.

You advocate a “semantic” or “denotational” way of designing things, which establishes the nature of the problem we are trying to solve. One can also say that it acts as a specification for the problem. But you go further than using the semantics as a specification: you have demonstrated several times how to derive an actual computable implementation from the semantics.

I wonder if you could elaborate on the case where deriving an implementation from the semantics inherently requires some form of approximation. It seems that in all your examples you haven’t had to deal with that question. But I have a background in program analysis. In that field it is also a good idea to establish the underlying semantic property one wishes to prove about programs before embarking on designing the actual analysis. But one can never hope to derive a computable analysis directly from the semantic property; there is an inherent need to introduce some form of approximation.

One answer could be Abstract Interpretation. But it doesn’t give us a particular approximation; it only tells us what an approximation should (or could) look like.
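To illustrate with the smallest possible example (a toy of my own, not anything from your posts): a sign analysis in Python. The framework dictates that any sound abstract operation must cover all concrete outcomes, which is exactly why `POS + NEG` is forced up to `TOP`; but it does not tell us whether the sign domain was the right abstraction to pick in the first place.

```python
# Sign domain: an abstraction of integers that forgets everything but sign.
NEG, ZERO, POS, TOP = 'neg', 'zero', 'pos', 'top'

def alpha(n):
    """Abstraction function: concrete integer -> abstract sign."""
    return NEG if n < 0 else POS if n > 0 else ZERO

def abs_mul(a, b):
    """Abstract multiplication: happens to be exact on this domain."""
    if ZERO in (a, b):
        return ZERO
    if TOP in (a, b):
        return TOP
    return POS if a == b else NEG

def abs_add(a, b):
    """Abstract addition: necessarily approximate, since a positive plus
    a negative can be negative, zero, or positive -- hence TOP."""
    if a == b:
        return a
    if a == ZERO:
        return b
    if b == ZERO:
        return a
    return TOP
```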

Anyway, I’d like to hear your thoughts on the matter.
