recip (D x x') = D (recip x) (- x' / sqr x)

(I mean the minus sign is required.)
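For concreteness, here is a minimal runnable sketch of that rule. The one-level `D` type, `sqr`, and the name `recipD` are assumptions for illustration; the paper's `D` carries a full derivative tower.

```haskell
-- Minimal sketch (names assumed): a one-level dual number, value paired
-- with derivative.
data D = D Double Double deriving (Eq, Show)

sqr :: Double -> Double
sqr x = x * x

-- The rule above, with the required minus sign.
recipD :: D -> D
recipD (D x x') = D (recip x) (- x' / sqr x)

main :: IO ()
main = print (recipD (D 2 1))  -- d/dx (1/x) at x = 2 is -1/4
</imports>
```

Sampling at x = 2 with derivative 1 gives value 1/2 and derivative -1/4, matching the usual rule d/dx (1/x) = -1/x².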

This confusion is traditional in mathematics, and the name “calculus of several variables”, though nonsensical, really is the name of the subject.

Hi Omar,

I relate to naming (more generally, language) as evolving, not static (as in “is” & “the”). I do believe muddled thinking produces muddled language, which then perpetuates muddled thinking. So I like to contribute my own bit of energy to evolving toward language (and indirectly, thinking) I’d like to see catch on. Given your encouragement to use a non-traditionally accurate replacement, I suspect you agree.

Thanks for mentioning Spivak’s rant. I vaguely remember reading & enjoying it. BTW, his book made a lasting impression on me when I was 19, and only last year was I finally able to use what I learned there.

So you could call it that, using it not as a description of the subject but literally just as a name.

But your suggestion of “calculus on vector spaces” is even better; I hope you go with that.

By the way, in Calculus on Manifolds Spivak has a nice rant about the classical notation for derivatives, a notation which incorporates the variable confusion you mentioned.

I’m unsure what label (if any) to use, as the traditional names you mentioned are problematic for me. I hear “calculus of several variables” (or “multivariate calculus”) as a (traditional) syntax/semantics confusion, since functions don’t have variables (though syntax for them might). And “calculus on R^n” is a more specific setting than I have in mind. Perhaps “calculus on vector spaces”.

I just wanted to point out a mathematical naming issue:

In the abstract you mention “calculus on manifolds”, but your paper doesn’t use manifolds at all. Finite-dimensional real vector spaces are very simple special cases of manifolds, so it is misleading to mention manifolds at all. Instead of “calculus on manifolds” you can use the traditional names for calculus on vector spaces: “calculus of several variables” or “calculus on R^n”.

You should correct this lest more mathematicians get excited by the reference to manifolds in the abstract and later disappointed when they read the paper, as happened to me.

If you define an integral function for AD values, it’s likely recursive. For example, using the Euler method with a fixed global dt:

dt = 0.01

next (D v d@(D u _)) = D (v + u * dt) (next d)

integral x t = xs !! truncate (t / dt)
  where xs = x : map next xs

Then if you use it with any AD value (all higher-order AD values would be recursive, including constants), there is nested recursion. For example:

e = D 1.0 e

Enter “integral e 1” and “exp 1”. The results are approximately the same, and that’s good.

Enter “integral e 100”. Boom! It blows up immediately.

Note that by using the list “xs”, it’s already memoizing past values. So I don’t think there exists an easy memoization solution to this problem.
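For reference, here is a self-contained version of the sketch above. The tower-style `D` type and the `val` projection are assumptions added to make it runnable; they mirror the names used in the comment.

```haskell
-- A higher-order AD value (names assumed): a value plus its lazy,
-- possibly infinite derivative tower.
data D = D Double D

val :: D -> Double
val (D v _) = v

dt :: Double
dt = 0.01

-- One Euler step: bump the value by (first derivative * dt), and step
-- the rest of the derivative tower the same way.
next :: D -> D
next (D v d@(D u _)) = D (v + u * dt) (next d)

integral :: D -> Double -> D
integral x t = xs !! truncate (t / dt)
  where xs = x : map next xs

-- e's entire derivative tower is e itself, so integral e t approximates exp t.
e :: D
e = D 1.0 e

main :: IO ()
main = print (val (integral e 1))  -- roughly exp 1
```

With dt = 0.01, `integral e 1` takes 100 Euler steps and yields (1.01)^100 ≈ 2.7048, close to exp 1 ≈ 2.7183, as reported above.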

BTW, the most basic operation on a higher-order derivative is to integrate it over a given interval. But it’s difficult to do this efficiently with the stream representation for AD in Haskell. So I would question the efficiency claim you made in the introduction.

The comment about efficiency is about sampling the derivatives, not integrating them. For instance, to compute surface normals during rendering.
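To illustrate that use (everything below is an assumed sketch, not code from the paper): a surface normal is the normalized cross product of the two sampled partial derivatives of a parametric surface, which is exactly the kind of derivative sampling at issue.

```haskell
type V3 = (Double, Double, Double)

cross :: V3 -> V3 -> V3
cross (ax, ay, az) (bx, by, bz) =
  (ay*bz - az*by, az*bx - ax*bz, ax*by - ay*bx)

normalize :: V3 -> V3
normalize (x, y, z) = (x/n, y/n, z/n)
  where n = sqrt (x*x + y*y + z*z)

-- Given the two sampled partial derivatives of a surface f(u,v)
-- (e.g. produced by AD), the unit normal at that sample point.
normalAt :: V3 -> V3 -> V3
normalAt du dv = normalize (cross du dv)

main :: IO ()
main = print (normalAt (1, 0, 0) (0, 1, 0))  -- the plane z = 0 has normal (0,0,1)
```

No integration is involved: rendering just evaluates the derivatives at many sample points, which is what the stream representation makes cheap.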

It’s very easy to run into space leaks when there is nested recursion. I won’t get into the details, but it’s yet another problem caused by the loss of sharing in lazy evaluation. Explicit knot tying is a possible solution, but it’s ugly and utterly destroys the beauty of AD.

Are you referring to the challenges with recursive *integrals*, or with derivatives?

I’m not sure if the exact ideas have been implemented in Haskell before, but for me, I first learnt about them from Henrik Nilsson’s paper “Functional Automatic Differentiation with Dirac Impulses”.

Thanks for the reminder about Henrik’s paper, which I like very much. Henrik’s paper extends Jerzy Karczmarczuk’s higher-order AD to impulses (distributions), while mine extends it to vector spaces (foundation for general manifolds). I’ve added a citation to Henrik’s work. Thanks!
