I think you may find some interesting reading, in a similar vein and with some curious twists, in the book “Probability Theory: The Logic of Science” by E. T. Jaynes, specifically chapter 2, which derives the quantitative rules of probability theory from the ‘basic desiderata’ of ‘plausible reasoning’. What I find interesting is, firstly, that it is another fine example of the approach you have applied here (and, I surmise, quite widely elsewhere); the approach is the same and the particular steps are similar, although since the problem is more general, the equations involved are more general functional equations, one being ‘The Associativity Equation’. It is also interesting that the author took this approach, with some inspiration from G. Polya, around 65 years ago, and that while he was active in applying computation to probability and statistics, he was certainly not of a theoretical computer science orientation.

It seems more natural to express Vect as a symmetric bimonoidal category (http://ncatlab.org/nlab/show/bimonoidal+category) using the tensor product and direct sum. `(***)` is basically the tensor product, but I think the direct sum of transformations would be more like `(+++)`.
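To make the two monoidal structures concrete, here is a minimal sketch in Haskell, assuming a plain dense list-of-rows matrix representation (`Matrix`, `kron`, and `directSum` are hypothetical names for illustration, not from any particular library): the tensor product of linear maps acts like the Kronecker product, while the direct sum forms a block-diagonal matrix.

```haskell
-- Assumed representation: a linear map as a dense list-of-rows matrix.
type Matrix = [[Double]]

-- Tensor product (Kronecker product), cf. (***):
-- acts on the tensor product of the domains.
kron :: Matrix -> Matrix -> Matrix
kron a b = [ [ x * y | x <- ra, y <- rb ] | ra <- a, rb <- b ]

-- Direct sum, cf. (+++): a block-diagonal matrix
-- acting on the direct sum of the domains.
directSum :: Matrix -> Matrix -> Matrix
directSum a b =
  [ ra ++ replicate cb 0 | ra <- a ] ++
  [ replicate ca 0 ++ rb | rb <- b ]
 where
  ca = length (head a)
  cb = length (head b)
```

The dimensions make the difference visible: for an m×n map and a p×q map, the tensor product is mp×nq while the direct sum is (m+p)×(n+q).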

In what way is this a “reimagining”? That matrices represent linear transformations is absolutely fundamental.

Thanks for asking.
What I mean by “reimagining” is (a) packaging of linear maps via the `Category` & `Arrow` vocabulary (more explicit in the library), (b) structuring the representation and semantics to match the algebraic structure of `dot` and `(&&&)`, and (c) derivation of operations from semantics.
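A rough sketch of what (a) and (b) might look like, assuming a bare function-backed representation (the actual library uses a more structured representation; `Lin` and `fork` are hypothetical names for illustration):

```haskell
import Prelude hiding (id, (.))
import Control.Category

-- Wrap a linear map as a function.  Linearity is by convention here,
-- not enforced by the type; a real library would enforce more structure.
newtype Lin a b = Lin { apply :: a -> b }

-- Composition of linear maps is linear, so Lin forms a Category;
-- (.) plays the role of composition of transformations.
instance Category Lin where
  id = Lin (\x -> x)
  Lin g . Lin f = Lin (\x -> g (f x))

-- An (&&&)-style combinator: pair up two maps sharing a domain.
fork :: Lin a b -> Lin a c -> Lin a (b, c)
fork (Lin f) (Lin g) = Lin (\x -> (f x, g x))
```

Note that a full `Arrow` instance would require `arr`, which admits arbitrary (not necessarily linear) functions, which is one reason such a library might borrow the `Arrow` vocabulary while exposing only the combinators that preserve linearity.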