A flip-flop in a digital computer is indeed describable by a differential equation, as surely as any other analog system is (all implementational hardware is of course analog), but the computation it is performing is not. To know what that is, you need to look at the level of what the flip-flop patterns are encoding. That's implementation independence.
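To fix intuitions, here is a minimal sketch (in Python; the toy latch equation and every name in it are illustrative assumptions, not a model of any particular chip). The analog level is a differential equation that says everything there is to say about the hardware's voltages; the computation, storing a bit, appears only once a reading convention fixes which voltages encode 0 and which encode 1, whereupon two physically different pieces of hardware implement one and the same computation:

\begin{verbatim}
from math import tanh

def settle(v0, gain=2.0, tau=1.0, dt=0.01, steps=2000):
    # Analog level: Euler-integrate dv/dt = (-v + tanh(gain*v)) / tau.
    # For gain > 1 this toy latch is bistable: the voltage settles
    # into one of two stable states. The differential equation is a
    # complete description of what the hardware is doing.
    v = v0
    for _ in range(steps):
        v += dt * (-v + tanh(gain * v)) / tau
    return v

def read_bit(v, threshold=0.0):
    # Computational level: the bit is fixed by the encoding, not by
    # the dynamics -- any settled voltage above threshold counts as 1.
    return 1 if v > threshold else 0

# Two physically different implementations (different gains and time
# constants), one and the same computation: both store the bit 1.
assert read_bit(settle(0.1, gain=2.0, tau=0.5)) == 1
assert read_bit(settle(0.1, gain=5.0, tau=3.0)) == 1
\end{verbatim}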
McDermott suggests that I am holding ``computers'' and ``computation'' to distinctions that are either irrelevant or untenable. If this is meant to endorse ecumenism about computation, I would refer him to my response to Dietrich: If computation is allowed to become sufficiently broad, ``X is/is-not computation'' becomes vacuous (including ``cognition is computation'').

McDermott doesn't like my own candidate (interpretable symbols/manipulations) because sometimes you can't specify the symbols. Fine, let it be interpretable code then (is anyone interested in uninterpretable code?). Code that ``refers'' only to its own physical implementation seems circular. Causal connections between the code and the computer-external things it is interpretable as referring to, on the other hand, are unexceptionable (that's what my own TTT calls for), but surely that's too strong for all the virtual things a computer can do and be. (When you reconfigure a digital computer to simulate another system -- say, when you go from a virtual robot to a virtual planetary system -- are you reconfiguring the (relevant) ``causal connections'' too? But surely those are wider than just the computer itself; virtual causal connections to a virtual world are not causal connections at all -- see the Cheshire cat response to Dyer).
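To see why I doubt it, consider a toy sketch (Python again; the function names and the ``physics'' are hypothetical, made up purely for illustration). One and the same interpreter loop, with one and the same set of real causal connections (its electronics, its I/O), is ``reconfigured'' from one virtual world to the other by nothing but a different symbol string; the ``forces'' inside the virtual planetary system are arithmetic that is interpretable as causation, not causation:

\begin{verbatim}
def step_virtual_robot(state):
    # Symbolically 'moves' a robot arm one step toward a target.
    arm, target = state
    return (arm + (1 if target > arm else -1), target)

def step_virtual_planets(state):
    # Symbolically 'attracts' two point masses (toy 1-D gravity);
    # the 'force' is just arithmetic on squiggles.
    (x1, v1), (x2, v2) = state
    g = 0.01 * (1 if x2 > x1 else -1)
    return ((x1 + v1, v1 + g), (x2 + v2, v2 - g))

def run(step, state, ticks):
    # The only literal causal process here: a processor executing
    # whichever step function it happens to have been handed.
    for _ in range(ticks):
        state = step(state)
    return state

print(run(step_virtual_robot, (0, 5), 10))
print(run(step_virtual_planets, ((0.0, 0.2), (10.0, -0.2)), 10))
\end{verbatim}

Swapping step functions reconfigures nothing outside the machine; whatever wider causal connections a real robot or a real planetary system has are precisely what the swap leaves untouched.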
One can agree (as I do) that nothing essential is missing in a simulated rainstorm, but the question is: Nothing essential to what? I would say: to predicting and explaining a rainstorm, but certainly not to watering a parched field. So let's get to the point. We're not interested in rainstorms but in brainstorms: Is anything essential missing in a simulated mind? Perhaps nothing essential to predicting and explaining a mind, but certainly something, in fact everything, essential to actually being or having a mind. Let's not just shrug this off as (something interpretable as) ``self-modeling capacity.'' Perhaps the meanings of McDermott's thoughts are merely relative to an external observer, but I can assure you that mine aren't!