I don't agree with the author's conflation of concurrency with non-determinism. To me, concurrency means there may be more than one operation logically "in flight" at any given time; parallelism means there may be more than one operation physically happening at any given time. Determinism vs non-determinism is an orthogonal issue.
You could have a single-threaded cycle-counting CPU simulator which is entirely deterministic, yet if it is simulating a multi-threaded program, that program would exhibit concurrency (for example, if it were a web server, it could be serving multiple requests over separate TCP connections concurrently, working a little on each request round-robin).
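The round-robin idea can be made concrete with a minimal sketch (my own illustration, not the simulator described above): a single-threaded scheduler that works a little on each "request" in turn, with a fully deterministic interleaving.

```python
# A single-threaded, fully deterministic round-robin scheduler interleaving
# two "request handlers" implemented as generators. Names are made up.
from collections import deque

def handler(name, steps):
    for i in range(steps):
        yield f"{name}: step {i}"  # do a little work, then hand control back

def run_round_robin(tasks):
    queue = deque(tasks)
    log = []
    while queue:
        task = queue.popleft()
        try:
            log.append(next(task))
            queue.append(task)  # re-enqueue: work a little on each, in turn
        except StopIteration:
            pass  # this task is finished
    return log

print(run_round_robin([handler("req-A", 2), handler("req-B", 2)]))
```

Two requests are logically in flight at once (concurrency), yet every run produces the identical interleaving (determinism).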
To appreciate his point, you must be aware that Harper is thinking about a formalization of concurrency. In such a formalization, you could have a deterministic execution of the concurrent processes, as you hint, but the formalization will also include a trace of incoming events that is not under the control of the CPU.
For the system as a whole to be deterministic, you would have it be deterministic for arbitrary event traces. This is rarely the case in practice though. Harper does not tend to just sling out a postulate unless he has good reason to think it is so, backed up by a formal system in which he identified the association.
It could be that he has identified that concurrency goes hand in hand with non-determinism. To me, it sounds rather plausible.
It sounds poorly defined to me. Concurrency for me means a specific thing, and that specific thing does not have a necessary implication of non-determinism. My position is similar to dmbarbour's (https://existentialtype.wordpress.com/2011/03/17/parallelism...).
Your distinction appeals to physicality, to a "real world". The author comes from a purer mathematical perspective, where it does not matter whether the code runs on silicon, is emulated by another machine, or is evaluated by humans on a blackboard.
Parallelism is purely an efficiency issue; the result of a computation does not change if run sequentially or in parallel. In other words, it only affects the cost model, not the semantics.
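A small sketch of this point (my example, not the commenter's): the same sum computed serially and via a thread pool yields the same value; only the cost changes, not the meaning.

```python
# "Parallelism only affects the cost model, not the semantics": summing a list
# serially vs. in chunks on a thread pool produces the identical result.
from concurrent.futures import ThreadPoolExecutor

data = list(range(1000))

serial = sum(data)

with ThreadPoolExecutor(max_workers=4) as pool:
    chunks = [data[i:i + 250] for i in range(0, 1000, 250)]
    parallel = sum(pool.map(sum, chunks))  # chunk sums may run in parallel

assert serial == parallel  # same answer, however the work was scheduled
```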
Concurrency means adopting some non-determinism in the semantics, either out of necessity (e.g. in concurrent servers) or as a program structuring mechanism. In your example, if we are concerned with the semantics of the simulator, then there is no parallelism. OTOH, if we are concerned with the semantics of the simulated code, then it's likely that we would use concurrency in its definition (even if we "really know" that there is no underlying physical nondeterminism).
Parallelism is just an optimization; and concurrency does not imply non-determinism. Concurrency means you've got more than one thing in flight at a time. Having more than one task in flight, whether you switch using coroutines, or asynchronous completions, or an OS scheduler running on a timer interrupt, or cycle counting in a virtual machine, or via higher level language constructs like futures and promises, or reactive programming, is concurrency; but not all of these things are non-deterministic.
Peter Van Roy's book defines it as "an execution is called nondeterministic if there is an execution state in which there is a choice of what to do next". I believe the only thing you mention that does not conform to this definition is co-routining, but I wouldn't categorise it as concurrency either (nor does the book).
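That definition can be illustrated with a toy sketch (my own, with made-up names): two "threads" each do a read-modify-write on a shared counter, and we enumerate every valid interleaving. Because some execution states offer a choice of what to do next, more than one final value is reachable.

```python
# Van Roy-style nondeterminism: enumerate all interleavings of two threads,
# each doing read-then-write on a shared counter, and collect final values.
from itertools import permutations

def run(schedule):
    counter = 0
    local = {}
    for op, tid in schedule:
        if op == "read":
            local[tid] = counter          # read the shared counter
        else:
            counter = local[tid] + 1      # write back read value + 1
    return counter

steps = [("read", 0), ("write", 0), ("read", 1), ("write", 1)]

outcomes = set()
for order in permutations(steps):
    # a valid interleaving keeps each thread's read before its own write
    if order.index(("read", 0)) < order.index(("write", 0)) and \
       order.index(("read", 1)) < order.index(("write", 1)):
        outcomes.add(run(order))

print(sorted(outcomes))  # more than one outcome: nondeterministic by this definition
```

If both threads read before either writes, an increment is lost; the set of reachable outcomes has more than one element, which is exactly the "choice of what to do next" showing through.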
There's a nice answer on FRP at http://stackoverflow.com/questions/1028250/what-is-functiona... - if choices in execution are not revealed to the program, you can still have concurrency without non-determinism. But again, one needs to call into question which tasks are making progress concurrently, because depending on the level of abstraction you're looking at, you can say that there is concurrency or there isn't. In other words, you might have parallelism at a low level of abstraction, as a physical optimization of concurrency, and at that low level it might even be non-deterministic; but that non-determinism may not be exposed at a higher level.
Take a spreadsheet implementation as an example. There may be multiple events coming in, and numbers in the sheet updating as the expressions are recalculated. It might even be a multi-user spreadsheet; so it seems there is concurrency (for some level of abstraction). But is there non-determinism?
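The spreadsheet case can be sketched as follows (a toy model of my own; cell names and formulas are made up): dependent cells may be recomputed in any dependency-respecting order, possibly concurrently, yet the final sheet is always the same.

```python
# A tiny spreadsheet: recalculation is confluent, so any valid schedule
# (any topological order of the dependency graph) yields identical values.
cells = {"A1": 2, "A2": 3}

formulas = {
    "B1": lambda c: c["A1"] + c["A2"],   # depends on A1, A2
    "C1": lambda c: c["B1"] * 10,        # depends on B1
    "D1": lambda c: c["A1"] - 1,         # independent of B1/C1
}

def recalc(order):
    c = dict(cells)
    for name in order:
        c[name] = formulas[name](c)
    return c

# Three different dependency-respecting schedules, one final sheet:
r1 = recalc(["B1", "C1", "D1"])
r2 = recalc(["D1", "B1", "C1"])
r3 = recalc(["B1", "D1", "C1"])
assert r1 == r2 == r3
print(r1["C1"])  # 50 under every schedule
```

On this toy model the answer would be: concurrent recomputation, but no observable non-determinism, because the scheduling choices are never revealed in the result.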
Sure, you can have deterministic concurrency. But that's not what he's talking about. Concurrent operations being non-deterministic is the accepted norm. If you're talking about deterministic concurrency, then you can qualify it as such; anything else is pedantry.
If someone is trying to be precise in what they talk about, pedantry is correct. The trouble I had with the author is that he is redefining what I understand by the term concurrency.