You cannot measure "actual" harm, because the system is dynamical.
You wanted a definition of externality, and I gave it to you. There's a function that gives you an upper bound on the cost of an externality, which, for lack of a better term, we can call the clean-up function. It is the cost of returning the system to its original state.
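To make the idea concrete, here is a minimal sketch of the clean-up function for a toy one-dimensional system. Everything here is an illustrative assumption: the state is a single number, restoration happens in bounded steps, and each unit of movement back toward the original state has a fixed cost. The point is only that the bound is defined by restoration, not by guessing at "actual" harm.

```python
def cleanup_cost(original, perturbed, unit_cost=1.0, step=1.0):
    """Upper bound on an externality: the cost of walking a perturbed
    state back to its original value, one bounded step at a time.
    (Toy model: state is a scalar, cost is linear in distance moved.)"""
    cost = 0.0
    state = perturbed
    while abs(state - original) > 1e-9:
        # Move at most one step back toward the original state.
        move = min(step, abs(state - original))
        state += move if original > state else -move
        cost += unit_cost * move
    return cost

print(cleanup_cost(0.0, 3.0))  # → 3.0
```

In a real system the restoration path (and therefore the bound) would depend on the dynamics, but the definition stays the same: the externality costs at most what it costs to undo.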
You don't need a magic list. You need a way of demonstrating that certain actions are within the flight envelope of the environment, i.e., that there is some corrective force within the environment that maintains the invariant for you, together with a policy of denying all actions that do not come with such a demonstration. The list can expand as scientific understanding expands, and we are allowed some leeway by the intrinsic stability of the system. If evolution could not absorb some unexpected perturbations, we wouldn't exist in the first place, but we should treat that capacity as a finite resource whose size we do not know.
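The deny-by-default policy can be sketched as a stability test. This is a hypothetical illustration, not a real decision procedure: the `dynamics` function, the invariant, and the thresholds are all stand-ins. An action is permitted only if, under our model of the environment, the perturbation it causes decays back toward the invariant; absent that demonstration, the action is denied.

```python
def within_envelope(dynamics, invariant, perturbation, steps=100, tol=1e-3):
    """True if the environment's corrective force restores the invariant
    after the perturbation within `steps` iterations; False means deny."""
    state = invariant + perturbation
    for _ in range(steps):
        state = dynamics(state)
        if abs(state - invariant) < tol:
            return True
    return False

# A damped system: a corrective force pulls the state back to the invariant.
damped = lambda x: 0.5 * x
print(within_envelope(damped, 0.0, 8.0))    # → True  (permitted)

# An unstable system: the perturbation grows, so no demonstration exists.
unstable = lambda x: 2.0 * x
print(within_envelope(unstable, 0.0, 8.0))  # → False (denied)
```

Note the asymmetry: failing the test does not prove harm, it only withholds the demonstration the policy requires, which is the whole point of denying by default.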
Evidence that we don't do this at all is just you being in denial about the terribleness of the human strategy.
Imagine, for metaphor, that you are in some (solar powered) alien plane, and the pilot dies. You go into the cockpit, knowing only that you have to keep the plane flying. You make a mental note of the readings of all the instruments on the panel, and of the positions of all the controls. You decide that you want to fly faster, so you fiddle around with the controls until you can tell that the landscape is going underneath you faster than it was before. You note that some instrument on the panel is increasing. Do you A) ignore it, or B) return the controls to their original state?
If you can't measure it, how do you know it exists? That is at least half of my point, the other half being that you can't assume a specific course of action is necessary and proper without demonstrating that the harm actually occurred. That you think harm might occur is not enough; you need to show it has occurred (and you can probably use proof of past harm to show that inevitable harm will occur).
Your example is disingenuous because a cockpit instrument by definition monitors something you want to pay attention to. We can measure and record many things, most of which are probably irrelevant to many problems and their solutions.