The key claim from this idea is that we should measure information not in the traditional Shannon sense, but as the fraction of all possible configurations that achieve at least a given level of function. As an example, suppose that out of all $4^{20}$ possible DNA sequences of 20 nucleotides, 10 were able to generate 1 ATP/second. This would give you a functional information of $-\log_2 \frac{10}{4^{20}}$, about 36.7 bits. The important thing to note is that this is always relative to some arbitrarily chosen function and threshold.
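The computation itself is trivial once you have the counts; a minimal sketch (the function name is mine, just for illustration):

```python
import math

def functional_information(n_functional: int, n_total: int) -> float:
    """Functional information in bits: -log2 of the fraction of
    configurations that meet the chosen functional threshold."""
    if n_functional <= 0:
        return math.inf  # no configuration achieves the function
    return -math.log2(n_functional / n_total)

# Worked example from the text: 10 out of 4**20 possible
# 20-nucleotide DNA sequences generate >= 1 ATP/second.
print(functional_information(10, 4**20))  # ~36.68 bits
```

Of course, the hard part is everything hidden in the arguments: actually counting how many configurations clear the threshold.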
My rough vibes here are that this is as immeasurable in practice as something like Kolmogorov complexity. The claim that standard information theory is missing something, since the same DNA in a zygote, an adult, and a dead person leads to very different outcomes, is true, but this certainly isn't a final synthesis of how it is true. Perhaps there are better ways to integrate Shannon information theory into the entire system?