Thanks for the feedback!
That’s a really good question. ReLU indeed only outputs non-negative values (zero or greater) but often works really well in practice. I think part of the reason for this is that ReLU is most often used in combination with batch normalization, which normalizes the outputs of the ReLU after the activation is…
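To make that ordering concrete, here’s a rough sketch of what such a block could look like, assuming PyTorch (the original doesn’t name a framework, and the layer sizes are arbitrary placeholders): a linear layer, then ReLU, then batch normalization applied to the ReLU’s outputs.

```python
import torch
import torch.nn as nn

# Hypothetical block: linear layer -> ReLU -> batch norm over the ReLU outputs.
# The sizes (128 -> 64) and batch size are made-up placeholders.
block = nn.Sequential(
    nn.Linear(128, 64),
    nn.ReLU(),           # outputs are >= 0 after this point
    nn.BatchNorm1d(64),  # re-centres and re-scales those outputs per feature
)

x = torch.randn(32, 128)  # a dummy batch of 32 inputs
out = block(x)
# In training mode, batch norm uses batch statistics, so the outputs
# end up with roughly zero mean and unit variance per feature.
print(out.mean().item(), out.std().item())
```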
Depending on what it is that you’re trying to achieve, ReactiveX might also be a perfectly good solution. Just because it’s not pure Functional Reactive Programming in theory doesn’t mean it cannot get your job done.