Inconsistency between timer and timer object
It seems that the global timer can take a value such as 0.1, but when using a timer object I'd have to pass it a value in milliseconds to get the same interval. Is that intentional?
Yes, this is one of the most annoying inconsistencies, but I can't change it without breaking every existing use of the timers.
The reason is that I created the global timer first and didn't think about using milliseconds as the unit, whereas the UI timer object is just a wrapper around a JUCE class that uses milliseconds.
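To illustrate the difference, here is a minimal sketch of the two units side by side. It assumes HISE's scripting API (`Synth.startTimer`, `Engine.createTimerObject`, `setTimerCallback`); check the exact names against your HISE version.

```javascript
// Global timer: interval in SECONDS
Synth.startTimer(0.1); // fires roughly every 100 ms

// UI timer object: interval in MILLISECONDS (the JUCE convention)
const var t = Engine.createTimerObject();

t.setTimerCallback(function()
{
    Console.print("tick");
});

t.startTimer(100); // also fires roughly every 100 ms
```

So the same 100 ms interval is written as `0.1` for the global timer but `100` for the timer object.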