CreepyPants
2016-07-19 19:20:57
Still a n00b at coding MTPro, so please bear with me (I do have a programming background, but that's from decades ago when we still used mainframes).
Situation:
Crossfader from Controller sends MIDI CC data, which I want to convert to outgoing CC data on a different channel and CC#. That's the easy bit.
Problem:
When using the crossfader quickly, MIDI seems to choke - I believe from too much data. I can use the crossfader at moderate speeds, and at slow speeds quite effectively, although I'm still in testing mode and have yet to develop the layers of modulation I'm hoping to achieve.
Recognizing:
When the crossfader is used quickly (lots of CC data in short amount of time), fine data changes are not necessary.
Assumed algorithm:
A time-based test against Timer values to only process a value every so many ticks. (Suggestion on resolution, perhaps?)
Data:
Note: these figures may be off due to a miscalculation, but I'll assume they're correct.
When the crossfader is used quickly, it generates a CC value every 1.42ms. Slow use of the crossfader is around 60-90ms between values; moderate use in practice is about 10-14ms.
I'm thinking with 8 channels of MIDI data, some general CC data flowing *through* MTPro, and processing (converting Ch9 CC 48 -> Ch 1 CC1 & Ch 2 CC 4 for example) through MTPro I could get a 10ms resolution with no audible dramas.
Question: So, I figure I'd set a timer to sample the CC data and process it every 10ms or so. Is there a thread that highlights this technique that I'm missing in my searches, or (sorry for the n00b question) a section of the manual someone might direct me to?
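For what it's worth, the timer-sampling idea above can be sketched in plain Python (illustrative only - MT Pro uses its own rules language, and the names `cc_in`, `tick`, and `send` here are hypothetical): the incoming-CC rule just latches the newest value into a variable, and a repeating ~10ms timer forwards it only when it has changed. Fast sweeps get thinned to one message per tick, while the crossfader's final resting value always gets through on the next tick.

```python
class CCSampler:
    """Thin a dense CC stream by latching values and sampling on a timer."""

    def __init__(self, send):
        self.send = send        # callback that emits the outgoing CC value
        self.latest = None      # most recent incoming value (the "latch")
        self.last_sent = None   # last value actually forwarded

    def cc_in(self, value):
        # Incoming-CC rule: just store the newest value; cheap, never blocks.
        self.latest = value

    def tick(self):
        # Timer rule, fired every ~10 ms: forward only if the value changed.
        if self.latest is not None and self.latest != self.last_sent:
            self.last_sent = self.latest
            self.send(self.latest)


# Example: a fast sweep arriving between ticks collapses to one message.
sent = []
sampler = CCSampler(sent.append)
for v in (10, 11, 12):      # three CCs arrive within one 10 ms window
    sampler.cc_in(v)
sampler.tick()              # forwards only the latest value (12)
sampler.tick()              # nothing changed, nothing sent
sampler.cc_in(64)
sampler.tick()              # forwards 64
print(sent)                 # [12, 64]
```

The nice property of latch-and-sample over simply dropping messages that arrive too soon is that the last value in a sweep is never lost, so the fader always lands exactly where you left it.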