New Gadget Madly In Hope
On: 17 March 2021
I'm starting to do a few experiments with Apple's haptic support. Yeah, I know, software when I really want everything to not need batteries. But what I'm thinking of is right up Haptician's alley. I'm going to make an Audio Unit that can interpret audio and MIDI into signals on the haptic motor (only possible in certain iPhone models).
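For reference, this is roughly what that device restriction looks like in Core Haptics. A minimal sketch, not the actual plugin code:

```swift
import CoreHaptics

func makeHapticEngine() throws -> CHHapticEngine? {
    // Core Haptics needs the newer Taptic Engine, hence
    // "only possible in certain iPhone models".
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else {
        return nil   // fall back to audio-only rendering
    }
    let engine = try CHHapticEngine()
    try engine.start()
    return engine
}
```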
Basically:
- Signals will be turned into events. Signals are MIDI messages, band-pass filtered audio envelope followers, pitch trackers, gate detectors, noise detectors: a number of raw sources that become event messages.
- A Mapper maps events into synthesizer commands, using state machines. In this case, the synth is the haptic motor (though possibly audio as well). The synth commands get put in the synth's scheduling queue (see the sketch after this list).
- Synthesis: the synth operates on the commands scheduled in the queue.
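Here's a rough sketch of that pipeline in Swift. Every type and case name below is invented for illustration; the real sources, state machines, and command set will look different:

```swift
import Foundation

/// Events produced by the raw signal sources (MIDI, envelope followers, etc.).
enum SourceEvent {
    case midiNoteOn(note: UInt8, velocity: UInt8)
    case midiNoteOff(note: UInt8)
    case envelope(band: Int, level: Float)   // band-pass filtered envelope follower
    case pitch(frequency: Float)
    case gate(isOpen: Bool)
    case noise(level: Float)
}

/// Commands the haptic (or audio) synth understands, stamped with a time.
struct SynthCommand {
    enum Kind {
        case startDrone(intensity: Float, sharpness: Float)
        case modulateDrone(intensity: Float)
        case playSolo(intensity: Float, sharpness: Float)
        case stopDrone
    }
    let kind: Kind
    let time: TimeInterval
}

/// Maps events to commands using a small state machine.
final class Mapper {
    private enum State { case idle, droning }
    private var state: State = .idle

    func map(_ event: SourceEvent, at time: TimeInterval) -> [SynthCommand] {
        switch (state, event) {
        case (.idle, .gate(true)):
            state = .droning
            return [SynthCommand(kind: .startDrone(intensity: 0.5, sharpness: 0.2), time: time)]
        case (.droning, .envelope(_, let level)):
            return [SynthCommand(kind: .modulateDrone(intensity: level), time: time)]
        case (.droning, .midiNoteOn(_, let velocity)):
            return [SynthCommand(kind: .playSolo(intensity: Float(velocity) / 127, sharpness: 0.7), time: time)]
        case (.droning, .gate(false)):
            state = .idle
            return [SynthCommand(kind: .stopDrone, time: time)]
        default:
            return []
        }
    }
}

/// The synth's scheduling queue: time-ordered, drained on the render thread.
final class CommandQueue {
    private var queue: [SynthCommand] = []

    func schedule(_ commands: [SynthCommand]) {
        queue.append(contentsOf: commands)
        queue.sort { $0.time < $1.time }
    }

    func due(before time: TimeInterval) -> [SynthCommand] {
        let ready = queue.prefix { $0.time <= time }
        queue.removeFirst(ready.count)
        return Array(ready)
    }
}
```

The point of the split is that sources, mappings, and the synth can be swapped independently; the queue is the only thing the synth ever sees.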
This is more or less how AUMI Sings works.
What will develop on the synth side is a number of tactile patterns that act like notes. There's a small amount of polyphony: a continuo "drone" that can be modulated a little, with one or two "solos" on top. Because it's an AU, it can be integrated into the iOS music ecosystem of MIDI and audio effects, preset saving and sharing, and dynamic parameter changing.
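As a first approximation of those "notes", here's what a drone with a couple of solo taps on top might look like as a Core Haptics pattern. The event times and the intensity/sharpness values are placeholders, not the actual note shapes:

```swift
import CoreHaptics

func droneWithSolos() throws -> CHHapticPattern {
    // A long continuous event as the "drone".
    let drone = CHHapticEvent(
        eventType: .hapticContinuous,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.4),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.2)
        ],
        relativeTime: 0,
        duration: 4.0)

    // Two short transient events as "solos" on top of the drone.
    let solos = [1.0, 2.5].map { time in
        CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.9),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8)
            ],
            relativeTime: time)
    }

    return try CHHapticPattern(events: [drone] + solos, parameters: [])
}
```

The real synth would presumably render these through a pattern player and nudge the drone with dynamic parameters rather than baking everything into a fixed pattern, but that's the general shape of it.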