Script Only!

One strength of Maestro is that it can play music without any scripting. But sometimes you want to use the MPTK prefabs purely by script. This is also possible!

Of course, you will not be able to set prefab properties from the inspector, because the prefab will be visible in the hierarchy only while the application is running.

So, all properties must be set by script. The examples below set the minimum required properties. Feel free to use the full API available for each MPTK prefab and class!

Want a demonstration? Add these scripts to a GameObject in your hierarchy … and run.

The simplest MIDI loader:

// ------------------------------------------------------------------------
// Load a MIDI file and process each MIDI event
// This script is provided in this folder:
// Assets\MidiPlayer\Demo\FreeDemos\Script\TheSimplestMidiLoader.cs
// ------------------------------------------------------------------------
using System.Collections.Generic;
using UnityEngine;
using MidiPlayerTK;

namespace DemoMPTK
{
    /// <summary>
    /// This demo loads all events from a MIDI file purely by script.
    /// There is nothing to create in the Unity editor: just add this script to a GameObject in your scene and run!
    /// </summary>
    public class TheSimplestMidiLoader : MonoBehaviour
    {
        MidiFileLoader midiFileLoader;

        private void Awake()
        {
            Debug.Log("Awake: dynamically add MidiFileLoader component");

            // MidiPlayerGlobal is a singleton: only one instance can be created.
            if (MidiPlayerGlobal.Instance == null)
                gameObject.AddComponent<MidiPlayerGlobal>();

            // When running, this component will be added to this gameObject.
            midiFileLoader = gameObject.AddComponent<MidiFileLoader>();
        }

        public void Start()
        {
            Debug.Log("Start: select a MIDI file and load MIDI events. No sequencer or synth is instantiated.");

            // Select a MIDI from the MIDI DB (with its exact name)
            midiFileLoader.MPTK_MidiName = "Bach - Fugue";

            // Load the MIDI file
            if (midiFileLoader.MPTK_Load())
            {
                // Read all MIDI events
                List<MPTKEvent> sequence = midiFileLoader.MPTK_ReadMidiEvents();
                Debug.Log($"Loading '{midiFileLoader.MPTK_MidiName}', MIDI events count: {sequence.Count}");
            }
            else
                Debug.Log($"Loading '{midiFileLoader.MPTK_MidiName}' - Error");
        }
    }
}

The simplest MIDI player:

// ------------------------------------------------------------------------
// Load a MIDI file and play it
// This script is provided in this folder:
// Assets\MidiPlayer\Demo\FreeDemos\Script\TheSimplestMidiPlayer.cs
// ------------------------------------------------------------------------
using UnityEngine;
using MidiPlayerTK;

namespace DemoMPTK
{
    /// <summary>
    /// This demo plays a MIDI file purely by script.
    /// There is nothing to create in the Unity editor: just add this script to a GameObject in your scene and run!
    /// </summary>
    public class TheSimplestMidiPlayer : MonoBehaviour
    {
        MidiFilePlayer midiFilePlayer;

        private void Awake()
        {
            Debug.Log("Awake: dynamically add MidiFilePlayer component");

            // MidiPlayerGlobal is a singleton: only one instance can be created.
            if (MidiPlayerGlobal.Instance == null)
                gameObject.AddComponent<MidiPlayerGlobal>();

            // When running, this component will be added to this gameObject. Set essential parameters.
            midiFilePlayer = gameObject.AddComponent<MidiFilePlayer>();
            midiFilePlayer.MPTK_CorePlayer = true;
            midiFilePlayer.MPTK_DirectSendToPlayer = true;
        }

        public void Start()
        {
            Debug.Log("Start: select a MIDI file from the MPTK DB and play it");

            // Select a MIDI from the MIDI DB (with its exact name)
            midiFilePlayer.MPTK_MidiName = "Bach - Fugue";

            // Play the MIDI file
            midiFilePlayer.MPTK_Play();
        }
    }
}

Updating Maestro MPTK

Before updating, check that you are using Unity version 2019.4.28 or newer. Maestro has been tested with the latest 2019 (LTS), 2020 and 2021 versions.

Unity Official Releases

If you have never downloaded Maestro from the Unity Asset Store, download the Free MPTK or the Pro MPTK here.

Unity update limitation

Unity asset import has a limitation: an update only adds and replaces files, so files removed from the asset will not be removed from the Unity project.

Consequently, if you are updating Maestro Midi Player Tool Kit, some older classes will remain in the folders because the Unity import won't remove these files, and you will get tricky issues.

If you get strange errors after updating, remove the whole MidiPlayer folder.

Please follow these steps:

First, back up your project.

Second, remove the entire current Maestro version in the Unity Editor project panel.

That means removing the MidiPlayer folder. Warning: all your MIDIs and SoundFonts will be deleted, but it's also possible to preserve them; look here.

Delete the folder MidiPlayer (one column mode)
Delete the folder MidiPlayer in the two columns mode

Finally, import Maestro. See how below:

So don't forget: never create your own assets (scenes, scripts, …) in the MidiPlayer folder, or you will have a hard time updating Maestro. In the example above, TestMPTK is a separate asset which uses the Maestro asset.

Unity Package Manager

For updating, go to the menu Window / Package Manager.

See the Unity documentation here.


To update, search for Maestro under "My Assets". If you can't find it, perhaps you have never downloaded Maestro from the Unity Asset Store. Download it here: the Free MPTK or the Pro MPTK.

Windows view
MacOS view with Maestro update from 2.82 to 2.88.2

Preserve your Resources!

When updating, it's also possible to keep your MIDIs and SoundFonts, but that needs a few operations:

After making a backup, delete these folders from your project.
Then delete these files, but not the Resources folder!
Keep only the MidiDB and SoundFontDB folders.

You can now download and install the latest version of Maestro MPTK.

How to Generate Music With High Time Accuracy

The main difficulty with Unity and generated music is the timing precision and accuracy of each note. Humans are able to distinguish a delay between sounds down to about 20 ms; below that, we hear only one sound. Obviously, accuracy in music is important!

Look at this post for more information about MIDI timing.

To get good accuracy, the method used to trigger notes is, of course, crucial!

Which methods are available?

From the worst to the best! If you are in a hurry, read only the 4th 😉

  1. The worst is to use the classic Unity Update method. Unity can't guarantee that the calls are stable, so the tempo of your music will not be stable!
  2. The second, slightly less bad, method is to use the coroutine API. WaitForSeconds could be used to wait between each beat, but it doesn't actually wait for the specified time; instead it checks every frame whether the specified time has elapsed. That means there is always a margin of error roughly equal to the time between frames. At low frame rates this is very noticeable.
    See here the order of execution for Unity event functions.
  3. The third, almost acceptable, method is to use a C# thread. This method was used in the previous MPTK demo. But it wasn't perfect either.
  4. That's the reason why we added a new method to the MPTK Pro API: OnAudioFrameStart. This callback is run before each audio frame is processed. Thanks to the audio engine thread for doing the job!

OnAudioFrameStart, our hero!

OnAudioFrameStart is an event defined in the base class of MPTK Pro, so it is available for these two MPTK classes: MidiStreamPlayer and MidiFilePlayer. Classically, you will use the MidiStreamPlayer prefab to generate notes from your script.

With the MPTK API you can associate a callback function with this event. Of course, you are responsible for writing this callback function. The callback receives as a parameter the synthesizer time in milliseconds. You can use it to decide which action/note to perform. See the example below to understand how to transform this absolute time into beat time.

This event is triggered at a very precise time depending on the synthesizer rate and the buffer size. See the prefab inspectors to change these values.

It's easy to calculate the latency:

latency_ms = (BufferSize / SynthRate) * 1000

Synth rate (Hz) | Buffer size (samples) | Delay between audio frames (ms)
----------------|-----------------------|--------------------------------
96000           | 512                   | 5.33
96000           | 1024                  | 10.67
48000           | 256                   | 5.33
48000           | 512                   | 10.67
48000           | 1024                  | 21.33
48000           | 2048                  | 42.67
24000           | 1024                  | 42.67
Calculation grid of latency vs. synth settings (rate and buffer size)
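The grid above is just the latency formula evaluated for a few common settings. A minimal sketch in plain C# (no Unity or MPTK dependency):

```csharp
// Latency of one audio frame: the time needed to fill the buffer at the synth rate.
static double LatencyMs(int bufferSize, int synthRate)
    => (double)bufferSize / synthRate * 1000d;

// Examples matching the grid:
// LatencyMs(512, 96000)  -> 5.33 ms
// LatencyMs(1024, 48000) -> 21.33 ms
// LatencyMs(2048, 48000) -> 42.67 ms
```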

Important notes (to be read!):

  • It's not possible to call Unity API functions inside the callback function (PlayHits in the example), because the callback runs in a different thread than Unity, outside the Unity lifecycle. One exception is Debug.Log, which can be useful, but read the next notes.
  • Processing in the callback must obviously be as quick as possible, otherwise the sound will become disastrous. Try to keep all costly processing outside the callback and communicate with the callback through buffers, List<>, Queue<>, …
  • Latency is not only the audio engine's responsibility! It's also the hardware and the UI … and, sorry, your code!
  • On Android with Oboe, the latency of the audio engine is excellent: 5.33 ms 🙂
  • On iOS, we advise you to select the best latency from the Unity menu:
From Unity 2017
From Unity 2020

Code example

I encourage you to look at the demo EuclideanRhythm and the source code TestEuclideanRhythme.cs to see the full example (MPTK Pro only). Below are some extracts:

Define the callback function to be called at each audio frame:

void Start()
{
    BtPlay.onClick.AddListener(() =>
    {
        IsPlaying = !IsPlaying;
        Play();
    });
}

/// <summary>
/// Start or stop playing.
/// Associate the PlayHits function to process MIDI generated music at each audio frame.
/// </summary>
public void Play()
{
    lastSynthTime = 0f;
    timeMidiFromStartPlay = 0d;
    timeSinceLastBeat = 999999d; // start with a first beat

    if (IsPlaying)
        // Associate the callback function (PlayHits) with the event
        midiStream.OnAudioFrameStart += PlayHits;
    else
        // Remove the association, the callback will no longer be called
        midiStream.OnAudioFrameStart -= PlayHits;
}

The PlayHits method

The PlayHits method is where you can generate MIDI notes (it's your code, you can name it as you want). It's a callback function: normally you never call this method from your code (but it's possible), the call comes back from the system 😉

How you generate these notes is outside the scope of MPTK: Markov chains, star positions (yes, that exists!), procedural generation, … See here, the subject is huge! I'm sure you will be creative!

/// <summary>
/// This callback function will be called at each audio frame.
/// The frequency depends on the buffer size and the synth rate (see the inspector of the MidiStreamPlayer prefab).
/// Recommended values: Freq=48000, Buffer Size=1024 --> called every 11 ms with high accuracy.
/// You can't call the Unity API in this function (only Debug.Log), but most of the MPTK API is available.
/// For example: MPTK_PlayDirectEvent or MPTK_PlayEvent to play a music note from an MPTKEvent (see PlayEuclideanRhythme).
/// </summary>
/// <param name="synthTimeMS"></param>
private void PlayHits(double synthTimeMS)
{
    if (lastSynthTime <= 0d)
    {
        // First call, init the last time
        lastSynthTime = synthTimeMS;
    }

    // Calculate the time in milliseconds since the last call
    double deltaTime = synthTimeMS - lastSynthTime;
    lastSynthTime = synthTimeMS;
    timeMidiFromStartPlay += deltaTime;

    // Calculate the time since the last beat played
    timeSinceLastBeat += deltaTime;

    // SldTempo is in BPM.
    // 60 BPM means 60 beats per minute, or 1 beat per second.
    // 120 BPM would be twice as fast: 120 beats per minute, or 2 per second.
    // Calculate the delay between two quarter notes in milliseconds.
    CurrentTempo = (60d / SldTempo.Value) * 1000d;

    // Is it time to play a hit?
    if (IsPlaying && timeSinceLastBeat > CurrentTempo)
    {
        timeSinceLastBeat = 0d;

        // Create a random note.
        MPTKEvent sequencerEvent = new MPTKEvent()
        {
            Channel = 0,
            Duration = 1000, // 1 second
            // Use System.Random to generate the note
            // because the Unity API can't be used outside the Unity thread.
            Value = Rnd.Next(50, 62),
            Velocity = 100,
        };

        // Send the note to the MPTK synthesizer.
        midiStream.MPTK_PlayDirectEvent(sequencerEvent);
    }
}

Have fun!

SoundFont Exclusive Class – Some Clarification

Within a SoundFont, sounds can be mutually exclusive: for example, a ‘closed hihat’ sound will terminate an ‘open hihat’ sound ringing at the same time.

This behavior is modeled in SoundFont using 'exclusive classes': turning on a sound with an exclusive class other than 0 will kill all other sounds having that exclusive class within the same preset or channel.

Exclusive class defined with Polyphone

Obviously, Midi Player Tool Kit implements this standard behavior. But we let you choose whether or not to activate it.

By script, use the method MPTK_KillByExclusiveClass and set it to false to disable the kill by exclusive class. By default the value is true; if you want to change it, call this method once after the synthesizer is initialized. See here how.
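For example, a minimal sketch of disabling it once the synthesizer is ready (the exact signature of the OnEventSynthStarted listener may differ slightly in your MPTK version, so check the API reference):

```csharp
using UnityEngine;
using MidiPlayerTK;

public class DisableExclusiveClass : MonoBehaviour
{
    public MidiFilePlayer midiFilePlayer;

    void Awake()
    {
        // Wait until the synthesizer is initialized before changing its parameters.
        midiFilePlayer.OnEventSynthStarted.AddListener(name =>
        {
            // Disable the kill-by-exclusive-class behavior (default is true).
            midiFilePlayer.MPTK_KillByExclusiveClass(false);
        });
    }
}
```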

With the inspector:

With the current version (2.841), these parameters are only visible with the MidiFilePlayer prefab. They will be extended to the other prefabs in the next version.

Midi Player API Documentation

The Midi Player Tool Kit API documentation published with this Unity asset contains static documentation, mainly in PDF format.

  • Description of the Application Programming Interface (also in CHM format).
  • A quick start in PDF.
  • Migration helper from V1 to V2.

You can read the API PDF documentation below on this page before downloading the package. If you want to start with Unity scripting, go to this page.

For a quick start, it's better to read the up-to-date version: go to this page.

MPTK-API-V2

Set Synthesizer Parameters at Startup

Use case: you want to generate music with an instrument (preset, patch, program: there are a lot of synonyms!) other than piano at the startup of your application.

By default, when the synthesizer is loaded, all channels are set to preset 0. Generally, preset 0 is associated with piano, especially with a General MIDI SoundFont.

The method MPTK_ChannelPresetChange must be used to change the preset with a MidiStreamPlayer prefab.

But the method must be called just after the synthesizer is initialized. So, we use the event OnEventSynthStarted to call EndLoadingSynth (or whatever name you want!) in the script.

This process can be applied to a lot of synthesizer parameters, for example Kill By Exclusive Class or Release Same Note (if the same note is hit twice on the same channel, the older voice process is advanced to the release stage).

Look here for detailed information and a script: how to change the SoundFont bank and preset when the application is started.
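Putting the pieces together, a minimal sketch (preset 40 is violin in a General MIDI SoundFont; the channel and preset numbers are just examples, adapt them to your needs):

```csharp
using UnityEngine;
using MidiPlayerTK;

public class SetPresetAtStartup : MonoBehaviour
{
    public MidiStreamPlayer midiStreamPlayer;

    void Awake()
    {
        // EndLoadingSynth will be called back once the synthesizer is initialized.
        midiStreamPlayer.OnEventSynthStarted.AddListener(EndLoadingSynth);
    }

    void EndLoadingSynth(string synthName)
    {
        // Set channel 0 to preset 40 instead of the default piano (preset 0).
        midiStreamPlayer.MPTK_ChannelPresetChange(0, 40);
    }
}
```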

Don't forget to initialize your prefab in your script; see the link here.

Midi Timing

Timing in music is very important. So MIDI files include a number of parameters related to keeping time.

A lot of information here has been extracted from https://majicdesigns.github.io/MD_MIDIFile/page_timing.html

Look also at this post to understand how to get and verify good timing accuracy with MPTK.

What is a Beat?

The fundamental time unit of music is the beat. Beats can be slower or faster depending on the kind of music, and the tempo (speed of the beats) can change even within a single piece. Tempo in standard music notation is typically given in beats per minute (BPM).

The initial tempo is given by MPTK_InitialTempo in the Midi Player Tool Kit API.

In music, a bar is a segment of time defined by a given number of beats of a given duration. The values which define a bar are called the time signature.

Notes come in different power-of-two lengths. A MIDI quarter note normally is one beat long. A half note is two beats, and a whole note is four beats (as it takes up a whole measure, if you're in 4/4).

So:

  • An eighth note is half a quarter note, so there are two eighth notes per beat,
  • a sixteenth note is half an eighth so there are 4 sixteenths per beat,
  • and so on.

Time Signature

A time signature is two numbers, one on top of the other. The numerator gives the number of beats in a bar, while the denominator describes what note value a beat is (i.e., how many quarter notes there are in a beat).

Time Signature Numerator is given with MPTK_TimeSigNumerator in Midi Player Tool Kit API.

Time Signature Denominator is given with MPTK_TimeSigDenominator in Midi Player Tool Kit API.

Denominator | Note value of a beat
------------|----------------------------
1           | whole note (semibreve)
2           | half note (minim)
4           | quarter note (crotchet)
8           | eighth note (quaver)
16          | sixteenth note (semiquaver)

So:

  • 4/4 would be four quarter notes per bar (the MIDI default),
  • 4/2 would be four half notes per bar (or 8 quarter notes),
  • 4/8 would be four eighth notes per bar (or 2 quarter notes), and
  • 2/4 would be two quarter notes per bar.

The default MIDI tempo is 120 BPM, and the default Time Signature is 4/4.

However, the Set Tempo and Time Signature meta events can change these defaults. As MIDI only deals in quarter notes, the Set Tempo meta event also only deals in quarter notes, so the time signature is needed as well. If the time signature is 4/8, a quarter note is not a beat, since a beat is described as an eighth note, so using the quarter-note tempo on its own to calculate beats per minute is incorrect.
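That correction can be sketched in plain C# (no MPTK dependency; the values used in the examples are the MIDI defaults discussed above):

```csharp
// Set Tempo gives microseconds per quarter note; the time signature
// denominator tells what note value the beat is, so the real BPM
// must be scaled accordingly.
static double BeatsPerMinute(int microsecondsPerQuarter, int denominator)
{
    double quarterNotesPerMinute = 60000000d / microsecondsPerQuarter;
    // Denominator 4: the beat is a quarter note (no scaling).
    // Denominator 8: the beat is an eighth note (twice as many beats).
    return quarterNotesPerMinute * denominator / 4d;
}

// BeatsPerMinute(500000, 4) -> 120 BPM (the MIDI default)
// BeatsPerMinute(500000, 8) -> 240 BPM: in 4/8, the beat is an eighth note
```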

Have a look here for more detail: List of musical symbols

MIDI Beat Time

Musical timing is defined in fractions of a musical beat, so it makes sense to create a timebase that measures time as fractions of a beat.

A quarter note is always one fourth of a whole note – regardless of the tempo. Similarly, a sixteenth note is always the same fraction of a beat. The rate at which the notes occur can change as the tempo changes, but the relative durations are always the same.

So an ideal timebase divides a musical beat into many small parts that occur at a rate determined by the current tempo. Each of these tiny fractions of a beat is called a tick, and the number of ticks per beat is independent of the tempo.

The Standard MIDI File header chunk contains a 16-bit value that gives the number of ticks per quarter note. If it is not specified, the MIDI default is 48 ticks per quarter note. This value is constant over the whole file.

Within the MIDI data stream are tempo meta-events, which contain a 24-bit value giving the number of microseconds per quarter note. Divide the second by the first, take the time signature into account, and you get the number of microseconds per tick.

Ticks per quarter is given with MPTK_DeltaTicksPerQuarterNote in Midi Player Tool Kit API.

Standard Midi File Time Specification

Events in a Standard MIDI File are defined in terms of delta time. The delta time determines when an event should be played relative to the track's last event, in ticks. A delta time of 0 ticks means that the event plays simultaneously with the last event. A track's first event delta time defines the amount of time (number of ticks) to wait before playing this first event. Events unaffected by time are still preceded by a delta time, but should always use a value of 0 and come first in the stream of track events.
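A sketch of how a sequencer turns those delta times into absolute tick positions (plain C#, hypothetical helper):

```csharp
// Delta times are relative to the previous event of the same track:
// summing them gives the absolute position of each event in ticks.
static long[] ToAbsoluteTicks(int[] deltaTicks)
{
    long[] absolute = new long[deltaTicks.Length];
    long position = 0;
    for (int i = 0; i < deltaTicks.Length; i++)
    {
        position += deltaTicks[i]; // 0 means "same time as the previous event"
        absolute[i] = position;
    }
    return absolute;
}

// ToAbsoluteTicks(new[] { 48, 0, 24 }) -> { 48, 48, 72 }
```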

Sequencing Time

Delta times are stored as ticks, so what we need to know now is how many ticks make up a quarter-note.

This is given with MPTK_DeltaTicksPerQuarterNote in Midi Player Tool Kit API.

The number of microseconds per quarter note is given in the Set Tempo meta event and is 500,000 by default if not specified. So:

microseconds per tick = microseconds per quarter note / ticks per quarter note

The number of microseconds per quarter note is given by MPTK_MicrosecondsPerQuarterNote in the Midi Player Tool Kit API.

Delta times are cumulative: the next event's delta time needs to be added to the current time once it has been calculated. If the MIDI time division is 60 ticks per beat and the tempo is 500,000 microseconds per beat, then 1 tick = 500,000 / 60 = 8333.33 microseconds. The fractional number of microseconds must be properly accounted for, or the MIDI playback will drift away from the correctly synchronized time.
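One simple way to avoid that drift is to never work with a truncated per-tick value, but to recompute times from the tick count. A sketch in plain C#:

```csharp
// Convert an absolute tick position to microseconds without accumulating
// rounding error: multiply first, divide last.
static long TicksToMicroseconds(long ticks, int microsecondsPerQuarter, int ticksPerQuarter)
{
    return ticks * microsecondsPerQuarter / ticksPerQuarter;
}

// With 60 ticks per quarter note at 500,000 microseconds per quarter note:
// TicksToMicroseconds(60, 500000, 60) -> exactly 500000 (one beat),
// whereas summing 60 truncated per-tick values (60 * 8333) gives 499980.
```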

Are you still there? Bravo!

Performance: Accuracy and CPU Load

Timing accuracy

  • The MIDI sequencer and the audio synthesizer run in separate system threads, not in the Unity thread; consequently, playing music has no impact on your game or application timing.
  • You can modify the Unity audio parameters from the MPTK prefab, so the synth rate and buffer size can be adapted to your needs.
    • A lower buffer size lowers the latency.
    • A higher rate raises the CPU load and therefore lowers the timing accuracy.

Tip: avoid using Unity Update() and Unity coroutines to create rhythmic music; they are not stable enough for creating music. Rather, use a system thread or a system event. The best approach is to use the Maestro (Pro) callback method OnAudioFrameStart. Look here.

Synth Parameters – Impact on DSP/CPU Performance

The main synthesizer parameters which impact performance are:

  • Interpolation: the process of using two or more data samples to create a data sample between them.
  • Rate: the rate at which the Unity internal synth (DSP) plays the samples.
  • Buffer size: the length of the sample buffer, which is rebuilt at each cycle of the DSP.

The measurement below was done with the demo TestMidiStream. With this configuration, 25 voices are playing simultaneously:

  • Piano with one sample
  • New note every 0.2 second 
  • Duration of 5 seconds

Effect Parameters – Impact on Performance

Effects are defined in the SoundFont for each instrument, so each instrument can have a dedicated set of effects. See here. Conversely, Unity effects apply to the whole AudioSource, so the same effect is applied to all the instruments in the MIDI.

Remark:

  • Effects are available only with the Pro version of MPTK.
  • By default, effects are disabled in the MPTK prefabs. You have to enable them from the dedicated inspector or by script with the MPTK API.

The measurement was done only for the SoundFont effects.

First test:

  • Reed organ with two samples
  • One continuous note

Second test:

  • Piano with one sample or reed organ with two samples
  • New note every 0.2 second 
  • Duration of 5 seconds

Two other possibilities: play with the sample duration

  1. Increase the cut-off volume and lower the load on the device:

2. Lower the default SoundFont release time: the overall duration of all samples will be decreased.

The release time is the last phase of the ADSR envelope: the time taken for the level to decay from the sustain level to zero after the key is released.

Info: this setting will be moved into the synth foldout in the next version.
