Architect: Does anyone have a * macro/script
- KVRian
- 698 posts since 7 Dec, 2009 from GWB
Some improvements coming soon:
- KVRian
- 698 posts since 7 Dec, 2009 from GWB
Here's an update to Wigout:
http://vze26m98.net/loomer/Wigout-19022 ... 190224.zip
Here's a link to my previous version for those who don't know what I'm talking about.
viewtopic.php?p=7322368#p7322368
Here are the changes:
. I ripped out the code that made "slant" ciggles and twiggles. I found it just too hard to get the shapes I wanted with it included. If you want slanted segments, you can still set ploc to something like 0.0 or 1.0 and fix it there.
. I re-worked the wavetable view which now works like so: the view is set to 1024 elements, which is the maximum it can display. When a waveform is pre-computed, the first 1024 points are displayed in the table view. If your waveform is longer than 1024 points, you can use the <-Pg and Pg-> controls to page through the segment. Note that the numbers on the view's top don't update, but the "page" you're on is printed to console, so with a little arithmetic you can figure out exactly where you are. You can also now Clear the buffer, and don't have to look at little bits of crap when your waveform ends short of a 1024 point modulus.
. The Chandra Lua script should precompute on launch, but I've retained a Pre-compute signal to handle changes.
. I removed the [nth tick] stuff, so there's now a one-to-one correspondence between your points and the PPQN. Can't go any faster now!
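For anyone paging through a long waveform, the "little arithmetic" is just page × 1024 plus the on-screen index. A tiny sketch, assuming the console prints zero-based page numbers (that detail isn't stated above):

```python
VIEW_SIZE = 1024  # points shown per page in the wavetable view

def absolute_index(page, view_index):
    """Convert a (page, on-screen index) pair into the true buffer position."""
    return page * VIEW_SIZE + view_index

# e.g. element 5 on page 2 sits at buffer position 2 * 1024 + 5
```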
So with the above you can comment out the sine curve code on line 118 to get this:
and set the pamp segment definition to zero to get this:
Enjoy!
- KVRian
- 735 posts since 8 May, 2002 from ... , germany
AZZIN wrote: ↑Tue Jan 08, 2019 9:56 pm Ok, cannot resist: here is a first output of a hopalong sonification experiment.
It is based on (x,y) tuples of a hopalong algorithm, mapped to pitch and velocity on three voices, with some probability outcomes on the note generation itself.
https://soundcloud.com/albertoz/archite ... ong-fields
Still basic stuff, but the results are surprisingly promising!
Indeed surprisingly promising... a fun thing.
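For the curious, the underlying hopalong map (Barry Martin's formula) is simple to iterate. A minimal sketch; the pitch/velocity mapping below is purely illustrative, since the actual three-voice mapping used in the experiment isn't shown:

```python
import math

def sgn(v):
    """Classic sign function: -1, 0 or +1."""
    return (v > 0) - (v < 0)

def hopalong(a, b, c, n, x=0.0, y=0.0):
    """Iterate Barry Martin's hopalong map and collect (x, y) tuples."""
    pts = []
    for _ in range(n):
        x, y = y - sgn(x) * math.sqrt(abs(b * x - c)), a - x
        pts.append((x, y))
    return pts

def to_midi(pts, low=36, high=84):
    """Illustrative mapping: fold x into a MIDI pitch, y into a velocity."""
    return [(low + int(abs(x)) % (high - low), 1 + int(abs(y)) % 127)
            for x, y in pts]
```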
There seems to be a bug with scale selection via the panel. The scale actually used is the one before the one you pick in the dropdown: pick composite-blues, and blues will be used.
All in all this is a great compositional experiment. Thanks.
Regards,
tl.
- KVRist
- 123 posts since 25 Jul, 2004 from Italy
Hi tl, thanks for pointing it out... I'll check. Glad you found it interesting.
EDIT: ok, the attached simple modifications ("+1" module and "add a comma" on the selected dropdown) will restore the scale functionality. Regards,
Alberto
- KVRist
- 123 posts since 25 Jul, 2004 from Italy
Hello, attached to this post there is a port of the Musinum algorithm (https://reglos.de/musinum/) developed by L. Kindermann, implemented in Architect.
This is an interesting algorithm, both from a musical and a mathematical point of view. Starting from a counter "i", a "number" is produced (from a "start" number and a "step"); the number is then converted to a different "base", and the single digits of the new number are summed up. For example (from the Musinum site):
Code: Select all
decimal  binary  sum of
 number  number  digits  Tone  2nd  4th
      1       1       1     c
      2      10       1     c    c
      3      11       2     d    .
      4     100       1     c    c    c
      5     101       2     d    .    .
      6     110       2     d    d    .
      7     111       3     e    .    .
      8    1000       1     c    c    c
      9    1001       2     d    .    .
     10    1010       2     d    d    .
     11    1011       3     e    .    .
     12    1100       2     d    d    d
Please note the 2nd and 4th columns: here we have a "self-similarity" effect, which is a very interesting property of the output series.
The Architect implementation follows this (Matlab/Octave) code (with some limitations):
Code: Select all
% example initialization
N = 1000;
start = 0;
step = 1;
base = 2;
% main loop
for i = 1:N
  num = start + i*step;
  D = dec2base(num, base);
  L = length(D);
  temp = 0;
  for j = 1:L
    temp = temp + str2num(D(j));
  end
  out(i) = temp;
end
Now, if you have Matlab/Octave on your computer, try the following:
Code: Select all
% in case of base 2
out(1:20)
out(2:2:40)
out(4:4:80)
% in case of base 3
out(1:20)
out(3:3:90)
out(9:9:180)
and you will get identical sequences, i.e. self-similar ones.
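For readers without Matlab/Octave, the same loop can be sketched in Python (a direct translation of the code above, not the Architect macro itself):

```python
def musinum(n, start=0, step=1, base=2):
    """Return the first n terms: digit sum of (start + i*step) in the given base."""
    out = []
    for i in range(1, n + 1):
        num = start + i * step
        s = 0
        while num > 0:          # sum the digits of num in the chosen base
            num, digit = divmod(num, base)
            s += digit
        out.append(s)
    return out

seq = musinum(80)
# Python is 0-indexed, so Matlab's out(2:2:40) becomes seq[1:40:2]:
# every 2nd term reproduces the series itself (the self-similarity above).
print(seq[:20])
print(seq[1::2][:20])
```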
The Architect implementation is centered on a macro ("musi_num") which has the following inlets:
- input: an incremental/decremental number triggered by a Metronome, for example set to 1/8
- step: (>=1), as in above code
- start: starting integer (>=0), as in above code
- base: numerical base in which the number is converted
- max range: maximum MIDI note to which the output is bounded (default 66)
- min range: minimum MIDI note to which the output is bounded (default 46)
The main output is a number (left outlet) that can easily be converted into a MIDI note, while an experimental output* (right outlet) provides a number in [0, 1] that is translated to MIDI velocity by means of the a-linmap macro. These two outputs can then be passed through some MIDI processing (execution probability, force to scale, etc.). Attached is a simple implementation with one MIDI channel of output.
Experiment with the different parameters, especially "step" and "base", but also with increasing and decreasing input numbers ("UP", "DOWN") to get interesting sequences. To that end, a simple graphical interface is provided, which allows selection of the main parameters as well as scale, min/max MIDI velocity and channel probability. The default output is MIDI OUT, so you have to attach a MIDI instrument outside Architect or switch the output to Track<N> (internal to Architect).
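Since a-linmap itself isn't shown here, the two mappings involved can be sketched generically. The helper names are my own, and whether the macro wraps or clamps the note range isn't stated, so this version wraps with a modulo:

```python
def linmap(v, in_lo, in_hi, out_lo, out_hi):
    """Map v linearly from [in_lo, in_hi] onto [out_lo, out_hi]."""
    return out_lo + (v - in_lo) * (out_hi - out_lo) / (in_hi - in_lo)

def bound_note(digit_sum, note_min=46, note_max=66):
    """Fold the raw digit sum into the [note_min, note_max] MIDI note range."""
    span = note_max - note_min + 1
    return note_min + digit_sum % span

velocity = round(linmap(0.5, 0.0, 1.0, 1, 127))  # mid-range velocity
note = bound_note(23)                            # wraps into 46..66
```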
Suggestions, feedback, error reports are welcome.
Have fun,
Alberto
*This idea is taken from the Musinum implementation in Pure Data by M. Brinkmann.
EDIT: here is a piano piece generated with two instances of the algorithm.
https://soundcloud.com/albertoz/praelude-for-a-robot
-
- KVRist
- 63 posts since 22 Feb, 2019
Alberto, that is a great algorithm. Listening to the SoundCloud, I thought it would be nice to alter the outgoing stream of notes so that a certain rhythm is imposed. I have just finished reading the long Architect beta thread after learning about Architect. Are there macros for rhythmic styling?
- KVRist
- 123 posts since 25 Jul, 2004 from Italy
Hi gminorcoles, welcome onboard! Yes, the algorithms I have worked out up to now are mostly "pitch-related". An interesting "rhythm generator" is something I'm looking for as well. There is a Euclidean rhythm generator on the first page of this thread, but I'm also waiting for the library of examples that will come out of v1.0 of this beast.
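As a starting point for rhythm experiments, the Euclidean idea can be sketched in a few lines (a common Bresenham-style formulation, which yields a rotation of the canonical pattern; the generator on page one of this thread may well differ):

```python
def euclid(pulses, steps, rotate=0):
    """Spread `pulses` onsets as evenly as possible over `steps` slots."""
    return [((i + rotate) * pulses) % steps < pulses for i in range(steps)]

# E(3, 8) gives the familiar tresillo-like pattern: x..x..x.
pattern = ''.join('x' if hit else '.' for hit in euclid(3, 8))
```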
-
- KVRist
- 63 posts since 22 Feb, 2019
I feel like a clave template transformer would be cool. You start with a set of predefined rhythmic phrases which can be parametrically varied, and then set a schedule for superimposing them onto the piece.
-
- KVRist
- 83 posts since 5 Jan, 2019
AZZIN wrote: ↑Mon Apr 22, 2019 10:11 pm Suggestions, feedback, error reports are welcome.
This is very interesting! I enjoyed the sample. Thanks for sharing!
-
- KVRer
- 16 posts since 27 Sep, 2019
Hey, sorry for asking the obvious; I'm new to Architect.
There is a question that came up trying Wigout, regarding internal MIDI mapping.
I created a track and tried to automate a parameter of a LADSPA plugin with Wigout. However, Wigout sends its values to a MIDI output node. I can select the track of the plugin, but I don't know how to map it to the parameter of choice. I also tried to add a MIDI mapping and created a numeric source node, connecting it to the node that is also connected to the value input of the pack controller, but the results are strange.
- KVRian
- 698 posts since 7 Dec, 2009 from GWB
I haven’t looked at Wigout for a while, but I’m not sure Architect can output anything but MIDI. So you’d have to translate Architect’s MIDI output into a signal that your DAW can use as automation.
If there is a numeric output, I'm sure its use is hampered by the 14-bit MIDI CC encoding. You'd have to fix that.
I’ll take a look later and see if there’s anything more to add to this comment.
-
- KVRer
- 16 posts since 27 Sep, 2019
No, I mean I created a track in Architect and added the plugin there, then right-clicked the parameter to add a MIDI mapping. I couldn't find out how to assign Wigout to it. Not in an external DAW; I edited the Wigout preset.
- KVRian
- 698 posts since 7 Dec, 2009 from GWB
Wigout is set up to output on the Breath Controller, which is CC 2/34 MSB/LSB. Are you mapping it correctly, handling 14 bits rather than 7?
This is the underlying fragment: viewtopic.php?p=7311970#p7311970
You could in theory utilize the LSB, but I’m not sure the Wigout algorithm would be very cooperative. You’d have to rewrite it as 7-bit.
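For anyone unsure how the MSB/LSB pair works: CC 2 carries the top seven bits and CC 34 the bottom seven, and the receiver recombines them. A minimal sketch of the generic MIDI math (not Architect-specific):

```python
def split_14bit(value):
    """Split a 0..16383 value into (msb, lsb) 7-bit halves for CC 2 / CC 34."""
    if not 0 <= value <= 16383:
        raise ValueError("14-bit CC values range from 0 to 16383")
    return value >> 7, value & 0x7F

def combine_14bit(msb, lsb):
    """Reassemble the 14-bit value from the two 7-bit controller bytes."""
    return (msb << 7) | lsb

msb, lsb = split_14bit(12345)
assert combine_14bit(msb, lsb) == 12345
```

Mapping only the MSB (CC 2) as a plain 7-bit controller is why values can look coarse or wrong if the LSB is ignored.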
-
- KVRer
- 16 posts since 27 Sep, 2019
Hmm, I haven't really gotten my head around the difference between 14 bits and 7; I just don't know enough about MIDI yet. Are there learning resources available somewhere? Maybe it would be a good idea to start a new thread to collect these resources.
So I right-click the handle, add a mapping, set source type MIDI CC, and then either Controller 2 or 34? Channel I leave at 1? Now I just press play, but things don't change visibly. Sorry for asking these really nooby questions. I'm also guessing that most of these will be solved in the next bigger release, as you announced that it will focus on extensive documentation (maybe with people like me you can see how basic some things need to be explained).
- KVRian
- 698 posts since 7 Dec, 2009 from GWB
I’m afraid I’m not the developer of Architect, so I can’t really respond to this. It sounds like some basic understanding of MIDI would be of help to you. Craig Anderton’s book “MIDI for Musicians” is very good. Though out of print, I’d bet there’s a scan somewhere on the Internet.