Once I tried to start a fair discussion... :)


Post

xhunaudio wrote: Fri Nov 15, 2019 3:53 pm Hi,

talking about next Monday: for the ADC 2019 "Build a synth with SOUL" exhibition you said something about a "local SOUL installation"... Are you talking about Tracktion Waveform, a VST/AU wrapper, or standalone software?
We expect people will either use Waveform, or the soul console app, or the web playground to write the code. Then we're going to have some embedded hardware devices to deploy the code onto.
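
To give a feel for what people will be writing, a synth voice is just a processor with an output stream; roughly something like the sketch below. It's based on the pre-release syntax, so built-ins such as twoPi and processor.period may differ in the released SDK.

processor Sine
{
    output stream float out;

    void run()
    {
        let frequency = 440.0;   // fixed pitch, just for the sketch
        float64 phase;

        loop
        {
            // write one sample per frame, then move to the next frame
            out << float (sin (phase)) * 0.15f;
            phase += twoPi * frequency * processor.period;
            advance();
        }
    }
}

Something along those lines should be enough to get a tone out of the web playground or the console tool.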

Post

Thank you Jules,

Since I find this topic very interesting (I created this thread, after all) I would be tempted to ask for more and more, and more... (for example, which hardware and software companies have been "involved" so far in these early SOUL stages), but I imagine you can't give us too much info about this. It would be one of the most valid sources for a sneak preview of the future of computing architectures, as I stated previously, since SOUL is just a single aspect of something much greater.

So, if I could bother you once more :) I would just ask if you have already "had a talk" with Cycling'74, a company that has actively cooperated with ROLI in the recent past. SOUL seems like it could potentially be one of Max/Gen's best friends...
bruno @ Xhun Audio || www.xhun-audio.com || Twitter || Instagram

Post

Yep, Cycling'74 would just need to rewrite every Max (control) and MSP (DSP) primitive in SOUL; then programmers would basically visually describe a SOUL graph, which is JIT compiled on the fly and executed efficiently. And that could even be rendered as a web page loading the WebAssembly version of the same graph... 8)

Post

sletz wrote: Sat Nov 16, 2019 8:39 am Yep, Cycling'74 would just need to rewrite every Max (control) and MSP (DSP) primitive in SOUL...
Eheh, just a Gen->SOUL translator and/or a SOUL "patcher container" block would be enough.
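
For a rough idea of how a patcher could map onto SOUL: a graph is basically the textual version of boxes plus patch cords, i.e. node declarations and a connection list. Here's a sketch using the pre-release syntax (the let/connection block keywords and the [[ main ]] annotation may differ in the released SDK):

// A trivial gain "box"
processor FixedGain
{
    input  stream float in;
    output stream float out;

    void run()
    {
        loop
        {
            out << in * 0.5f;
            advance();
        }
    }
}

// The graph is the patcher: each node is a box, each connection a patch cord
graph TwoGainsInSeries  [[ main ]]
{
    input  stream float audioIn;
    output stream float audioOut;

    let
    {
        gain1 = FixedGain;    // two instances of the same box
        gain2 = FixedGain;
    }

    connection
    {
        audioIn   -> gain1.in;
        gain1.out -> gain2.in;
        gain2.out -> audioOut;
    }
}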
bruno @ Xhun Audio || www.xhun-audio.com || Twitter || Instagram

Post

EDIT: here's an announcement published 7 days ago. Interesting new (minor) details have been disclosed.

The first public release of the SOUL SDK is now available for developers
https://juce.com/discover/stories/soul- ... ta-release

Still waiting for the ADC 2019 "Build a synth with SOUL" video tutorial...
bruno @ Xhun Audio || www.xhun-audio.com || Twitter || Instagram

Post

xhunaudio wrote: Wed Nov 20, 2019 2:47 pm Still waiting for the ADC 2019 "Build a synth with SOUL" video tutorial...
Yeah.. There was some kind of cock-up with the AV and that video never happened :(
The workshop was pretty awesome, 3 hours of working through examples, and we ended up doing the first demos of SOUL patches loading on the AKAI Force...

We're really annoyed that the whole thing wasn't recorded, but we'll film another similar session as soon as we can and will share that.

Post

Great, let us know when the new video will be out.

Important: this time, be sure to turn your camcorder on.
bruno @ Xhun Audio || www.xhun-audio.com || Twitter || Instagram

Post

Quite distant from music production, but seeing how it operates as a JIT, what about SOUL being used in other circumstances? For example, real-time game engines (UE4, Unity or even Godot) could use some alternatives to already established options like Wwise/FMOD (usually preferred by big companies) or their own internal systems. Any plans for that?

Post

Absolutely! I'd not pitch it as a replacement for Wwise/FMOD (as these engines do much more than just the raw DSP), but consider it to be a portable plugin standard that could be used within these engines.
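
For context, the unit such an engine would host is just a self-contained processor with declared audio endpoints; roughly something like this one-pole low-pass (a sketch based on the pre-release syntax; how the engine loads and drives it is a separate question and not shown here):

// The kind of small, self-contained DSP block an engine could host
processor OnePoleLowpass
{
    input  stream float audioIn;
    output stream float audioOut;

    void run()
    {
        let coeff = 0.1f;   // fixed smoothing coefficient, just for the sketch
        float state;

        loop
        {
            // y[n] = y[n-1] + coeff * (x[n] - y[n-1])
            state += coeff * (audioIn - state);
            audioOut << state;
            advance();
        }
    }
}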

Post

I'm not yet getting why SOUL is something new. JIT compilation, or any sort of abstraction, is ages old... And yes, you can abstract away at any level.

Disclaimer: I want to be honest, I didn't read the whole thread, but am just replying to the opening assumption that a JIT approach might be "to our rescue" ;-)

Okay, here's an incomplete list of abstraction levels:
* Microcode: make your processor "understand" different instruction sets
* OS: use VMs, Docker containers, HALs (hardware abstraction layers)... Wine
* VM: like GraalVM (an upcoming, super fast, resource-optimized Java VM that natively supports a bunch of other languages as well: R, JavaScript, ...)
* Plugin wrapper: wrapping VST3 to Logic or Pro Tools, back and forth
* Plugin wrapper + language: like Blue Cat or REAPER's JS plugins
* Dynamic/static: cross-compilers + abstraction libs (Xcode, JUCE), JITs

Why don't I think it is completely new? Check out plugins that let you code in a general-purpose, non-ecosystem-specific language (I would call Swift ecosystem-specific). This means you can do a lot of things. But what you cannot fight with these is any supplier "locking away" any of these possibilities for whichever (economic) reason: security, platform control, monetization, ...

The last example for my point that "we're already there", at least to some extent, is Bitwig:
1.) Their plugins use the "wrapper + general-purpose interpreted/JITted language inside" approach. There's one GitHub repo where a contributor took a plugin and infused his own code. Here's the repo; the language is called Nitro: https://github.com/zezic/bitwig-device-hacks
2.) The GUI is written in Java (please don't start a fight about Java being bad in your eyes ...)
Both are abstractions and, provided there's a runtime environment for them, are able to run everywhere. If Apple now decides to kill Java ...
That said, I'm not sure why SOUL is something conceptually new which will solve more problems ...

Best Regards
] Peter:H [

Post

] Peter:H [ wrote: Sun Nov 24, 2019 8:49 am I didn't read the whole thread, but am just replying to the opening assumption that a JIT approach might be "to our rescue" ;-)
The fact that SOUL backends will (probably) use JITs isn't important. Neither is the fact that (obviously!) almost all computing is about abstraction layers of one kind or another.

The point is that audio doesn't have a high-performance open platform that can move DSP workloads off CPUs and onto more suitable low-latency devices (or at least run it safely with driver-level privilege on the CPU). And such a platform would need to get widespread hardware and software support across the industry to be useful. That's what we're doing. (And it'll make writing DSP code much more enjoyable, easy and safe as a bonus)
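
To make that concrete, here's a rough sketch of a parameterised gain in the pre-release syntax. Everything the code can touch (streams, events, a fixed amount of state) is declared up front, and the language has no heap allocation, pointers or system calls, which is what makes it plausible to verify the code and run it off-CPU or in a privileged context. (The endpoint and annotation details may not match the released SDK.)

processor SafeGain
{
    input  stream float audioIn;
    output stream float audioOut;

    // Control arrives as events, not via shared memory
    input  event  float gain  [[ name: "Gain", min: 0, max: 1, init: 0.5 ]];

    float currentGain = 0.5f;   // all state is fixed-size and known at compile time

    event gain (float newLevel)
    {
        currentGain = newLevel;
    }

    void run()
    {
        loop
        {
            audioOut << audioIn * currentGain;
            advance();
        }
    }
}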

As I replied to someone else earlier, I don't have time to personally pitch the "why is SOUL great" story to everyone who comes along saying they don't get it, but you can watch my talks about it or read the docs in our github repo if you want to learn more.

Post

jules wrote: Sun Nov 24, 2019 9:16 am The point is that audio doesn't have a high-performance open platform that can move DSP workloads off CPUs and onto more suitable low-latency devices... I don't have time to personally pitch the "why is SOUL great" story to everyone who comes along saying they don't get it, but you can watch my talks about it or read the docs in our github repo if you want to learn more.
Sure, if you don't want to pitch, I understand. Still not convinced... Talking about moving off CPU:
Let's start with driver-level privileges: this won't be on any of my systems at any time. Potentially JIT-compiled audio code running at driver level... hm, simply no.
DSP: I've owned two Scope boards (6 TI DSPs each) since they were released, around 2005 or so ;-) Some DSP power to do things off CPU. The current Scope Xcite has about 16 even more potent DSPs, and those come with an incredible SDK as well. It's like you have DSP atoms you can drag/drop and wire together... Seems like this is the visual audio-graph equivalent. Still, Xcite has never made it past niche-product status. Why? Because customers want everything seamlessly integrated in *their* DAW, with no hassle with extra hardware and stuff. And putting things off CPU onto dedicated hardware has one disadvantage: it adds latency, even more if the computed buffers need to go to the GPU as well to update the real-time views of things happening that we all love. That, for instance, is why offloading audio to the GPU has not yet made it into many VSTs; I actually only know of one reverb, and Fathom Synth, which plans to do so...

And the problem with a single audio platform on a single OS that all my plugins would depend on... it's two components that can break *all* your plugins and render your investments unusable in a single second, when you immediately regret pushing the OK button on the update dialog. It's actually funny how often devs break things. I still remember the Scala update from 2.11 to 2.12 where those Scala dorks screwed up a project I was involved in...

Speaking of DSP, here are the Scope links: the landing page http://scope.zone/, the SDK http://scope.zone/index.php?id=1386&lg=en with these DSP atoms http://scope.zone/index.php?id=1838&lg=en

Cheers
] Peter:H [

Post

Peter,

That sounds great, but it's a specific solution for a specific piece of hardware. When the manufacturer stops supporting it, your investment will be dead. If that's something you're OK with, then fine, but I'll hazard a guess that you're unlikely to get the DSP you've designed with it running in a web browser, right?

Post

(...)

Post

"Fake "companies" arbitrarily discarding support for *GEMS* like OpenGL WITHOUT ANY REASON"

OpenGL was an abomination even for a C library. Apple did not kill OpenGL; OpenGL was already dead two years ago, with the Khronos board solely focusing on Vulkan. You won't find even a single video on their website about OpenGL. Let's not even talk about documentation. Let's not even talk about performance. Let's not even talk about buggy implementations.

Apple was pushing for Vulkan for quite some time but apparently no one showed any interest, so out of frustration they left the board and went to create Metal. After all, it was Apple and only Apple that popularized OpenGL in the first place. The board panicked, because it's made up of the biggest computer companies, and they put everything into pushing Vulkan.

Apple did not kill OpenGL, it just moved the corpse outside macOS. OpenGL was dead two years before that.

I do agree, though, that Apple's move to drop OpenGL support was stupid. Backward compatibility matters a ton; no wonder it made a lot of devs unhappy. 99.5% of the users, however, did not even notice, for the simple reason that they were using it indirectly via a game engine. By users I don't mean players, but devs. Those APIs, after all, are not designed to be used directly but act as a middle layer.

But I am not supporting Apple anymore; my next computer will definitely not be an iMac. I am already slowly moving to Linux (Manjaro) and, to my shock, I even like Windows 10. Apple will never be the same without Steve Jobs. But then nothing lasts forever.

I wish you all the best with your language, but I am skeptical. C++ has not been the king of performance for no reason, even though it is an abomination syntax-wise as well. But then it is always great to have alternatives that make life easier.

