Eigenharp Developers Conference, Open Source, 2.0 and Other Sundries
The first ever Eigenharp Developers Conference will be happening the last weekend of January in the UK. Why a developers conference? Well, if you've been following the Eigenharp, the software was promised to be made Open Source ever since the original announcement. That did in fact happen this year.
In fact, anyone can view the source here on GitHub.
EigenD on GitHub
A group of programmers and musicians interested in extending the platform are travelling to a lodge outside of London for a deep dive into the EigenD internals. I'm travelling from the West Coast of the USA to be there. I really cannot wait.
From davek4981 youtube channel
"Mark playing Building Steam With A Grain Of Salt by Dj Shadow on the Eigenharp Alpha"
We've always been told that there is untold power lurking inside the EigenD host software. In the early days, however, it was mostly hidden behind a user interface that was not yet mature. If you've read my previous posts, you can see the progression of the software and usability that has culminated in the 1.4.x release of EigenD. This release is quite complete as a stable and mature platform for the Eigenharp. Each of the three Eigenharp models comes with its own factory setups, and users can customize these, building a library of custom setups that can be called up at will.

1.4.x represents the end of the 1.x line. It delivers the initial vision of an instrument and host that let a musician go out and perform live without being a slave to a laptop screen like most electronic musicians. The MIDI and plugin support is top notch, with best-in-class support for all the different tweak points an electronic musician might expect in a performance rig. Add to that the Stage app, which can run on an iPhone or iPad for remote control of performance and sound parameters, and you have a very solid performance setup. 1.4.x is also Open Source, which gives some peace of mind regarding long-term support of the platform.

As good as 1.4.x is, it represents as far as the platform can be taken without some major changes and re-plumbing, which will occur in 2.0. As you can imagine, a lot has been learned from actual users. Most notably, the original concept for a tool called Workbench, which allows extensive customization of the system, was scrapped before the product was released because it was not ready for prime time; the usability was apparently not good. Without Workbench, 1.4.x cannot fulfill the entire vision of unlocking the power within EigenD so that users can innovate beyond the factory setup frameworks.
With the prerelease of EigenD 2.0, we see a Workbench worth waiting for and a reworked engine that addresses many of the sharp edges regarding customization and the use of the Belcanto language for interacting with the platform.
Why did Eigenlabs create its own host software in the first place? When I say host, I mean the software necessary to actually be a sound source for audio and a host for audio plugins like VSTs and AUs, as well as to play back samples from soundfonts and loops.
We've been told that the reason for EigenD is that such an instrument needs a very high bandwidth and low latency environment for dealing with the large volumes of performance data moving through the instrument. If the instrument was forced to communicate with a separate host via midi or Open Sound Control, the latency would be increased and the performance data would have to be severely trimmed down.
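To see why a MIDI or OSC link to an external host forces the performance data to be trimmed down, a back-of-envelope comparison helps. The figures below are my own assumptions, not published Eigenlabs numbers: roughly 120 keys on an Alpha, each reporting pressure plus two positional axes, scanned at around 2 kHz, compared against the fixed 31,250 baud of a classic DIN MIDI connection.

```python
# Rough comparison of raw Eigenharp-style sensor bandwidth vs. classic DIN MIDI.
# All instrument-side figures are assumptions for illustration only.

KEYS = 120            # assumed key count (Alpha-class instrument)
AXES_PER_KEY = 3      # pressure, roll, yaw (assumed)
SCAN_RATE_HZ = 2000   # assumed per-key sensor scan rate

sensor_values_per_sec = KEYS * AXES_PER_KEY * SCAN_RATE_HZ

# Classic DIN MIDI runs at 31,250 baud; with 10 bits on the wire per byte
# and 3 bytes per typical channel message, that's only ~1,000 messages/sec.
MIDI_BAUD = 31250
midi_msgs_per_sec = MIDI_BAUD / 10 / 3

ratio = sensor_values_per_sec / midi_msgs_per_sec
print(f"sensor values/sec: {sensor_values_per_sec:,}")
print(f"DIN MIDI msgs/sec: {midi_msgs_per_sec:,.0f}")
print(f"ratio: ~{ratio:,.0f}x")
```

Even if the real numbers differ by a factor of two or three in either direction, the gap is hundreds of messages' worth of data per MIDI message slot, which is exactly why a tightly integrated host makes sense.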
In the early days of EigenD, this custom host approach was definitely a liability. Why would a company produce a complex host when so many mature DAWs were already on the market? With the first release of EigenD, yes, the instrument would play the factory sounds and built-in instruments with extreme control and finesse, but connecting to the outside world of sounds was obtuse and did not provide the same experience. With 1.4.x, we have a complete and capable host that shows benefits over existing DAWs even when controlling sound sources like VSTs and AUs, which DAWs have down pat. For example, it's possible to map all the control points on the instrument to VST parameters, poly aftertouch, etc. at a high data rate, higher than DAWs will deal with gracefully. Such an approach falls down when trying to do the same with an external host such as Ableton Live. Furthermore, the latency when using an external host starts to border on the unacceptable. EigenD has a latency of roughly 7 ms from key press to audio. 7 ms is extremely fast and gives you a very immediate connection to the control of sound, as if you were in fact playing a physical instrument such as the guitar or piano.
Boiling the Ocean
You can like or dislike the Eigenharp concept, but few people who understand the challenges would argue against the fact that it is an amazing accomplishment. If you were to set out on a mission to produce a controller that was going to make loads of money in the market in the short term, this would not be it. This vision was not about taking shortcuts, and it wasn't solely about making money. It was about creating something significant with a no-compromises approach. Having no compromises also meant a daunting full-circle vision for the combination of hardware and software. The hardware certainly delivered from day one. The software is still moving to encompass the full vision. In technology there is always the fear of "Boiling the Ocean", which happens when the goals are so large that they are almost impossible to achieve. This opens you up to all sorts of problems. If the instrument hits the market before the vision is realized, users don't care about your vision; they care about what they can do today. And if they hear of this vision, they will suddenly hold the company to fulfilling these promises forever, even if the product direction completely changes. The Spectralis is a really good example of this. For years the user base has been complaining about not getting promises that are years old, even as the software has grown more mature than what was originally promised. It's painful to watch unfold. How can this be prevented?
From what I've seen, the problem with large visions and promises is that the end users have no visibility or control over their own destiny. They feel they are on the outside looking in, wishing for things to happen that they have no stake in. Meanwhile, company insiders see how amazing the technology is and how hard they are working, and can't understand how users could be frustrated when the company is working so hard to deliver features. There isn't an ecosystem that encourages experimentation, direct customer feedback, sharing, and a shared goal of making things better. Instead it's US and THEM: THEY are the ones who need to do things, WE are the ones sadly waiting for THEM to bring out new features. When such a dynamic is set up, there is literally nothing a company can do to satisfy users, because there is always potential left unfulfilled.
Enter Open Source
So why Open Source? If a user is not technical, why should they care at all that the source code is available? Open Source is a big topic, so we will not dive in too deeply; let's make a few high-level points first. Open Source does not mean free. Just because you can read the source code and build it yourself if needed does not mean the software isn't a paid product. Open Source means just that: you can read the source code. You can build it yourself.
Open Source allows for a few things. One is transparency. Everyone can see the efforts going into the platform. There aren't insiders and outsiders anymore. The black box becomes a transparent box, or at least more transparent than before. There are of course technical barriers, but at least the user community becomes involved in a meaningful way. The gesture alone is a sign of trust and can build bridges to your users.
Open Source is an insurance policy. How many promising and ambitious platforms have gone the way of the Dodo? If a company calls it quits, at least the user base can take over managing the source code for the platform and keep the technology alive, even if forward progress is impeded. It's not a nice thought, but it does remove a major barrier to spending a ton of coin on a platform whose lifespan is unknown.
Open Source creates an ecosystem. Obviously only a small percentage of technical users can properly read and do anything with the source code. However, this can lead to a multi-level ecosystem where the very technical part of your user base does experiments, builds useful things for the community, and communicates findings to the next stratum of power users, who similarly experiment and pass even more useful things on to non-power users.
Open Source Commercial Software
There is some perception that Open Source projects are all very difficult and technical to use, because they are free and contributors don't have the time to make them glossy like a commercial product. Remember that Open Source does not equal free. Eigenlabs will still be the primary driving force and control point for the development of the software. In fact, contributions from outside developers will likely be quite small. This is fine. It allows for development of polished commercial software with the benefits of an open ecosystem that encourages exploration, innovation, transparency and sharing.
Building Ladders
If we have one set of users looking at source code and feeling empowered and another set feeling excluded, we have not connected or empowered our community.
We need to be building ladders. Yeah, another silly analogy from the software technology business. If you are standing on a rung and the next one is 10 ft up, you haven't created a system that users can progress through or in which communication and sharing can occur.
If we have very technical users on one side and non-technical users on the other, how are they connected? There needs to be a bridge from which all users can benefit. This bridge is a new application in the Eigenverse called Workbench, and it will be part of EigenD 2.0.
EigenD 2.0
Eigenlabs has just done a limited prerelease of 2.0 to the developers who will be attending the developers conference in January. This will allow those developers to get acquainted with the capabilities of the 2.0 platform before the conference starts.
2.0 fills in the missing link between what the platform is capable of and what a user can do with it. The infinite configurability of the core platform was never properly exposed in earlier software versions. I for one was extremely skeptical that Eigenlabs could pull off this type of vision. Why should they even make things more configurable? Shouldn't they make things easier and more intuitive for the non technical user?
Yes they should, but what I see now is that even Eigenlabs itself needed to get to 2.0 in order to have the tools to bring about transformation of the platform and to improve the out of the box experience for the non-technical.
Enter Workbench
With Workbench I can see how easy it is going to be to fine-tune, improve, and expand the range of what is possible while improving the whole experience.
Workbench is a new program added to the platform, admittedly aimed at the power user. Workbench presents a Bidule-like components-and-wires view of all the building blocks of the platform and the connections between those components.
Looking at a factory setup in Workbench shows you just how much work EigenD is doing under the covers to bring everything together into a performance setup.
The thing about Workbench is that it is actually quite intuitive. Simple rewiring is fairly straightforward. The atomic units of functionality are called agents; agents connect to other agents and provide useful bits of functionality. More complex assemblies of components are grouped into reusable units called Rigs.
The benefit of Workbench for the average non-technical user is that it is now quite trivial to create very targeted pieces of functionality that are shareable as Rigs.
Some examples: a Rig that simulates strumming a guitar; a Rig that simulates a microtonal drum surface; a Rig that connects to analog equipment via control voltages; a Rig that performs as a drum machine or a sample masher à la Monome.
Video of Geert Bevin interacting with a custom microtonal hang drum surface
The other thing Workbench can do is make things that are extremely simple and lightweight.
For example, I might want a setup that is simply a MIDI control surface, with no additional layers of complexity.
Geert Bevin controlling a Moog analog synth with the Eigenharp
I remember seeing Imogen Heap on David Letterman perform with the Monome. She had custom software developed that grouped the small buttons into larger button surfaces that were easier to hit. The setup was made into something very simple and foolproof to use under the stress of playing live. Many electronic performers have similar strategies for creating ways to make live performance more predictable. EigenD 2.0 should make this sort of thing a lot easier to accomplish, even for those with limited technical ability.
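The "group small buttons into bigger, easier targets" idea described above is simple to express in code. The sketch below is purely illustrative (the function name, grid size, and zone shape are my own assumptions, not any real EigenD or Monome API), but it shows the whole trick: integer-divide the fine key coordinates into coarse blocks.

```python
# Illustrative sketch: collapse a dense key grid into a few large "fat" zones
# so a performer can hit them reliably under live-performance stress.
# All names and dimensions here are hypothetical.

def key_to_zone(row: int, col: int, zone_rows: int = 4, zone_cols: int = 2) -> int:
    """Map a fine-grained (row, col) key press to a coarse zone index.

    Every key inside the same zone_rows x zone_cols block triggers the same
    action, turning many small buttons into a few large, forgiving targets.
    Assumes the surface is two zones wide.
    """
    zones_per_row = 2
    return (row // zone_rows) * zones_per_row + (col // zone_cols)

# Any key in the top-left 4x2 block lands in zone 0:
print(key_to_zone(0, 0), key_to_zone(3, 1))   # both zone 0
# One column further right is zone 1; four rows down starts zone 2:
print(key_to_zone(0, 2), key_to_zone(4, 0))
```

A Rig wrapping a mapping like this is exactly the kind of small, shareable building block Workbench is meant to make easy.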
Here's a screenshot of a small section of setup that uses 6 audio channels with a single AU plugin used for controlling analog modular gear via CV.
A Setup is All You need
The other problem that electronic musicians have is dealing with too many connected and separate components. There may be four or five different pieces of software that all need to be loaded and made to communicate with each other. When settings change, there are similarly four or five separate setups that need to be saved individually in order to get back to the state you were in when you closed the software down.
EigenD provides extensive abilities for restoring a performance state to exactly the way you left it. A template may be built up using a number of reusable Rigs, providing the basic starting DNA for a performance and its set of capabilities. This template then becomes the starting point for any number of performance setups: drop in plugins and samples, set up different mixes, and so on. Every single parameter in the core platform, as well as in the audio plugins, is saved and remembered when you save a setup. This includes remembering exactly the state of the instrument when you closed the setup.
Everything is Transparent
I've talked about the initial frustrations with the early software: a black box that we knew was powerful but with no way to access that power. EigenD 2.0 is the exact opposite. Everything is transparent if you want it to be. The platform's elegance comes from the fact that the building blocks and capabilities of the system are consistent from the bottom to the top. In most DAWs, you have a layer that you can interact with and layers that are hidden and closed to you. EigenD is built from the ground up using the agent concept. Literally everything worth exploring is exposed. This is also what enables very small and lightweight setups to be created. If you've ever been frustrated waiting for a DAW to load, you can create an EigenD instance with an associated performance setup that loads in under a second. Similarly, you can create a setup that pushes your machine to the absolute limits of its processor and memory.
Extending EigenD
For the programmers in the community, the EigenD platform can be extended by programming your own agent. Agents are the DNA of the platform and can be combined into endless combinations of functionality.
An agent follows a pattern of development. Agents inherit common functionality from the core system; they can handle things like very precise timing and clocking, audio streams, data streams, etc., or they may handle utility tasks like controlling the key lighting or transposing a scale.
Every parameter and state of an agent can be persisted in a setup or used as a control in the Stage app for live performance.
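To make the pattern above concrete, here is a tiny sketch of its shape. To be clear, the class and method names are hypothetical; the real agent API lives in the EigenD source on GitHub and looks different. What the sketch shows is the idea: inherit shared plumbing from a base class, declare parameters so they can be persisted in setups and surfaced in Stage, and implement a small unit of processing.

```python
# Hypothetical illustration of the agent pattern; NOT the real EigenD API.

class Agent:
    """Stand-in for the shared agent plumbing an agent would inherit."""

    def __init__(self, name: str):
        self.name = name
        self.params: dict[str, float] = {}  # saved with a setup, shown in Stage

    def declare_param(self, key: str, default: float) -> None:
        self.params[key] = default

class TransposerAgent(Agent):
    """A tiny utility agent: shift incoming note numbers by an interval."""

    def __init__(self):
        super().__init__("transposer")
        self.declare_param("semitones", 0)

    def process(self, note: int) -> int:
        return note + int(self.params["semitones"])

t = TransposerAgent()
t.params["semitones"] = 7   # imagine this being set from Workbench or Stage
print(t.process(60))        # middle C up a fifth
```

Because every parameter is declared rather than hidden, the same agent can be wired in Workbench, saved in a setup, and controlled from Stage without any extra work, which is the point of the pattern.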
Some of you may be asking: why not use an existing plugin API such as VST or AU instead?
Agents are far more generic and flexible than any plugin format, though they are specific to the EigenD universe. They can be lightweight or heavyweight and connected in infinite ways, and they can be full participants in the flow of all data in the system.
Plugins such as VSTs and AUs still play a very important role in EigenD, but they are used at a higher level: for example, as part of a Rig for hosting Audio Units, or as part of an effects chain or effect insert.
Only a very small percentage of users will want to create or modify an agent, however when they do, all power users will be able to take advantage of the agent directly in Workbench and Stage to create new setups using it.
Conclusion
Well, I did tell you it was a bold vision, one that may never pay back its initial investment in monetary terms, but I am grateful that such an instrument was created. It gives me hope that projects like the Eigenharp can still be brought into existence by sheer audacity of vision and tireless commitment to the value of creativity and innovation without compromise. I certainly don't think every project needs large funding and a multi-year skunkworks effort to innovate. It does make me smile to think that such projects are still possible, though.
Look at the way the Alpha sends all its data across a standard instrument cable. That protocol and data processing architecture needed to be invented and in itself is a major innovation. Yet it enables a musician to move around the stage like a guitar player would. No compromises.
Lately I've had some of the most gratifying, connected sessions with this instrument, and I've crossed some invisible barrier where I am learning fast enough that I want to play it constantly. It was a slow start as an early adopter, and I even gave up for a while when the software was immature and frustrating, but now the barriers have fallen. It's truly something you can bond with as a musician, something beyond the throwaway culture of the next plastic box. I call it an instrument and not just a controller because it takes virtuosity to play. The community is literally learning how to learn it. It must be practiced like a new instrument, and it does not mimic any other instrument in technique, even if things like the breath pipe are immediately recognizable. The technique involved is both unique and authentic; this is the real deal, not a hack to simulate something else. After a while you start learning advanced techniques: string bends, slides, different gating techniques, two-handed playing, and so on. These are things I associate with the progression of learning a guitar, but their execution is surprising, and new techniques emerge that no other instrument can offer. It amazes me to have one hand playing a percussion instrument while the other plays something with manual vibrato and subtle pitch and timbre bends. Fertile ground for experimentation.
I do now see how the original vision will become reality for users. This is an amazing instrument and platform. There is no equal anywhere. Yes it is complex, but it now hides this complexity under a mature, accessible and very capable host while allowing access and extensibility to all levels of player from beginner to pro users as well as to programmers wishing to extend the platform. Even if you never changed out a sound in the factory setup, you could play, learn and enjoy this instrument for a lifetime.
I send you off with a video from one of the initial Eigenharp demo players (Dave K.) that to me expresses the kind of sessions and experience I like to have with the instrument.
From davek4981 youtube channel.
"Iambic 9 Poetry On The Eigenharp (Squarepusher Cover) "
Sunday, January 1, 2012
Saturday, May 28, 2011
Eigenharp Revisited
It's been almost a year since my last post. Pathetic, I know, but worst of all, my last post left the story of the Eigenharp at "The Supreme Difficulty of Birthing a New Instrument", a sort of cliffhanger with many questions left unanswered.
Eigenlabs has indeed been busy in the last year and has made great strides in addressing usability concerns as well as simply maturing the product into something fit for a more mainstream player.
I wish I could say that I have matured as a player along with the platform, but I took quite a long hiatus from the instrument after a bit of burnout from being an early adopter. There were a few early design decisions in the software that were a nightmare for usability, most notably, having to control the user interface of EigenD on the computer with the harp itself. Luckily, Eigenlabs listened to users and fixed this issue.
The most significant improvement is the introduction of "Stage" which is a new user interface that exposes all the internal routing and settings of the system as a set of user defined controls that you can tweak on the computer or from your iPhone or iPad remotely. This is a brilliant solution for controlling all the settings that are buried within EigenD. It's quite easy now to create very detailed custom setups for oneself, save these or even share them with other users. You basically choose from a menu of options and are able to create new tabs, and drag and drop controls onto them. This interface and all settings are stored along with your saved EigenD setups and can be recalled at will. This can totally replace the awkward original concept of controlling the computer user interface from the harp.
"Stage" was the feature that brought me back to the Eigenharp and convinced me to reengage. Generally the workflow is to drop instrument plugins (VSTs, AUs) into slots in EigenD, along with the ability to connect via Midi or to use soundfonts for sampled instruments, you then create one or more performance setups while sitting in front of the computer. Once you have a setup, you can divorce yourself from the computer and simply recall the different instruments with the eigenharp itself. Because the stage interface is available on an iPad or iPhone, this can be used on stage for simple tweaks to volume of the mix and other settings you wish to make on the fly.
Another big improvement to EigenD is the pair of configuration grids for connecting the various controls on the harp to parameters of different instruments. One grid allows the linking of eigenharp controls such as strip controllers, breath, and key controls to Audio Unit and VST parameters. The other grid is dedicated to MIDI control.
The MIDI settings window is one of the most full-featured MIDI implementations I have ever come across. EigenD can now do very advanced things such as send 14-bit MIDI control information instead of 7-bit. It can send poly pressure to instruments that support it. But most importantly, it can use multichannel MIDI to send each key press's data stream on a separate MIDI channel. This allows per-note control of all the key expression (pitch, yaw, pressure, velocity) available on the harp, and it can do this using 14-bit CCs if you wish. Some notable plugins, such as Spectrasonics Omnisphere and Trilian and Native Instruments Kontakt, support multichannel setups and can offer great expression with such a configuration.
To understand the amazing sensitivity and control that the eigenharp offers, you really need sounds that are designed specifically to take advantage of the additional control aspects the harp can offer.
To demonstrate such a sound, I reconfigured a sound that Edmund Eagan had designed for the Continuum (another interesting and expressive controller) to work with the eigenharp. The sound engine is a KYMA Pacarana, and the connection uses a multichannel MIDI setup as I mention above. The result is a sound that would be impossible to control in such a way from a keyboard.
The last point that I wish to mention regarding Eigenlabs is around the original promise from Eigenlabs founder John Lambert to Open Source the EigenD code. I'm happy to say that Open Sourcing did indeed recently happen. There are many benefits to this. Firstly, a development community can form around connections to EigenD as well as the creation of add ons, bug fixes, etc. Secondly, Open Sourcing creates confidence in the personal investment of owning such an instrument. If Eigenlabs went away in the future or dropped support for the original eigenharps, the software would still be able to be built, upgraded and published by the community at large.
Finally, the last gap that has been filled is support for Windows OS and specifically Windows support for the larger Alpha and Tau models. Eigenlabs has just released an early version of 1.4 that brings Windows OS support to these instruments.
Indeed, Eigenlabs has been busy, and I for one applaud the great progress. If you've been on the fence about picking one up, the water is now safe to enter.
Friday, June 25, 2010
Eigenharp: The Supreme Difficulty of Birthing a New Instrument
CDM posted a blog entry on the Eigenharp Alpha today and linked my video.
CDM on the Eigenharp Alpha
This is the perfect opportunity to write an entry that has been on my mind regarding this instrument. The topic being the supremely difficult task of bringing a radical instrument concept into the world.
I work in software technology and have been in and out of Silicon Valley startups, before and after the tech crash in 2001. One of the things we talk about is the technology adoption curve. It takes a while for a new technology, whether disruptive or not, to become accepted. Many times the companies that develop a new technology are not the ones to capitalize on it. This is because they may be a startup and can't stick it out for as long as it takes before the technology moves beyond early adopters and becomes mainstream. Many times a company that champions a new technology paves the way for others to take the concept further, once the market has started to accept it.
Take a look at the adoption curve for a technology that I was involved with in the early days, "Web Services". It took a long time for Web Services to become adopted, but now it's a pervasive part of the technology landscape and the backbone of services such as Google, Amazon, Facebook, and countless others.
Web Services Adoption Curve
Geoffrey Moore's famous book "Crossing the Chasm" has been required reading for generations of technology leaders. The chasm is the gap between early adopters and the rest of the curve.
Crossing the Chasm on Wikipedia
I think this is fairly relevant to the Eigenharp. Not only is it a revolutionary technology as far as the resolution of the keys and the bandwidth of the information flowing through it, but it is also radical in the sense that it doesn't copy an existing instrument form. Sure, it has pieces of different instruments, keys, a breath pipe, but really there is no previously existing instrument that prepares you for playing the eigenharp. There is no established method for technique or standard chord positions. It's really a new thing. This is difficult because beyond the technology adoption curve, you also have a learning curve. It's as if the guitar were just thrown into the world brand new and there were a half dozen people trying to figure out how to play Stairway to Heaven.
Add to that difficulty the fact that external software instruments, DAWs and such are still stuck at MIDI resolution, so you don't yet have a virtuous circle where the resolution of the eigenharp can be fully expressed through the entire ecosystem of plugins and electronic sound engines.
Right now, the players are early adopters. And who is a candidate early adopter? Well, I can see already that programmers are drawn to it. Why? We are comfortable with technology. We are drawn to the pure technical innovation it represents. We make a decent income and can afford it. We are already geeks, so we are not afraid to be labeled one. This is challenging because maybe it would be better to have early adopters that were "musicians" first and foremost. I think this will happen, but it will take time, and time is expensive.
Now the good news is that Eigenlabs seems to be fairly well funded and capable of riding the curve. If you can make a business with early adopters, then hopefully you can cross the chasm.
The one thing I do wish for is for the promise of open-sourcing the software to be realized, as that has enormous benefits for building a community of contributors to the platform. If your early adopters are programmers, that's a lot of talent that could be working for you. John Lambert, the chairman of Eigenlabs, has already promised that this will happen, although it is already 5 months behind schedule while details and licensing are worked out. I believe that this will be critical to their long term success. I do believe John when he says he is absolutely committed to doing it.
As I have said before, this is a wonderfully expressive instrument. It's a joy to play and I hope that we will see better and better things come out of it.
Monday, June 21, 2010
New Spectralis SoundBank - Analog Modulars 2010
This weekend I created my first soundbanks for the Spectralis II. Big deal right? What's significant about this is that I've finally come full circle to how I got started with synthesizers in the first place. The Spectralis was my first really serious synth. In hindsight it wasn't a great first choice since the depth of the machine requires someone who actually knows something about subtractive synthesis to get the most out of it. So I was pretty lost for a good long while. But some knowledge did trickle in and I read a lot of books, played around with other simpler synths, both hardware and software. I got involved with the monome and writing software and doing all sorts of other audio projects. The Spectralis languished.
During this time, I started building an analog modular synth, and that was definitely the best learning experience I could ever have regarding subtractive synthesizers. Even though you can't save patches on the modular, I did start getting in the habit of sampling patches that I liked before I tore them down. Fast forward a year and there is a growing list of sampled analog modular patches sitting here.
So to close the circle, I've gone back to the Spectralis, which has suddenly opened up to me with my newfound education. What was incomprehensible now makes perfect sense, and I see the logic of this powerful machine.
The Spectralis has its own wonderful Analog/Hybrid voice (digital OSCs, true analog filters) which sounds as good as any modular on its own. This sample pack is not for that voice. This pack is for the three additional polyphonic digital synth voices that the Spectralis also packs inside it. These voices require a sampled audio source as raw material to work with. After that, you can adjust envelopes, filters, LFOs, pitch envelopes, pan, fx and many other parameters to sculpt the sound. Because these voices are digital, they can sound digital if the audio sources underneath are too predictable. Enter the modular.
This sample set is 400MB of analog oscillations that are raw, harmonically rich and unpredictable. Feeding these into the digital sections of the Specki gives me a lot of raw analog clay to mold for additional voices. Notes may beat subtly out of perfect tune in a chord. Some patches are inharmonic or have large timbre changes across their range.
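For the curious, the subtle beating mentioned above has simple arithmetic behind it: two oscillators detuned by a small interval swell and fade at the difference of their frequencies. Here's a rough sketch of that relationship (the 440 Hz pitch and 8-cent detune are just illustrative numbers, nothing specific to these samples):

```python
def detuned_beat_rate(f_hz, cents):
    """Beat rate (in Hz) between a tone at f_hz and a copy of it
    detuned by the given number of cents (100 cents = 1 semitone)."""
    f_detuned = f_hz * 2 ** (cents / 1200)  # cents to frequency ratio
    return abs(f_detuned - f_hz)

# Two oscillators at 440 Hz, one nudged 8 cents sharp,
# beat roughly twice per second.
rate = detuned_beat_rate(440, 8)
```

So when two sampled modular oscillators sit a few cents apart, a held chord slowly churns instead of sitting perfectly still, which is a lot of the charm.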
I quite like this idea and think this will be the first of more Soundbanks to get produced this way.
It's an almost 300MB download. If you have a Spectralis, you can get it here.
Analog Modulars 2010 - Spectralis Soundbank
Analog Modulars 2010 by bar|none is licensed under a Creative Commons Attribution-No Derivative Works 3.0 United States License.
Friday, May 7, 2010
Hyper-Idiomatic Expression in Controllers
I came across this term today in a radio show on ImprovFriday. It was coined by one of the performers, Richard Bailey when describing his work which is wonderful. I encourage you to follow the link below.
Link is here:
http://improvfridayradio.bandcamp.com/track/if-radio-presents-intro-to-if-3-paul-muller-hosts
Basic gist goes like so: lots of music controllers are very good at certain distinct styles of performance, or at least impart the characteristics of the controller itself onto the music. While I'm not one to throw around academic and frankly obtuse terms for musical performance, it did get me thinking.
from Merriam-Webster
Idiomatic:
1. "of, relating to, or conforming to idiom"
2. "peculiar to a particular group, individual, or style"
Idiom:
"a style or form of artistic expression that is characteristic of an individual, a period or movement, or a medium or instrument"
Now the "hyper" part is debatable depending on the controller. Hyper being accentuated or extremely active.
If you look at classical musical instruments, their physical layout and properties, whether strings, hammers on strings, wind, woodwind, or percussion, all play a vital role in how they are performed and have also influenced the styles of music that they are associated with.
There isn't a music controller in existence that leaves no trace of its physical, mechanical and technical characteristics on the music it produces. Some have suggested the keyboard is such a device, since it is so adaptable and can make any sound, whether sampled or generated.
Yes, keyboards use the black and white keys of a piano, which have a certain structure and pitch layout modelled after the western notion of tones and half tones. The act of striking the key does not have the same effect as breath or strumming or the physical properties of strings. It definitely is idiomatic of certain styles. There may be many styles, but the instrument itself did play a part and does impart a footprint of its characteristics on the music itself.
Eigenharp is similar: it has the ability to sound like almost any instrument, and it adds a breath controller and strip controllers for things like bowing a cello or violin. Yet even it has properties that will leave a distinct footprint over time. The fingering of the keys, their percussive nature, similar to a keyboard but different with multiple control axes. The fact that you can glide across the keys, make large movements in pitch and octave within a hand's reach. Adapt the fingerboard to different scales and tunings. Yet even this adaptability can't play guitar like a guitar. It will influence the music and become idiomatic of some styles that are peculiar to it.
Monome: the blank slate controller. The methods of interaction and the model of such exist in software. It can be a sample slicer, looper, keyboard, video controller, robot controller, whatever. Yes it makes music. It's even good at it. It does impart its idioms on the performance. Certain styles are dominant; the keys are strictly on/off, so key velocity doesn't really work, which forms the basis and bounds of the performance.
This Brings Me to Vagueness
Vagueness here means the degree of predictability when interacting with the instrument: knowing the exact sound that will come out, based on having heard the instrument before but not knowing how it is set up when you walk up to it.
Ok, so on the vague'O'meter, we have the piano being slanted towards the non-vague side. I walk up to a piano, I hit a key and I have a pretty good idea of what I'm going to get.
Keyboard, middle of the road. I may have an idea of the pitch, basic velocity of a hit, etc...but I am unsure of what patch is loaded. Once I know the loaded sound, I can probably play a known chord pretty easily unless someone has done a strange key mapping.
Eigenharp, same as keyboard. Slightly more vague, the fingerboard may be mapped to any scale even microtonal.
Monome: way vague. Not only do I not know what sound will come out, I might not even get a sound, because maybe the key I press changes a setting, or maybe it's not even controlling sound. There may be some pattern of multiple presses I need to unlock to do something like reslice a loop. Maybe I just raised the curtains on stage.
Now the reason for taking the vague'O'meter test is to make a point.
Here it is:
"The younger you are, the more likely you are to be comfortable with or exposed to vagueness and accepting of such as constituting a performance"
Ok, a lot of older people totally grok vague controllers. But in general, if someone has not been exposed to these concepts they can be very difficult to grasp. Since these concepts are now more prevalent, we have more exposure and acceptance. Just try explaining a monome to an 80-year-old who is used to going to the symphony. The 1:1 connection between the instrument and the idiom is not as strong. Likewise, the concept of a performance is also spread along a vague'O'meter. For example, the connection of a piano to a musical performance is not vague. It's very linear and 1:1. The concept of a performance by scratching a turntable, playing a prerecorded record, is more vague. There are certain prejudices and barriers involved in connecting a vague controller to a performance.
One last thought here and this is regarding the monome. Even though the monome is at the high end of the vagueness scale, there is a 1:1 connection between seeing a performer hitting a simple button and that button affecting a sound or performance. Even though our brains can handle abstract complex vague controllers, we still want to see the physical connection of human to controller to cause and effect. And the pretty lights don't hurt.
Sunday, January 10, 2010
A Dream of Iceland
I spent the day yesterday a prisoner in my house. We had a nasty layer of solid ice on everything. The bamboo in the yard looked like an ice sculpture. Winters here are the impetus for a lot of creative endeavors that we're too busy in the summer to make time for.
This was a perfect time to follow up my recent review of the Eigenharp Alpha with a video performance. I've only had it for a very short time, so actually playing it with skill is going to take a while. In my head I was imagining myself as Pat Metheny playing midi guitar back in the day, but of course I'm not and never will be.
The benefits of using the monome with the eigenharp should be apparent in this video. While you can control a lot from the eigenharp itself, or at least that is the promise, the monome is still king in that area. The eigenharp is the king of expression. Sounds that you may have thought seemed stagnant come alive.
One thing that should not be overlooked is how amazing it is to be able to play different scales on a grid. I've known this from monome apps like SevenUpLive, but when you add an expressive controller like the Eigenharp into the equation, I feel that the possibilities are quite amazing. The difference is that rather than using years of muscle memory to play within a scale on a chromatic instrument, you are instead able to make the connection between where you want to go and getting there faster. I'm sure this can be debated. You can change the scales and tonic on the fly. So if you know how things will sound going from one to the other, the sonic connections I think can happen faster than what we've been used to doing, which is learning scales for years on end.
I'm sure watching this video will not convince you of that. There was a lot going on and I hamfisted it a bunch. I still believe it to be true and I think we will see some amazing things come out of this instrument. Stay tuned.
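For anyone curious how a grid gets locked to a scale in the first place, here's a minimal sketch of the idea. The scale tables and the row offset are my own illustrative assumptions, not how SevenUpLive or EigenD actually implement it:

```python
# Map a (row, col) grid key to a MIDI note, given a scale and tonic.
# Scale tables are semitone offsets from the tonic; the row offset of 5
# scale degrees gives a fourths-like layout common on grid controllers.
SCALES = {
    "major":     [0, 2, 4, 5, 7, 9, 11],
    "minor":     [0, 2, 3, 5, 7, 8, 10],
    "chromatic": list(range(12)),
}

def key_to_midi_note(row, col, tonic=60, scale="major", row_offset=5):
    """Walk up the chosen scale: each column is one scale degree,
    each row shifts the starting degree by `row_offset`."""
    degrees = SCALES[scale]
    index = row * row_offset + col            # overall scale-degree index
    octave, degree = divmod(index, len(degrees))
    return tonic + 12 * octave + degrees[degree]
```

Change `scale` or `tonic` and the same physical fingering lands on different pitches, which is exactly the on-the-fly retuning described above: the player thinks in degrees, not in semitones.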
A Dream of Iceland from bar|none on Vimeo.
Thursday, January 7, 2010
First Impressions of the Eigenharp Alpha
I was lucky enough to recently acquire an Eigenharp Alpha. I'm definitely one of the first, which also shows that I had absolutely no self-control when this beasty was announced: I placed one of their very first orders right after the announcement. Well, it has arrived, I've spent just a few days with it, and here are the first impressions.
The Instrument
The instrument itself is stunning. The wood, the finish, the metal. Everything is indeed very high-end, quality-wise. I was actually concerned that the instrument would be almost too lightweight when I read the specs, but in fact it feels just right and well balanced.
The case is excellent and custom form fitted to the instrument. You could pack it on a plane no problem. The case is very long and so is the harp.
There is a leather strap that is quite unique, with a rotating center metal bezel that clips into the harp so it can rotate at your waist if you want to stand while playing. Personally, so far I prefer using the floor spike and sitting or standing with it that way. I understand now why there is a floor spike. This is not a guitar; you don't just use your left hand on the frets and your right for strumming. In fact you use both hands for playing the buttons. This ergonomic fact makes it very comfortable to play with the floor spike, and it feels very natural after a while. If you've ever wanted to play a standing bass, this is right up your alley, although it's not like playing a bass either.
The Buttons
Ok, this is the magic, the secret sauce, the real stuff. The buttons are magic. No really, they are. They are so sensitive, you really need to touch them to believe it. It takes a very light press to trigger, and normal playing is a very light touch, especially after the first attack. Players who are new death-grip the buttons, and that's where you hear pitch changes, since the x, y, z control kicks in at that point. For midi playing, I really want to just turn off the pitch wheel since it's too crude in midi anyway, whereas with the native instruments and sound fonts you can do wonderful things like vibrato/tremolo, if you've got the chops. You can also use these for other sound aspects like filter control, effect control, whatever.
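"Turning off the pitch wheel" for midi playing really just means dropping pitch-bend messages before they reach the synth. A toy sketch of the idea, and only the idea (this is not EigenD code; it assumes a plain stream of complete channel messages, with no SysEx or running status):

```python
def strip_pitch_bend(midi_bytes):
    """Return the MIDI stream with all pitch-bend messages removed."""
    data = bytes(midi_bytes)
    out = bytearray()
    i = 0
    while i < len(data):
        kind = data[i] & 0xF0                 # high nibble = message type
        if kind == 0xE0:                      # pitch bend: status + 2 data bytes
            i += 3                            # skip the whole message
        elif kind in (0xC0, 0xD0):            # program change / channel pressure
            out += data[i:i + 2]              # these carry 1 data byte
            i += 2
        else:                                 # note on/off, CC, poly pressure
            out += data[i:i + 3]              # these carry 2 data bytes
            i += 3
    return bytes(out)
```

Any note-on/off and CC traffic passes through untouched, so the crude 7-bit bend never muddies the pitch while the native instruments keep their full-resolution vibrato.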
Back to the buttons. They are so sensitive that it takes effort to hit them without triggering a note at a very soft velocity, yet they do not false trigger, which is amazing, since I would think even shaking the instrument might trigger a button. Compare this with the force needed to hit a piano or synth key and you will understand that playing this will not only be different, but will open totally new capabilities for the player regarding speed and range of notes played. Since you can choose a chromatic mode or any of 80 scales, you can really cover a lot of notes easily and quickly, with a lot of control. I can do the scale thing on the monome, but I can't do it with the expressiveness that this instrument provides.
Bowing the cello is one of my favorites so far. Basically this can be done two ways. You can use the large strip controller on one side of the Alpha for long strokes of the bow, or move your finger slightly and quickly for tremolo of the bow. You can also hammer on/off the strip for tremolo, which is probably easier and just as convincing. The second way is to use one of the percussion buttons on the bottom. This highlights how damn sensitive these buttons are. By gliding a finger from one side of the button to the other, you have the same control as the strip controller, but on a small button. You can use a very light touch.
I love how, especially when the cello mode is on, the eigenharp mutters and squeaks a bit when I pick it up, just as an electric guitar does when picked up on stage. That is the sound of an instrument that wants to be played with expression and can be played with skill. Of course there is a dedicated button at the bottom that turns off input when you don't want accidental presses to make a sound.
You can select to play multiple instruments at one time. These could be the native modelled instruments, sound fonts, Audio Unit plugins or midi, and you can layer all of them. For example, layer a grand piano, cello and a synth pad. Because the cello needs to be bowed to play, you can bring it into the mix in a subtle way. Really incredible range of playing just by using different techniques.
The Hardware
The base station is good quality and burly. The instrument cord is high quality and plugs into the bottom of the Alpha. Nice thing, is there is just one cord. So you can walk around same as with an electric guitar. This really makes it a stage instrument. The protocol and technology that delivers so much data over this single mini XLR type connection is amazing.
The Software
OK, this is where it gets muddy. EigenD runs as a menu bar icon in OS X. It has very little UI per se and can run headless (no UI). The important part is really the core software engine, and what little UI is in EigenD is a bolt-on. It's definitely not going to work like you are used to working in a DAW or other music software. There is a browser window, but this is where the paradigm gets weird. The browser is more informational than meant to be a UI that you would operate with a mouse. I would almost prefer that they pick one paradigm and stick with it, rather than part mouse control and part instrument control of what the UI is displaying, because this is confusing for sure. The software is designed to be controlled from the instrument itself. It's clear that was the original design, and the UI seems hastily thrown on. I get the on-instrument control part though, and it is very powerful. Basically I compare this to monome software: you have a range of buttons, and you interact with the software by using buttons to change modes and so on. You need to bite the bullet and spend a day with this and get familiar with it, because this is the way you interact with the software; the Browser UI is really a crutch. On-instrument control is excellent because you can get away from the computer entirely and play it as the instrument it is meant to be. There are holes, however: mixer controls are available on the instrument, for example, but when you are changing values there is no on-instrument feedback at the moment, so you really need to look at the computer screen to see the value. Also, when you use Audio Units, you can't just do everything in the EigenD UI; you actually need to use the on-instrument control on the Alpha to make the proper Audio Unit UI pop up on the computer. This is what I mean by mixed controls: interacting with a computer UI from the instrument itself.
I do think that they will sort this out; it's partially because the UI part of the software is coming late, and it feels very beta still, but progress is happening fast and they are obviously committed to making it better and better. Yes, if you buy one, you are an early adopter. This is high tech stuff, and kudos to Eigenlabs for having the balls to even attempt what they are doing.
So back to the software. There are a few classes of instruments. Native modelled instruments like the cello, clarinet and piano. SoundFonts, where EigenD can play sampled instruments. Then Audio Units, which are hosted in EigenD. You can also use Audio Unit effects: you have two effect slots available that can be placed in front of any instrument, and mixer controls for all these pieces. You also have midi instruments, where EigenD simply sends midi, usually to your DAW or an external midi instrument. In the case of the modular, you can't host Volta in EigenD because it requires a multitrack setup that uses side-chain audio, and there is no support for this in EigenD, so I pipe midi from EigenD to Ableton Live. This actually works OK; I have velocity and aftertouch easily to hand for controlling different aspects of the modular. A native CV controller, however, would be the bomb, and I really hope they get around to releasing one. It would be amazing to have that kind of resolution with the modular.
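To make key pressure usable on the modular side, one option is translating channel aftertouch into a CC that a midi-to-CV interface or an Ableton mapping can grab. A hedged sketch of that translation, working on raw MIDI bytes (CC 1, the mod wheel, is an arbitrary illustrative choice, and none of this is EigenD's actual routing):

```python
def aftertouch_to_cc(message, cc_number=1):
    """Convert a channel-pressure message (0xDn, value) into a CC message
    (0xBn, cc_number, value) on the same channel; anything else passes
    through unchanged."""
    data = bytes(message)
    if data and (data[0] & 0xF0) == 0xD0:     # channel aftertouch
        channel = data[0] & 0x0F
        return bytes([0xB0 | channel, cc_number, data[1]])
    return data
```

Sitting between EigenD's midi output and the DAW, a filter like this turns finger pressure into a continuous control line you can patch to a filter cutoff or VCA level.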
Conclusion
I'm very happy so far. The playability is amazing. There are different levels of control. You have most control over the modelled instruments, next the soundfonts, then the AUs, then the midi. Even though we get reduced to midi at the bottom level, it is still extremely viable, and just the velocity control and touch of the keys alone is enough to make the playability something extremely special.
The software still needs refinement. This was a product locked in secrecy for years. Now they need to expose it to the whims of the public. This is an important phase: listen to users, but still stay true to the design and adapt to how the instrument really wants to be used by players.
The underlying software engine seems solid, which is the core and heart of the system. The on-instrument control is really what this is about. EigenD uses a special scripting language called Belcanto to create configurations. Right now the configurations are presets, but in the future you can deeply customize your setups using it. The presets on the Alpha will keep you busy for a long time though, so it's not really an issue.
The other aspect is playing the instrument itself. This is an instrument on which you can do some amazing stuff out of the box, way more than if, say, you had never touched a keyboard before and were trying one for the first time. That is really a good comparison. I know that over time I will be able to play this much more proficiently than I will ever play the keyboard. Guitar players rejoice...this is for you. But it is not exactly guitar either. So it is an instrument that you will need to learn to play. Some people don't want this. DJs, you might want to take a pass.
More Info
The Sound On Sound article from Nov 09 I felt had the most information and was also the most fair and truthful in its assessment. Read it if you want to know more.
The Instrument
The instrument itself is stunning. The wood, the finish, the metal. Everything is indeed very high-end quality wise. I was actually concerned that the instrument would be almost too light weight when I read the specs but in fact it feels just right and well balanced.
The case is excellent and custom form fitted to the instrument. You could pack it on a plane no problem. The case is very long and so is the harp.
There is a leather strap that is quite unique with a rotating center metal bezel that clips into the harp so it can rotate at your waist if you want to stand while playing. Personally, so far I prefer using the floor spike and sitting or standing with it that way. I understand now why there is a floor spike. This is not a guitar, you don't just user your left hand on the frets and your right for strumming. In fact you use both hands for playing the buttons. This ergonomic fact makes it very comfortable to play with the floor spike and it feels very natural after awhile. If you've ever wanted to play a standing bass, this is right up your alley although it's not like playing a bass either.
The Buttons
Ok, this is the magic, the secret sauce, the real stuff. The buttons are magic. No really, they are. They are so sensitive, you really have to touch them to believe it. It takes a very light press to trigger, and normal playing is a very light touch, especially after the first attack. New players death-grip the buttons, and that's when you hear pitch changes, since the x, y, z control kicks in at that point. For midi playing, I really want to just turn off the pitch wheel since it's too crude in midi anyway, whereas with the native instruments and sound fonts you can do wonderful things like vibrato/tremolo, if you've got the chops. You can also use these axes for other sound aspects: filter control, effect control, whatever.
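To make the idea concrete, here's a tiny sketch of what that x, y, z to midi translation might look like, including the option I want of simply turning the pitch wheel off. The byte formats (channel pressure, 14-bit pitch bend) are standard midi, but the mapping choices and the `key_to_midi` helper are purely illustrative, not EigenD's actual routing:

```python
# Hypothetical sketch: turning a key's continuous pressure and y-axis data
# into midi messages. Message byte layouts are standard midi; mapping
# pressure -> channel aftertouch and y -> pitch bend is an assumption.

def key_to_midi(pressure, y, channel=0, send_pitch_bend=True):
    """pressure in 0.0..1.0, y in -1.0..1.0; returns a list of midi messages."""
    msgs = []
    # Channel pressure (aftertouch) from how hard the key is pressed.
    msgs.append([0xD0 | channel, int(pressure * 127)])
    if send_pitch_bend:
        # 14-bit pitch bend, centred at 8192; y sweeps the full range.
        bend = max(0, min(16383, int(8192 + y * 8191)))
        msgs.append([0xE0 | channel, bend & 0x7F, (bend >> 7) & 0x7F])
    return msgs
```

With `send_pitch_bend=False` the bend message is simply never emitted, which is exactly the "turn off the crude midi pitch wheel, keep the expressive pressure data" behaviour described above.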
Back to the buttons. They are so sensitive that it takes effort to hit them without triggering a note at a very soft velocity, yet they do not false trigger, which is amazing, since I would think even shaking the instrument might trigger a button. Compare this with the force needed to hit a piano or synth key and you will understand that playing this will not only be different, but will give the player totally new capabilities regarding speed and range of notes played. Since you can choose a chromatic mode or any of 80 scales, you can really cover a lot of notes easily and quickly, with a lot of control. I can do the scale thing on the monome, but I can't do it with the expressiveness that this instrument provides.
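The scale idea is easy to sketch: each key index walks up the chosen scale, wrapping into the next octave, so one layout covers a lot of notes fast. The interval list and root note below are assumptions for illustration, not how EigenD stores its 80 scales:

```python
# Illustrative sketch of mapping key indices through a scale. The major
# scale intervals and the root of middle C (midi 60) are example choices.

MAJOR = [0, 2, 4, 5, 7, 9, 11]

def key_to_note(key_index, intervals=MAJOR, root=60):
    """Map the nth key to a midi note, wrapping into higher octaves."""
    octave, degree = divmod(key_index, len(intervals))
    return root + 12 * octave + intervals[degree]
```

Passing `intervals=list(range(12))` gives the chromatic mode instead; same keys, different coverage, which is why switching scales on the instrument is so powerful.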
Bowing the cello is one of my favorites so far. Basically this can be done two ways. You can use the large strip controller on one side of the Alpha for long strokes of the bow, OR move your finger slightly and quickly for tremolo of the bow. You can also hammer on/off the strip for tremolo, which is probably easier and just as convincing. The second way is to use one of the percussion buttons at the bottom. This highlights how damn sensitive these buttons are. By gliding a finger from one side of the button to the other, you have the same control as the strip controller, but on a small button. You can use a very light touch.
I love how, especially when the cello mode is on, the eigenharp mutters and squeaks a bit when I pick it up, just as an electric guitar does when picked up on stage. That is the sound of an instrument that wants to be played with expression and can be played with skill. Of course there is a dedicated button at the bottom that turns off input when you don't want accidental presses to make a sound.
You can select to play multiple instruments at one time. These could be the native modelled instruments, sound fonts, Audio Unit plugins or midi, and you can layer all of them. For example, layer a grand piano, cello and a synth pad. Because the cello needs to be bowed to play, you can bring it into the mix in a subtle way. Really incredible range of playing just by using different techniques.
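Conceptually, layering is just each instrument's signal summed through its own gain, which is why the unbowed cello sits silently in the stack until you play it. A toy sketch, with gain values and the hard clip as illustrative choices only:

```python
# Toy sketch of layering: several instrument signals summed through
# per-layer gains, as when stacking a piano, cello and pad.

def mix_layers(layers, gains):
    """layers: equal-length lists of samples; gains: one float per layer."""
    out = []
    for frame in zip(*layers):
        s = sum(sample * gain for sample, gain in zip(frame, gains))
        out.append(max(-1.0, min(1.0, s)))  # clip to the -1..1 audio range
    return out
```

A layer whose source is silent (all zeros, like the unbowed cello) contributes nothing until you start playing it, no matter what its gain is.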
The Hardware
The base station is good quality and burly. The instrument cord is high quality and plugs into the bottom of the Alpha. Nice thing is, there's just one cord, so you can walk around same as with an electric guitar. This really makes it a stage instrument. The protocol and technology that delivers so much data over this single mini-XLR-type connection is amazing.
The Software
OK, this is where it gets muddy. EigenD runs as a menu bar icon in OS X. It has very little UI per se and can run headless (no UI). The important part is really the core software engine, and what little UI there is in EigenD is a bolt-on. It's definitely not going to work like you are used to working in a DAW or other music software.

There is a browser window, but this is where the paradigm gets weird. The browser is more informational than a UI you would operate with a mouse. I would almost prefer that they pick one paradigm and stick with it, rather than part mouse control and part instrument control of what the UI is displaying, because this is confusing for sure. The software is designed to be controlled from the instrument itself. It's clear that was the original design, and the UI seems hastily thrown on.

I get the on-instrument control part though, and it is very powerful. Basically I compare this to monome software: you have a range of buttons, and you interact with the software by using buttons to change modes and so on. You need to bite the bullet and spend a day with this and get familiar with it, because this is the way you interact with the software; the browser UI is really a crutch. On-instrument control is excellent because you can get away from the computer entirely and play it as the instrument it is meant to be.

There are holes, however. Mixer controls are available on the instrument, for example, but when you are changing values there is no on-instrument feedback at the moment, so you really need to look at the computer screen to see the value. Also, when you use Audio Units, you can't just do everything in the EigenD UI; you actually need to use the on-instrument controls on the Alpha to make the proper Audio Unit UI pop up on the computer. This is what I mean by mixed controls: interacting with a computer UI from the instrument itself.
I do think that they will sort this out. It's partially because the UI part of the software came late, and it still feels very beta, but progress is happening fast and they are obviously committed to making it better and better. Yes, if you buy one, you are an early adopter. This is high-tech stuff, and kudos to Eigenlabs for having the balls to even attempt what they are doing.
So back to the software. There are a few classes of instruments: native modelled instruments like the cello, clarinet and piano; SoundFonts, where EigenD can play sampled instruments; and Audio Units, which are hosted in EigenD. You can also use Audio Unit effects. You have two effect slots available that can be placed in front of any instrument, and mixer controls for all these pieces. You also have midi instruments, where EigenD simply sends midi, usually to your DAW or an external midi instrument. In the case of the modular... you can't host Volta in EigenD because it requires a multitrack setup that uses side-chain audio, and there is no support for this in EigenD, so I pipe midi from EigenD to Ableton Live. This actually works OK; I have velocity and aftertouch readily available for controlling different aspects of the modular. A native CV controller, however, would be the bomb, and I really hope they get around to releasing one. It would be amazing to have that kind of resolution with the modular.
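The midi stream going to Live is basically a note-on with velocity, a run of channel aftertouch messages while the key is held, then a note-off. A sketch of that message sequence follows; the byte layouts are standard midi, while the `note_with_aftertouch` helper itself is hypothetical:

```python
# Sketch of the midi stream piped from EigenD toward a DAW: note-on with
# velocity, a stream of channel aftertouch while the key is held, note-off.

def note_with_aftertouch(note, velocity, pressures, channel=0):
    """Build the raw midi messages for one expressive key press."""
    msgs = [[0x90 | channel, note, velocity]]          # note-on
    msgs += [[0xD0 | channel, p] for p in pressures]   # aftertouch stream
    msgs.append([0x80 | channel, note, 0])             # note-off
    return msgs
```

It's exactly those two dimensions, the initial velocity plus the continuous aftertouch stream, that make even the midi path expressive enough to drive modular parameters.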
Conclusion
I'm very happy so far. The playability is amazing. There are different levels of control: you have the most control over the modelled instruments, next the soundfonts, then the AUs, then the midi. Even though we get reduced to midi at the bottom level, it is still extremely viable, and the velocity control and touch of the keys alone is enough to make the playability something extremely special.
The software still needs refinement. This was a product locked in secrecy for years. Now they need to expose it to the whims of the public. This is an important phase: listen to users but still stay true to the design, and adapt to how the instrument really wants to be used by players.
The underlying software engine, the core and heart of the system, seems solid. The on-instrument control is really what this is about. EigenD uses a special scripting language called Belcanto to create configurations. Right now the configurations are presets, but in the future you will be able to deeply customize your setups using it. The presets on the Alpha will keep you busy for a long time though, so it's not really an issue.
The other aspect is playing the instrument itself. On one hand, you can do some amazing stuff out of the box, way more than if, say, you had never touched a keyboard before and were trying one for the first time. That is really a good comparison. I know that over time I will be able to play this much more proficiently than I will ever play the keyboard. Guitar players rejoice... this is for you. But it is not exactly a guitar either. So it is an instrument that you will need to learn to play. Some people don't want this. DJs, you might want to take a pass.
More Info
The Sound On Sound article from Nov 09 had, I felt, the most information and was also the most fair and truthful in its assessment. Read it if you want to know more.
A Dream of Iceland from bar|none on Vimeo.