Here is the latest site that we built for Mountain Dew on behalf of ColensoBBDO.
This fixes both my needs: to be a geek and to be a teenager again! – I’m going to fund it!
Welcome to the new internet! – MeshNet aka Darknet
This happens when governments and corporations get involved in trying to regulate something that they barely understand. Instead of trying to understand and get involved with the people/users, they choose oppression, regulation and restriction. As always, when governments and rulers set up this type of hostile environment, a new set of thinkers and risk-takers evolves to combat the assault.
While I think innovation and technological progress are important in today’s society, I don’t think they should be propelled by overbearing laws and governments’ hostile positions. All you have to do is look back at history to see how this is going to play out.
As this new type of internet – the “meshnet” – is developed over time, I feel it is going to be a place to openly share information that would otherwise have been disallowed. But I also feel it will open up a new “seedy” internet where all the really bad sh#@t and people will be pushed, obscuring them from view and allowing them to operate with near impunity. Effectively, a black-market internet.
Bravo Governments and your complete lack of understanding of the community you are dealing with!
The V Motion Project was a pure collaboration of talent and creativity. The initial concept originated at ColensoBBDO as part of the marketing campaign for the most popular energy drink in New Zealand, V. The project’s client, Frucor (makers of V energy drink), has a history of sponsoring incredible, over-the-top projects and events.
With the immense support of Frucor® – V Energy Drink in particular – our technical and creative minds were allowed to run free, producing an amazing and absolutely stunning event.
The Fugitive team was enlisted to assess the viability of the project and then build the live realtime music interaction and performance system. Paul Sanderson, director and lead developer at Fugitive, built the original working prototypes and handled the technical programming and development of the final showpiece. Michael Delucchi, developer at Fugitive, developed a custom motion management system in Adobe AIR for tracking and choreographing moves captured by the Kinect cameras. He also assisted Paul and Jeff in the physical setup, integration, and testing of the custom live music instrument powered by Ableton.
Producer Joel Little (Kids of 88, Goodnight Nurse) created the original music, and James Hayday, sound designer, meticulously converted that music into an array of loops and sound clips that were then arranged in Ableton Live. Jeff Nusz of Custom Logic assisted in the music coding process and was solely responsible for programming the visual display system. This system generated its own custom-coded visuals and triggered the mind-blowing visuals created by Matt von Trott and Jonny Kofoed of Assembly. Assembly also had the job of creating the full-length music video and TV commercials, with Thick as Thieves filming the entire process.
The final system was the product of a journey that spanned several months of discovery and included input from a melange of industry experts. The end result was a culmination of creativity that most would agree was simply amazing.
The journey started for us in November 2011, when Colenso approached us about a madcap idea they had: playing a complete music track with Kinect motion-capture technology. Having built custom Kinect frameworks before, Fugitive was up for the challenge, but the scale and scope of the idea seemed daunting.
The first step was to research the latest technology available and figure out which framework we could use as a starting point. All seasoned developers know that the right framework choice is crucial, so we took a detailed approach to finding the right one.
Here are the systems we worked through over the course of a month or so:
1. Synapse for Kinect
Synapse is a project by Ryan Challinor. It pairs a C++ app with Max patches in Ableton to interface with the Kinect. While it was initially a great start, the Ableton interface/control wasn’t what James expected. This was a big issue for us, as making tools that people could use in their standard workflow was a crucial requirement. Also, extending Synapse’s functionality would have been a massive undertaking for us. So with that, it was out.
2. Chris Vik’s software
Now, we have all seen Chris Vik’s videos of him creating and manipulating loops. At the time, he was just in the initial phases of developing his software. We tested it thoroughly and found it to be a great piece of kit. Chris is a really nice guy to chat to, but again, the software at the time was closed source, and we wanted and needed more control than it could offer.
3. Windows Kinect SDK
At the time, the Kinect for Windows SDK was about to enter the market, so we created a project with it and ran some tests. We weren’t convinced by the results compared to the OpenNI drivers; the Windows Kinect SDK seemed to produce a few more false positives. We were really hoping for more tracking speed, but it simply didn’t deliver. (Note: we believe some of the issues we were experiencing with the Windows SDK have now been fixed.)
4. BiKinect
BiKinect is made by an amazing and very friendly French guy, Ben Kuper. Unlike the others listed above, BiKinect was built in Processing. Ben had done a great job of creating a versatile system that connected to Ableton in a manner familiar to music producers. He had also made the app more versatile overall, so it was safe to say that BiKinect was the winning choice as our starting point.
By the time Jeff joined the picture, and with the help of James and Mike, we had created a nice test track in Ableton that allowed us to layer, manipulate and filter sounds. But this all came at a price: lag! This lag (latency) makes it almost impossible to play something in realtime, such as drums or a keyboard, and is a common problem in the Kinect hacking community. The latency arises because the OpenNI drivers have to crunch a massive amount of math to accurately determine where the person is in front of the camera, and this happens at 30 frames a second. At that rate, each frame alone takes around 33 ms, and once processing and audio buffering are added, the total delay sits well beyond what feels playable on a live instrument.
This lag issue led us to our biggest question and challenge: “How do we keep our flexibility but get realtime playback?”
During our first brainstorming session, this was the hot topic of conversation. Jeff had the idea of adding a second Kinect camera and bringing another computer into the mix. This second Kinect would do no skeleton tracking at all; we would just use the raw camera data instead, now in realtime!
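To make the idea concrete, here is a minimal sketch (not our production code, which was split across Processing and C++) of how a trigger can be read straight from raw depth data: count how many pixels inside a key’s region of the depth image are closer than a cutoff, and fire a note once enough of them are. All the names, coordinates and thresholds here are hypothetical.

```java
// Hypothetical sketch: trigger a "key" from raw Kinect depth data,
// skipping skeleton tracking entirely to avoid its latency.
public class DepthKeyTrigger {
    // A key is just a rectangle in depth-image coordinates plus a depth cutoff.
    static final int KEY_X = 400, KEY_Y = 180, KEY_W = 60, KEY_H = 120;
    static final int NEAR_MM = 900;        // anything closer than 0.9 m counts as a "hit"
    static final int MIN_HIT_PIXELS = 150; // debounce: require a solid blob, not noise

    private boolean keyDown = false;

    /** Called once per depth frame (e.g. a 640x480 array of millimetre values). */
    public void onDepthFrame(int[] depthMM, int frameWidth) {
        int hits = 0;
        for (int y = KEY_Y; y < KEY_Y + KEY_H; y++) {
            for (int x = KEY_X; x < KEY_X + KEY_W; x++) {
                int d = depthMM[y * frameWidth + x];
                if (d > 0 && d < NEAR_MM) hits++; // 0 = no reading (IR shadow)
            }
        }
        boolean hitNow = hits >= MIN_HIT_PIXELS;
        if (hitNow && !keyDown) noteOn();   // edge-trigger: fire once per entry
        if (!hitNow && keyDown) noteOff();
        keyDown = hitNow;
    }

    private void noteOn()  { /* send note-on to the audio machine */ }
    private void noteOff() { /* send note-off */ }
}
```

Because this is a dumb per-frame pixel count rather than a full skeleton solve, it can run as fast as the camera delivers frames.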
From our initial realisations about how the systems talked, we understood that our technical choices had to be right from the get-go, otherwise they would curtail the realtime capabilities. That said, the project endured a plethora of code rewrites before we were satisfied with the performance.
Early on, we decided that the audio system would be the brains of the operation (controlling all the MIDI inputs, keeping state and info, and generally trying to keep everything in check). The visual machine would react to the realtime camera data and send note representations back to the audio system. As it turns out, when trying to achieve realtime you need to do a lot of planning and rewriting to get the most out of the system. Looking back, it was a logical progression, but it required constant discussions and whiteboard sessions.
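The wire protocol between the two machines isn’t detailed in this post, but conceptually it is just tiny note messages crossing the network. As a hedged illustration in plain Java, here is a minimal UDP sender/receiver pair; the two-byte message format, address and port are invented for the example.

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

// Hypothetical sketch of the visual -> audio link: the visual machine sends
// tiny "note" packets; the audio machine turns them into MIDI for Ableton.
public class NoteLink {
    // Visual machine side: fire-and-forget a note number over UDP.
    public static void sendNote(int note, boolean on) throws Exception {
        try (DatagramSocket socket = new DatagramSocket()) {
            byte[] msg = { (byte) (on ? 1 : 0), (byte) note };
            socket.send(new DatagramPacket(msg, msg.length,
                    InetAddress.getByName("192.168.1.20"), 9000)); // audio machine (example address)
        }
    }

    // Audio machine side: block on the socket and hand notes to the MIDI layer.
    public static void receiveLoop() throws Exception {
        try (DatagramSocket socket = new DatagramSocket(9000)) {
            byte[] buf = new byte[2];
            while (true) {
                DatagramPacket packet = new DatagramPacket(buf, buf.length);
                socket.receive(packet);
                boolean on = buf[0] == 1;
                int note = buf[1] & 0x7F; // MIDI notes are 0-127
                // ...translate (on, note) into a MIDI event and route it onward...
            }
        }
    }
}
```

In a real setup you would keep one socket open for the session rather than opening one per note; the sketch favours brevity over efficiency.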
It turns out that if you use two or more Kinects together, as Jeff says, “the proton fields reverse and the universe implodes on itself”. It’s safe to say that doesn’t really happen, but the Kinects do see each other’s IR dots, which effectively punches holes in your depth data. As you can imagine, this wreaks havoc on your skeleton tracking and makes it almost impossible to play an instrument, especially in realtime.
While this was a technically challenging problem, Matt Tizard (Creative Technologist at Colenso BBDO) found a surprisingly simple solution. Apparently you only need to slightly vibrate one Kinect, as the sensors then see the other Kinect’s dots as a smear and don’t register them. Our first test involved me just jiggling one Kinect, and wouldn’t you know, it worked!
So I went out and got a Tamiya high-speed gearbox, geared it down, added an offset weight and taped it on. Problem solved!
The whole point of creating the two systems was the hope that it would allow us to give the artist realtime playback while still maintaining flexibility. With this solution, we were effectively combining the pros of both approaches.
While I was creating the system connection architecture, Jeff was building the first cut of the virtual keyboard. His system (visuals) sent note numbers to my system (audio), where they were interpreted into MIDI data and sent to Ableton. Here is a video of the first time we had the keyboard and connection operational:
As you can see in the video, he is pushing forward over a virtual line, which triggers the note. While this was fun and fast, it was really unplayable from a musical perspective. So it was back to the drawing board. We then moved the keys to the side of the user, which was a major breakthrough (since reaching out to virtually ‘hit’ something is a very natural movement). This allowed Josh Cesan, our motion artist, to play the system in realtime and make it sound great!
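As a hedged sketch of why the side layout played so much better (all coordinates and thresholds here are invented): a forward-push key has to test depth, while a side key tests lateral reach, which lines up with the natural ‘reach out and hit’ gesture.

```java
// Hypothetical comparison of the two trigger geometries (units: millimetres,
// positions relative to the player's torso). Real thresholds were tuned by hand.
class KeyGeometry {
    // Original layout: push your hand forward across a virtual line.
    static boolean forwardPushHit(float handZ, float torsoZ) {
        return (torsoZ - handZ) > 350; // hand 35 cm in front of the body
    }

    // Revised layout: reach out sideways to strike a key beside you.
    static boolean sideReachHit(float handX, float torsoX, int keyIndex) {
        float keyStart = 450 + keyIndex * 150; // keys stacked outward from the body
        float reach = Math.abs(handX - torsoX);
        return reach > keyStart && reach < keyStart + 150;
    }
}
```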
Now that we had the realtime playback sorted – high fives all round – the remaining issue was integrating this all into our existing system in a manner that was flexible and allowed quick changes and tweaks. This led me to create another application in Adobe AIR, specifically for the job. The new app allowed us to create and manipulate keyboards and keys on the fly, in realtime, and on both systems (Mac and PC). We could easily create and edit keyboards to fit the way Josh moved and danced.
Ableton was the software we chose from the initial concept to handle the music sequencing and serve as the realtime music instrument, taking input (MIDI data) from our custom system and using it to trigger sounds and effects. Ableton is immensely powerful and lets you layer, manipulate, play and alter music on the fly.
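Our actual MIDI plumbing lived inside the custom Processing app, but as a hedged illustration of the core idea, here is how plain Java can push a note out to a MIDI device (such as a virtual MIDI port that Ableton listens on) using the standard javax.sound.midi API. Device selection is simplified for the example.

```java
import javax.sound.midi.MidiSystem;
import javax.sound.midi.Receiver;
import javax.sound.midi.ShortMessage;

// Hypothetical sketch: send a MIDI note to whatever port Ableton is listening
// on (e.g. a loopback/virtual MIDI port). Error handling trimmed for brevity.
public class AbletonNote {
    public static void main(String[] args) throws Exception {
        // For the example we just take the default receiver; in practice you
        // would pick the specific virtual port Ableton is mapped to.
        Receiver ableton = MidiSystem.getReceiver();

        ShortMessage noteOn = new ShortMessage();
        noteOn.setMessage(ShortMessage.NOTE_ON, 0, 60, 100); // channel 0, middle C, velocity 100
        ableton.send(noteOn, -1);

        Thread.sleep(500); // hold the note for half a second

        ShortMessage noteOff = new ShortMessage();
        noteOff.setMessage(ShortMessage.NOTE_OFF, 0, 60, 0);
        ableton.send(noteOff, -1);

        ableton.close();
    }
}
```

To target a specific port, you would enumerate MidiSystem.getMidiDeviceInfo() and open the device Ableton is watching.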
By the time we got to the stage where James was laying out the track, we had a system so flexible and powerful that we could control literally everything. For example, we could play a loop, layer tracks, and play notes from a single mapping set (with infinite mapping sets possible). Basically, anything you can do with two or three MIDI keyboards and several drum machines, you could do with our system. In short, we had unlimited potential.
We harnessed this potential by mapping skeleton data to knobs and switches that could be controlled by movement. For example, touching your head with your left hand could start a certain loop, or the distance between your hands could control the dry/wet filter on a Phaser effect in Ableton. This ability to map physical motion to actions in Ableton is enormously powerful and endlessly flexible.
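As a hedged sketch of that kind of mapping (the ranges and CC number are invented for the example), the hand-distance idea boils down to normalising a skeleton measurement into the 0–127 range of a MIDI control change message:

```java
import javax.sound.midi.Receiver;
import javax.sound.midi.ShortMessage;

// Hypothetical sketch: map the distance between the hands to a MIDI CC,
// e.g. the dry/wet of a phaser in Ableton. Ranges invented for the example.
public class HandDistanceMapper {
    static final float MIN_MM = 100;   // hands together
    static final float MAX_MM = 1400;  // arms spread wide
    static final int CC_NUMBER = 20;   // whichever CC the phaser's dry/wet is mapped to

    private final Receiver out;
    public HandDistanceMapper(Receiver out) { this.out = out; }

    /** Call once per skeleton frame with the 3D distance between the hands. */
    public void onHands(float distanceMM) throws Exception {
        float t = (distanceMM - MIN_MM) / (MAX_MM - MIN_MM);
        int value = Math.round(Math.max(0, Math.min(1, t)) * 127); // clamp to 0..127
        ShortMessage cc = new ShortMessage();
        cc.setMessage(ShortMessage.CONTROL_CHANGE, 0, CC_NUMBER, value);
        out.send(cc, -1);
    }
}
```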
The hardest thing about laying the track controls out with Josh, James and Joel was self-editing: deciding what we would do, where, and when.
This project required Fugitive to build a lot of custom tools and apps to help us develop and debug the system.
The original music track was created by Joel Little, broken down in Ableton by James Hayday, and then mapped out by James, Josh and Paul. After we had the track mapped, the only person who could play it was James. So we went through the process of teaching Josh how to use the system, and he then used his flair and sweet moves to make it shine. Once we had the track locked down, it took around four weeks of practising and tweaking to take it from ‘good’ to ‘amazing’. While we were doing this, Jeff and the boys down at Assembly were simultaneously cranking along on the custom visual effects.
At this point we realised that Josh was now so good on the system that people might think we were faking it all. So James had the idea of adding a “Genesis” section as an intro to the track for the live show: Josh would be seen tuning the instrument before playing the track. James went to his studio and made some bone-vibrating bass loops for each of the four phases of the Genesis movement. The first time we heard and saw Josh tuning the instrument, we knew we had nailed it.
Sound mapping complete – James shows us how it’s done (note: visuals still in development)
The visualisation elements of the project were handled by Assembly, with Jeff handling the coding. The final presentation was to be projected onto a 30-metre wall in downtown Auckland, with a speaker stack so big that the Sky Tower would shake. This would also be the location for the music video and TVCs, which were shot in conjunction with showing the world our live creation. Please have a look at Jeff’s blog post about how he created the visuals system you see in the video.
We started with BiKinect as our base but ended up with a completely bespoke system with almost unlimited flexibility. We had one seriously pimped-out PC running a custom Processing app to handle all audio inputs/outputs, a custom Kinect skeleton tracking system, a MIDI sequencer and effects processors. From Ableton, the audio output went through an external sound card, which then powered a massive speaker system.
The second system was a Mac Pro tower that ran a custom C++ app (Xcode) and displayed the visuals. It also handled the second Kinect camera and presented the virtual keyboards. This visual system took an audio feed from the sound card for visualisation as well. Below is Jeff’s sweet diagram of how all the tech works together.
This project was a massive collaboration between many super-talented people and transformed into something super amazing and fun!
So it turns out the V Motion Project got a post on Wired. We are stoked!
And guess what – the WTF Engine code and this particular implementation are available on GitHub.
So here are our next three projects.
Here’s the official music video for “Can’t Help Myself”, shot on the streets of Auckland, New Zealand, as The V Motion Project team created music through movement.
Any sci-fi fan knows how cool holograms are, and so does Kickstarter. Paul Christie, founder of Liti Holographics, hopes to make affordable full-colour holograms available to fans everywhere. Unfortunately, though, they don’t move.
We did it! Check out The V Motion Project setting it off live in the city!
Just a short post to recognize the awesome efforts of one of our JWT clients who spent the night sleeping rough for a good cause. While Bek has already raised almost double her target donations, she’s still accepting more here.
Love watching the Maker revolution in full swing at Kickstarter. I had a similar idea to this, but now I can just buy it instead! http://kck.st/LKiug1
Have to say, we are loving the new approach to licensing from Adobe around CS6, as well as the upgrades (that’s you, 64-bit native Fireworks). Thankfully, they seem to have fixed that diabolical update process as well.
This is Paul’s new mouse. It’s a rare moment when I’m speechless, or out-geeked by some tricky piece of kit, but seriously, that is just the most insane-looking thing. It’s like some kind of dormant Transformer that I expect to come to life if I get too close.
Paul reckons he bought it because “it’s completely programmable, all of it, every single button”. But really we know it’s because of how nuts it looks.
The guys get their first look at some mean new graphics for The Greenman. Then Josh and his crew chuck down some V’s and hit the streets with a portable projector to test out the visuals on some city walls. With the track and graphics all sussed, the guys have an idea to make the [...]
The V Motion Project team takes it up a gear, as the guys input their new sounds into the machine and start to create a track from Josh’s motion. After a few too many V’s, the guys get a visit from Noise Control… how are we supposed to make music through movement when you can’t [...]
Joel’s keen to find some new sounds for the project and invites some of his musician mates to get on board including Luke from Kids of 88, Brandon from The Wild and a full marching band. He also introduces a Burundi drum crew to V and they introduce him to the way drums were meant [...]
The guys explain how they’re going to use the technology to pick up Josh’s movements to create music. And Josh gets his crew down for a few V’s and a go with the technology.
Here is the second episode of the Kinect project. Find out more about this sweet project over at the V website.
Everyone is getting über excited about how this V project is coming together. Stuff is going to look insane by the time the visuals get piped in as well!
Turning motion into music through Kinect and some heavy lifting custom code. Check out the webisode at the V website.