Since there are already great solutions for developers, LUMOplay has historically focused on creating software that designers can use. Many creative, artistic people don't have the time or training to code their own interactive experiences, and these are the people we primarily work with. We've never promoted the use of our software through developer tools, like an SDK.
Over the past two years, we've been documenting our game making process, creating internal templates and samples, and refining our app market publishing process. We want to open our app market and existing subscriber base to third-party developers, so small studios, independent developers, and established agencies can easily distribute their own amazing interactive experiences to tens of thousands of installations. Right now, high quality interactive experiences, especially highly customized ones, are difficult and costly to develop and deploy at scale, but we think you'll agree that the world would be a lot cooler if every surface was an interactive experience. Right?
That's why we're working on an SDK.
How it Works
Our SDK is a custom package that's imported into Unity3D. This package includes scripts that connect to LUMOplay's custom vision server, allowing you to create a custom experience and simulate the effects of motion, gesture, and touch (so you can test your effect as you build it). There are also pre-built behaviors and samples for 2D and 3D projects that developers can build on by adding their own graphics, 3D models, audio, and scripts.
Our SDK was built around years of experience designing and optimizing interactive experiences that use navmesh, particle systems, and physics.
Right now, it isn't possible for developers using the LUMOplay SDK to publish games to our platform directly - we still need to receive and upload files manually. For this reason, we aren't ready to release the SDK publicly. However, we're looking for beta testers from our existing customer base who already have experience performing commercial retail or event installations. Beta testers also need to demonstrate that they have an internal development team, and that they currently develop in-house interactive experiences that they would like to distribute on the LUMOplay platform.
We're considering two possible sales models for the LUMOplay SDK. The model we choose will be informed by feedback from our beta testers and by ongoing market research.
Option One: We'll include the SDK as an optional, paid add-on, available only to members of our Reseller Program subscription, which includes a white label installer and the ability to purchase one-time-fee licenses on behalf of clients.
Option Two: We'll allow any customer to purchase the SDK, charging a per-app publishing fee plus an annual fee for version updates.
Become a Beta Tester
If you are interested in joining the LUMOplay SDK Beta Testing program, please sign up here.
Tendrils Gallery Show Case Study
I'm going to be honest. Unlike the rest of my team, I'm not a very good coder. I've learned Unity and C# by watching YouTube videos in my spare time (here's a link to my playlist of favorite Unity tutorials, in case you're interested).
In fact, I'm a large part of the reason LUMOplay has always focused on making tools that are easy for designers to use. My degree is in computer animation, and prior to starting LUMOplay, I made characters, animations, and designs for games and cartoons. I've also been a VJ, and I've even videomapped a few buildings in my time.
LUMOplay co-founder Curtis Wachs and I videomapped this building in 2014
Before making the announcement that we're working on a semi-public SDK, I wanted to try it for myself. When I was asked to help an amazing glassblowing artist named Michael Skura with a very unique gallery show, I decided to use our SDK to create a custom LUMOplay interaction.
Hanging hundreds of tiny glass sculptures on a gallery wall
Hardware and Installation
What makes this show unique is that there are over 20 projectors, sponsored by Epson America, used to light thousands of glass sculptures throughout the room. The projectors, and the light reflected from the sculptures, are the only source of light in the show.
Michael's vision is to bring his work to life with light, and to imbue each piece with a unique and evolving spirit. Since each sculpture is handblown using a variety of techniques, it's only fitting that the lighting effects for each piece be designed to interact with individual details, flaws, and finishes of the glasswork.
Aryn John Freysteinson, CEO of Rabcup, is the mastermind behind the hardware and systems integration for this installation. Aryn installed three 6,000-lumen Epson PowerLite projectors and 18 individual Epson LightScene projectors, mapped most of the sculptures in the room, and designed a foolproof power management cycle for the duration of the show (which runs until January, 2020).
The Shy Wall is tracked by three depth cameras. For this installation we used Orbbec Astras, but in the future I plan to use Intel RealSense D435 cameras, because they can cover a larger area and are easier to link together.
I mapped the 50 foot wall. By hand. Using Photoshop.
- Step one: line up three projectors.
- Step two: trick the computer into thinking all three projectors are a single monitor.
- Step three: open a Photoshop document that matches the resolution of the extended projector display, drag it onto the projection, make it full screen, and draw hundreds of little shapes.
You might have noticed from the first setup photo that the projectors are not arranged in a straight line. This is because Michael arranged the sculptures lower on the left side of the wall, and much higher on the right. To cover the entire area with light, we had to arrange the projectors from left to right so that each one was a bit higher than the last.
We didn't edge blend the projectors. This is because Aryn John Freysteinson is a wizard, and managed to position each projector so that edges fell perfectly between the sculptures.
The Shy Wall
Coding isn't my job, but the developers at LUMOplay are amazing, supportive people, and they were happy to help me use the SDK for this project.
The wall I mapped is called Shy Wall. Michael wanted to create an interaction that would encourage mindfulness and stillness. Since LUMOplay is normally used to encourage a lot of movement and engagement, this was a really fascinating challenge.
Like all the other pieces in the exhibit, the Shy Wall has an organic arrangement. Each group of sculptures climbs the wall in a tendril pattern, like rows of sea anemones. It was important to respect the continuity of this pattern by arranging the interaction in the same pattern, starting at the bottom and climbing up the wall. In other words, each sculpture needed to 'talk' to the next. When someone approaches the wall, the bottom sculptures dim, and then the sculptures directly above, and so on, until the whole area has dimmed in response to movement. When people stand still near the wall, the light slowly comes back to life.
The Development Process
While I worked on making a custom map for the walls (which took the form of a black png file with a lot of transparent shapes in it), my cofounder Curtis added a new sample to our SDK. His sample included a simple particle prefab that I could clone so one system could sit behind each transparent dot in the mask. The mask itself was imported into Unity as a sprite and loaded into the project as a canvas UI element.
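The SDK sample itself isn't public, but the idea of placing one particle system behind each transparent dot in the mask can be sketched as a small Unity helper. Everything here is an assumption for illustration: the class name `MaskParticleSpawner`, the grid-sampling approach, and the pixel-to-world scaling are all hypothetical, not the actual SDK code (in practice the prefabs were cloned and positioned by hand).

```csharp
using UnityEngine;

// Hypothetical sketch: scatter a particle prefab behind the transparent
// dots of a mask texture. Not the actual LUMOplay SDK sample.
public class MaskParticleSpawner : MonoBehaviour
{
    public Texture2D mask;            // black PNG with transparent shapes (Read/Write enabled)
    public GameObject particlePrefab; // the simple particle prefab to clone
    public int gridStep = 16;         // sample every N pixels to avoid spawning duplicates
    public float worldUnitsPerPixel = 0.01f;

    void Start()
    {
        for (int y = 0; y < mask.height; y += gridStep)
        {
            for (int x = 0; x < mask.width; x += gridStep)
            {
                // Transparent pixels mark where a sculpture sits on the wall.
                if (mask.GetPixel(x, y).a < 0.1f)
                {
                    Vector3 pos = transform.position + new Vector3(
                        x * worldUnitsPerPixel, y * worldUnitsPerPixel, 0.1f);
                    Instantiate(particlePrefab, pos, Quaternion.identity, transform);
                }
            }
        }
    }
}
```

Reading pixels this way requires the mask texture's import settings to have Read/Write enabled in Unity, which is why a hand-placed approach is often simpler for a one-off installation.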
Next, I connected each node, and set a 'target' node and a trigger, so that movement in a certain area causes the node tree to dim one sculpture at a time.
Making a connection was easy. I selected the 'parent' node, locked the Inspector, clicked the child node in the Hierarchy window to select it, then dragged it into the 'Connections' section of the parent node in the locked Inspector. Then I unlocked the Inspector and chose the next parent down the chain. I did this hundreds of times.
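The parent-to-child dimming chain described above can be sketched as a simple MonoBehaviour. To be clear, this is not the actual SDK node script: the class name `TendrilNode`, the field names, and the fade logic are all hypothetical, just illustrating how a 'Connections' list in the Inspector can propagate a dim signal up a tendril one sculpture at a time.

```csharp
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

// Hypothetical sketch of a chained dimming node; not the actual SDK script.
public class TendrilNode : MonoBehaviour
{
    // Child nodes dragged in via the Inspector, as described above.
    public List<TendrilNode> connections = new List<TendrilNode>();

    public float dimDuration = 0.5f;      // seconds to fade this node's particles
    public float propagationDelay = 0.2f; // pause before the next node reacts

    private ParticleSystem particles;

    void Awake()
    {
        particles = GetComponentInChildren<ParticleSystem>();
    }

    // Called on the 'target' node when motion is detected in a trigger area.
    public void Dim()
    {
        StopAllCoroutines();
        StartCoroutine(DimAndPropagate());
    }

    private IEnumerator DimAndPropagate()
    {
        // Fade this sculpture's particle emission down to zero.
        var emission = particles.emission;
        float start = emission.rateOverTime.constant;
        for (float t = 0f; t < dimDuration; t += Time.deltaTime)
        {
            emission.rateOverTime = Mathf.Lerp(start, 0f, t / dimDuration);
            yield return null;
        }
        emission.rateOverTime = 0f;

        // Then tell each connected node further up the tendril to dim in turn.
        yield return new WaitForSeconds(propagationDelay);
        foreach (var child in connections)
            child.Dim();
    }
}
```

A matching coroutine running in reverse (fading emission back up when no motion is detected for a while) would give the slow come-back-to-life behavior.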
In the image above, you can see some plain grey rectangles. These are the trigger areas. Unlike other LUMOplay installations, the cameras in this installation are not pointed directly at the display. Instead, they're pointed at the floor directly in front of the wall, since that's the area where we want to detect motion.
Each sculpture contains at least one particle system. The particles themselves are very simple - just small balls and ribbon trails that move quite slowly. Inside the glass, this looks like a little ghost, and it creates some very cool reflections on the wall.
To make everything react to motion, I hooked up the custom scripts that connect to LUMOplay's vision server. Then I built the project, packaged it as a .zip, and uploaded it to Michael's account on the LUMOplay.com website. Once there, I was able to load the experience onto the gallery computer and see what it looked like on the wall, and how it reacted to movement.
The final particles look like they float inside the glass. We tested the wall for a few hours to get the timing of the dimming right - it needed to be responsive enough for people to notice, but the effect when the wall was left in peace also needed to be enticing enough that visitors would choose to stand still and let the wall come back to life.
This project helped me and my team identify improvements we can make to support the artistic and commercial use of the LUMOplay SDK. I encourage you to apply as a beta tester if you have something cool you'd like to build using LUMOplay, even if you're not a strong Unity developer. I'm not, and I managed to bring 500 little handblown glass sculptures to life in under a week.
The Tendrils Gallery show will be running at the Riverside Art Museum until January 2020.
If you want to see video of the complete show, sign up to our newsletter and you'll be notified when Epson America's case study is available.
If you have questions about this project, or about what can be created with LUMOplay, contact us.