Agile X 4 (TUI Workshop)

Last week I attended a workshop about Tangible User Interface (TUI) design. I’d registered because I wanted to build skills during my uni holidays, because it seemed like something I’d be interested in, and because my lecturer (who was running the workshop) had specifically suggested that I do it.

My workshop (Carve) was one of five integrated, interdisciplinary workshops which ran simultaneously and culminated in a demonstration of our results and a conference to discuss future outcomes.

This thing was brand new, not well advertised, and pretty expensive for the average student. The information about what we’d be doing was also a bit vague (and was only going to get vaguer).

It turned out that I was the only student enrolled in the Carve workshop (apart from Tim, who was kind of tutoring me and kind of participating himself). This did mean that I had to carry a lot of weight myself, but it was also a huge blessing. I had two full time teachers working directly with me. We had a staff of three more technical professionals who were on call to help facilitate my learning and the creation of the final product. I (a first year) was representing the tangible workshop among an integrated group of students and professionals.

The great thing about this week was that you couldn’t tell who was there to learn and who was there to teach; we were all there exploring new ideas and sharing on an even level.

So what was it we were actually doing? I’m glad you asked.

Make sense?

No?

Okay, so we were all exploring weird things and trying to bring them all together into one big weird thing. These meetings where we would try to make sense of what the grand thing would become were hilarious and confusing. As far as I could tell, everyone was inputting their creations into the VR space and all I had to worry about was reinventing the Wiimote.

Each day I was gifted a new lecture (just for me) on interaction design, blended reality spaces and experiential design. This was awesome. I’ll save my notes until the very end.

At first, we went into our TUI design bubble with the brief that we were creating a UI for architects to “Carve” a South Australian house. We were asked to build a sword, so naturally, as designers, we looked at every possible tool for carving a house that was not a sword.

We decided on what the final interface would be used for and worked backwards from there, brainstorming tools that you use for creation that serve multiple functions. We leaned towards knives, spatulas and trowels.

We were promptly brought down to earth by the VR team and the hardware builders.

The VR was only capable of one action: splitting things in half with the immediacy of Fruit Ninja. Was it worth designing a tool that could do everything if it could only be demonstrated doing one thing? A tool for slowly carving organic shapes would feel disconnected from the single action it was actually able to represent.
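To get a feel for why even that single action is non-trivial, here’s a minimal Python sketch of what a Fruit-Ninja-style split boils down to: classifying a mesh’s vertices against a cutting plane. This is my own illustration, not the VR team’s code; a real engine also has to re-triangulate the cut faces and cap the holes.

```python
import numpy as np

def split_by_plane(vertices, plane_point, plane_normal):
    """Split a cloud of mesh vertices into two halves along a cutting plane.

    Illustrative sketch only (not the workshop's VR code): a real engine
    would also re-triangulate the cut faces and cap the resulting holes.
    """
    plane_normal = plane_normal / np.linalg.norm(plane_normal)
    # Signed distance of every vertex from the cutting plane.
    side = (vertices - plane_point) @ plane_normal
    return vertices[side >= 0], vertices[side < 0]

# A toy "house" as a random cloud of points, cut by one vertical swipe.
house = np.random.rand(1000, 3)
left, right = split_by_plane(house,
                             plane_point=np.array([0.5, 0.5, 0.5]),
                             plane_normal=np.array([1.0, 0.0, 0.0]))
```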

I did enjoy meeting the hardware guys, learning about sensor technology and seeing what they’d be able to build that could fit into a prototype within a week (something bulky).

[Image: IMG_3764.jpg]

My new idea was to use a hammer and chisel, something that creates a big split in one specific instant.

On day two, we started fresh. We were invited to explore the Architecture Museum (apparently our uni has one!).

The Architecture workshop were investigating the evolution of the archetypal South Australian house and creating a family tree: a gene pool of all the possible stylistic features a house could have. It looked like this:

Drawing on the theme of evolution, I realised that a primitive axe could be the best way to draw all of these ideas together.

We had some incredible lectures from bio-medical experts. Petra Gruber talked about how she’d been growing mould and fungi into specific shapes for use as building materials, in an attempt to create ‘living architecture’.

Aurelien Forget talked to us about his work on 3D printing organs: cultivating cells in a liquid serum and injecting them, in a 3D design, into a harder gel serum.

Quenten Schwarz and his colleague (who was brought into the program late, so I don’t have her name on hand) talked about how the jaw and the heart form through day-by-day processes during foetal growth. It was interesting to hear designers and architects contribute ideas about why natural disturbances might occur in these processes in response to environmental factors, such as blood flow’s potential effect on artery formation (approached from the perspective of river formation).

[Image: dsc00428]

The Biology workshop were building potential structures and then creating algorithms that would grow to fill them in a unique and organic way. It looked like this:
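In code terms, that kind of growth can be surprisingly simple. Here’s a purely illustrative Python sketch, my own and not the Biology team’s actual algorithm, in which a blob spreads cell by cell through a 2D grid while staying inside a host boundary; each run fills the structure differently.

```python
import random

def grow_into(boundary, seed, steps=500):
    """Grow an organic blob cell by cell inside a boundary structure.

    `boundary` is the set of (x, y) cells the growth may occupy and
    `seed` is the starting cell. Illustrative only -- the workshop's
    structures were 3D meshes, not a 2D grid.
    """
    filled = {seed}
    for _ in range(steps):
        # Pick any occupied cell and try to spread to a neighbour.
        x, y = random.choice(tuple(filled))
        nx, ny = x + random.choice((-1, 0, 1)), y + random.choice((-1, 0, 1))
        if (nx, ny) in boundary and (nx, ny) not in filled:
            filled.add((nx, ny))
    return filled

# A 20x20 square "structure", filled organically from its centre.
structure = {(x, y) for x in range(20) for y in range(20)}
blob = grow_into(structure, seed=(10, 10))
```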

It became evident that if these were the structures we’d be interacting with, we’d need a tool that would follow through in a sweeping motion, so that the user could understand they’d be dividing everything the mesh was composed of, not just a single element. A hammer and chisel would not work, but damned if we were gonna make a sword!

We also realised that cutting was not going to be enough after all. We’d been looking at the building as an architectural object, not a biological one.

I decided that our initial idea of how this would be used was wrong. It was not a collaborative space in which architects could carve out rooms for their clients; it was going to be a space where clients would explore a variety of options with a sense of curiosity and surprise.

My brain was suddenly able to turn this:

[Image: dsc00460]

Into this:

[Image: img_3809]

The architects were building the gene pool that would provide a coded source of options that would be drawn on by the self-building structure that the biologists were creating. This structure would sit within a VR space that the VR team was building and be interacted with by the tool that I was designing. The interactions would feed back to the source code and organically alter the makeup of the structure.
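Written out as a data flow, the loop looked roughly like this. The sketch below is my own reading of the architecture, in Python with hypothetical names; it isn’t the project’s actual code, just the shape of the feedback loop: the gene pool feeds the growing structure, tool gestures carve it in VR, and each interaction feeds back and changes what grows next.

```python
from dataclasses import dataclass, field
import random

# Hypothetical names -- a sketch of the feedback loop as I understood it.

@dataclass
class GenePool:
    """Stylistic options catalogued by the Architecture workshop."""
    features: list

@dataclass
class Structure:
    """The self-building structure grown by the Biology workshop."""
    genes: GenePool
    parts: list = field(default_factory=list)

    def grow(self):
        # Draw an option from the gene pool and add it to the structure.
        self.parts.append(random.choice(self.genes.features))

def interact(structure, gesture):
    """A tool gesture in the VR space feeds back into the structure."""
    if gesture == "cut" and structure.parts:
        structure.parts.pop()   # carve something away
    structure.grow()            # the structure responds organically

pool = GenePool(["verandah", "corrugated roof", "stone wall", "lean-to"])
house = Structure(genes=pool)
for gesture in ["cut", "bash", "cut"]:
    interact(house, gesture)
```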

If my primitive axe were also a hammer, it could be used as we’d originally intended: an intuitive multitool like a spatula or a trowel. You can cut, bash, scrape, scoop, spin, poke and affect the structure in ways that, without having seen it, we can’t yet understand.

Now, to build the thing:

The evolution of the 21st Century Axe:

[Image: soft-model-generation-1500]

We needed to split the model into several pieces to fit it into the 3D printer and have it come out within 24 hours.

In the dead of night, the print shifted 10mm to the left. That’s when we brought out the duct tape…

[Image: tech-model-generation-1500]

We had plenty of time while the printing was happening to think about gestures.

It turns out there are a lot of things you can do with an axe. There are even more things you can do with your body. I created a selection of tool-based gestures for the limited set of actions possible in the immediate future, with an architectural audience in mind. I also came up with a variety of free-body gestures with the potential biological landscape in mind.
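To make the two sets concrete, here’s a hedged sketch of how a gesture table like that might be wired up. The gesture and action names are my own placeholders, not the final set from the workshop.

```python
# Hypothetical gesture tables -- placeholder names, not the workshop's final set.

# Tool-based gestures: mapped to the limited actions the VR could do right now.
TOOL_GESTURES = {
    "overhead_chop": "split_mesh",
    "sideways_swipe": "split_mesh",
    "hammer_tap": "select_part",
}

# Free-body gestures: reserved for the biological structures still to come.
BODY_GESTURES = {
    "scoop": "hollow_out",
    "spin": "rotate_growth",
    "poke": "seed_new_growth",
}

def resolve(gesture):
    """Look a gesture up in both tables, falling back to doing nothing."""
    return TOOL_GESTURES.get(gesture) or BODY_GESTURES.get(gesture, "no_op")

assert resolve("overhead_chop") == "split_mesh"
assert resolve("wave") == "no_op"
```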

[Image: screen-shot-2016-11-30-at-13-11-56]

It was a really fun challenge to come up with clear illustrations that would succinctly show the user what to do. I drew them from a first person perspective, because that’s the viewpoint most likely to be used in the VR space. I really enjoyed instruction design, but was warned never to tell an employer that, because apparently it’s all I’d be doing for the rest of my life.

Overall, it was a really ambitious project and although all the elements didn’t come together completely, it was great to be able to share ideas with people that I wouldn’t normally get to work with.

We turned all of our work into academic posters, which will hopefully make their way into a publication and spawn more research and collaboration with Agile X. I thoroughly recommend you check them out: Agile X 4 – All Posters PDF

[Image: screen-shot-2016-12-01-at-11-33-00]

Notes from Carve lectures:

  • Experience design is different to Experiential design. Experiential design is more about instincts, reflexes, sensory experience that is felt with the body more than the mind.
  • A human experiential designer creates sensory inputs and takes the mind-map out of the head, making it physical and intuitive.
  • Industrial scale looks at markets and ‘users’, whereas human scale looks at individual people.
  • Direct Manipulation – where you can interact in real time, moving objects on screen live.
  • Third Wave HCI – information is not explicit; it’s all implicit in physical space, and you use your body as an interactive tool.

Check out:

  • Indy Johar on the Open State website
  • Metaphors We Live By (1980) by George Lakoff and Mark Johnson
  • Disney Research
  • Penny Webb
  • DESEEN ECAL students, Milan 2014
  • Memphis design movement / Sottsass
  • Future Industries Institute

The splash-screen for Adobe Illustrator has never been so relevant to me.

[Image: img_3873]
