Generating a Virtual Forest Environment Using Procedural Content Generation

Liam Potter & Reza Ryan
Part of CW17

Video game worlds are growing rapidly, increasing the amount of content that digital artists must produce. To keep pace, game development companies would have to hire ever more artists and content creators, which is not economical. Procedural content generation (PCG) techniques have therefore become a key area in the development of video game worlds. These techniques can be applied to generate a wide variety of things, from entire forests to the individual leaves on a tree. Simulated real-time virtual forests are among the more common and complex virtual environments in contemporary video games that must be generated procedurally. In this research, we developed a system that integrates different PCG techniques to automatically generate and simulate a virtual forest in real time. These techniques include Height Generation, Terrain Texture Generation, Detail Generation, Point Generation, Shadow Map Generation, Life Cycle Simulation and Day/Night Simulation. The day/night system accurately calculates the angle of the sun from the time of day and drives the real-time life cycles of all flora in the environment. The optimized system can be easily integrated into any real-time game that requires a forest environment.
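The abstract names these techniques without giving implementation detail. As a rough, illustrative sketch only (not the authors' actual implementation; every function name below is hypothetical), Height Generation is commonly built from fractal value noise, and a toy day/night model can derive sun elevation from the hour of day:

```python
import math
import random

def lattice_value(x, y, seed=0):
    """Deterministic pseudo-random value in [0, 1) for an integer lattice point."""
    rng = random.Random((x * 73856093) ^ (y * 19349663) ^ seed)
    return rng.random()

def smoothstep(t):
    """Hermite easing so interpolation has no visible grid creases."""
    return t * t * (3 - 2 * t)

def sample_height(x, y, seed=0):
    """Bilinearly interpolate lattice noise at a continuous point (x, y)."""
    x0, y0 = math.floor(x), math.floor(y)
    tx, ty = smoothstep(x - x0), smoothstep(y - y0)
    top = (1 - tx) * lattice_value(x0, y0, seed) + tx * lattice_value(x0 + 1, y0, seed)
    bot = (1 - tx) * lattice_value(x0, y0 + 1, seed) + tx * lattice_value(x0 + 1, y0 + 1, seed)
    return (1 - ty) * top + ty * bot

def generate_heightmap(width, height, scale=8.0, octaves=4, seed=0):
    """Fractal (octaved) value noise: each octave doubles frequency, halves amplitude."""
    grid = [[0.0] * width for _ in range(height)]
    for j in range(height):
        for i in range(width):
            amplitude, frequency, total, norm = 1.0, 1.0 / scale, 0.0, 0.0
            for o in range(octaves):
                total += amplitude * sample_height(i * frequency, j * frequency, seed + o)
                norm += amplitude
                amplitude *= 0.5
                frequency *= 2.0
            grid[j][i] = total / norm  # normalised back into [0, 1)
    return grid

def sun_elevation(hour):
    """Toy day/night model: sun elevation in degrees, 0 at 6:00/18:00, peaking at noon."""
    return 90.0 * math.sin(math.pi * (hour - 6.0) / 12.0)
```

A production system would more likely use Perlin or simplex noise and a proper solar-position model, but the octave structure above (doubling frequency while halving amplitude per layer) is the core idea behind most procedurally generated terrain.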

iOrpheus Reflections (Panel and Performance)

Queensland Conservatorium Research Centre
Part of CW17

In August 2007, New York-based composer William Duckworth and pioneering media artist Nora Farrell worked with the Queensland Conservatorium Research Centre on a Fulbright Senior Specialist Grant to create the world premiere of iOrpheus, a groundbreaking iPod Opera merging podcasts with live music, dance, installation, fire and a mobile sound garden in the South Bank Parklands. The project re-enacted the story of the mythical musician Orpheus in five acts across various locations in South Bank Parklands, with audiences shifting between environments as the immersive story unfolded through interdisciplinary installations and performance.

As the Queensland Conservatorium Research Centre embarks on a documentation project to map the impact of this innovative work over the last decade, this panel and performance reflects on the legacy of iOrpheus with a participatory sound walk on mobile phones and site-specific performance in South Bank Parklands.

iOrpheus Reflections celebrates the life and work of Nora Farrell, who sadly passed away in 2017.

To participate in the sound walk and performance, please download the free app AURALITY to your device (iOS and Android).


Leah Barclay
Vanessa Tomlinson
Daniel Blinkhorn
And students from the QCGU percussion and music technology department

Mobile Listening: Augmenting Environments and Connecting Communities with Sound

Dr Leah Barclay, Queensland Conservatorium Research Centre, Griffith University
Part of CW17

Sound has a profound ability to make us feel present and connected to our surrounding environment. Recent years have seen a proliferation of site-specific audio works exploring the possibilities of mobile technologies and locative media in place. This means that at any given moment in an urban environment, we could be moving through a sound field of voices, music, memories and sonic art dispersed invisibly throughout the places we inhabit. While this material is available only to those with mobile devices and knowledge of the locative experiences, the advancement of new technologies and the accessibility of mobile devices mean this field presents new opportunities for exploring our social, cultural and ecological environments through sound.

In the 2007 CreateWorld keynote, pioneering media artist Nora Farrell remarked that the future of computing is in the mobile phone. She believed it was the most valuable platform on which to focus our energies as creative artists. As locative media and augmented reality audio shift into mainstream culture, she has clearly been proven correct. This presentation traces creative explorations with locative sound stretching across a decade of practice at the Queensland Conservatorium Research Centre, all inspired by the innovative work of Nora Farrell and composer William Duckworth.

Beginning with the groundbreaking work iOrpheus – an iPod Opera conceived by Duckworth and Farrell – this research explores the impact of iPods, iPhones and iPads across six interconnected projects. Ranging from the first live performance with iPads in remote Australia to spatial sound walks in Times Square and augmented reality audio on the Eiffel Tower, these creative works draw on sound walking, mobile technologies and locative media to investigate the role of sound in achieving presence and connection to place and communities. This presentation highlights the legacy of Nora Farrell’s creative and technical innovation and explores the value of mobile technologies in understanding and interrogating our relationship with places and communities through sound.

Dr. Leah Barclay is a sound artist and researcher working at the intersection of art, science and technology. She specialises in spatial audio for immersive installations and performances and leads research in augmented reality at Griffith University. Her work has been commissioned, performed and exhibited to wide acclaim internationally by the Smithsonian Museum, UNESCO, Ear to the Earth, Al Gore’s Climate Reality and the IUCN. Her augmented reality sound installations have been presented around the world, from Times Square in New York City to the Eiffel Tower in Paris. She is currently a postdoctoral research fellow at the Queensland Conservatorium Research Centre, Griffith University.

Unity and ARKit

Scott Roberts
Part of CW17

ARKit is set to be available for 400 million iOS devices this year, making it the largest immersive technology platform in the world. We’ve seen a multitude of examples demonstrating how ARKit is helping developers create some impressive augmented reality apps, from virtual tape measures and navigation to fun things like games. It’s now time we see creatives from all disciplines employing the technology to blend virtual environments with real ones. This workshop will explore the pathway to get you set up with ARKit in Unity, including plane detection, raycasting, and hit testing. Begin your immersive explorations with a taste of AR in Unity.

Scott Roberts is a 3D Visualisation Artist and Digital Designer based on the Gold Coast. As a PhD candidate pursuing a career in academia, Scott has had the opportunity to explore new techniques for delivering immersive experiences within VR and AR mediums. His passion is conceptualising and generating virtual environments with a focus on user experience and the perception of realism.

Mix Your World with Holograms

Thomas Verbeek, 8i Limited
Part of CW17

Holo is an augmented reality app that brings realistic, 3D holograms to the masses. It allows users to manipulate volumetric media on their mobile devices using state-of-the-art rendering technology by 8i. Enabling this experience naturally comes with a lot of challenges, ranging from mobile device resource restrictions to explorations in user experience to being ready on the App Store for the worldwide release of ARKit.

This presentation explores some of the challenges we encountered and overcame during the development and release of Holo for iOS. We discuss some of the early assumptions and opportunities that guided the product to its initial release; evaluate design challenges regarding discoverability and interaction; provide an under-the-hood look at how we incorporated ARKit into Holo; and shed insight into the future of 8i’s technology.

Thomas Verbeek is a senior iOS engineer at 8i in Wellington, New Zealand. After a few years of delivering mobile applications for companies across New Zealand, he’s returned to his roots by joining his passion for iOS with computer graphics. He has a keen interest in researching manipulation techniques for volumetric content using ARKit, rendering using Metal and architecting applications using VIPER.

TouchDesigner: Audio-reactive Visuals for Performance

Jason Haggerty
Part of CW17

Join Jason in creating audio-reactive visuals and a custom-made control panel in the procedural programming platform TouchDesigner. Perfect for artists, developers, and all kinds of tinkerers, TouchDesigner allows for very flexible programming, from user interfaces and complex real-time geometry to data visualisation and interactive environments.

Jason Haggerty is a Gold Coast based artist who has exhibited in galleries nationally, a projection artist who has created immersive and large-scale installations, and a VJ who is currently playing music festivals across the country.

CreateWorld 2017 Registrations Open

CreateWorld is back for another year from 29 November to 1 December, at Griffith University, South Bank, Brisbane.

Aimed specifically at academics, teachers, and technical staff who use Apple technology in the education and digital arts disciplines, the conference features a wide range of academic and technical presenters from the education and industry sectors, and includes keynotes, panel sessions, hypotheticals, hands-on technical workshops, and regular presentation sessions.

Early bird and student sales are now open. Early bird sales close November 10, and all sales close 9pm AEDT, Tuesday November 21, 2017.

Learn more and register »

Teaching Coding on the iPad

Jonathan Sagorin
Part of CW17

Are you facing the choice of how to teach or support coding on the iPad, and don’t know where to get started? Or are you just interested yourself, and want to learn more?

This workshop is for you! We’ll cover three popular offerings: Swift Playgrounds (from Apple), Codea (from Adelaide-based Two Lives Left), and Pythonista (from OMZ Software).

We’ll evaluate each app using the same selection framework, which you can use to choose any education app appropriate to your needs.

We’ll look into main features of each, getting started with simple coding, how to get additional content and share code with others, and where to find the best support resources.

Note: this is a hands-on workshop which requires that you bring an iPad along with you, running iOS 10.3 or iOS 11. All apps are available from the iOS App Store, and we encourage you to install them ahead of the conference to save time. Note that the combined size of all three downloads is around 900 megabytes.


Jonathan has a keen interest in online learning and mentoring. When not working, Jonathan volunteers teaching primary school kids to code. As an iOS software engineer, he has worked on app projects since 2010. Jonathan is meandering toward EdTech. He has worked across a diverse range of industries, both in Melbourne and San Francisco, from user-generated content and TV to enterprise insurance software and imaging workflow applications.

Wayfinding in Playable Cities

Troy Innocent, Swinburne University of Technology
Part of CW17

Cities can be sites for self-discovery and transformation; they are also constantly in the process of becoming. Urban codemaking is a framework for decoding and reimagining cities, a programming language for urban space that marks locations in the city using codes enabling multiple alternate readings of that city – by machines, humans, and other entities.

This workshop will invite feedback on the current iteration of this system following a series of interventions into public space situated around experimental and playful approaches to wayfinding using urban codes.

For the best experience with this workshop, you should bring along an iPad running iOS 11.

Troy Innocent is an artist, academic, designer and educator whose hybrid practice traverses multiple disciplines. His public art practice incorporates pervasive game design, augmented reality, and urban design supporting a long-term investigation into interactive and speculative experiences of the city as an emergent process.

In 2017 Innocent was awarded the Melbourne Knowledge Fellowship to research and develop playable cities in the UK and Europe, leading to a cross-disciplinary collaboration with urban designers, policy makers and creative facilitators to transform the city through play. This approach is also central to his public art practice through ‘urban codemaking’ – a system he developed for situating play in cities such as Melbourne, Istanbul, Sydney and Hong Kong.

Innocent teaches pervasive game design at Swinburne University; and is represented by Anna Pappas Gallery.

State of the AR

Iain Anderson, Training Brisbane
Part of CW17

Apple’s new ARKit has launched to some fanfare with iOS 11, and some flashy demos are already available. But is Augmented Reality just another fad? In this presentation, gain a broad overview of what AR is, of what developers have done so far with AR in gaming, education and art, and what it’s likely to be used for next.

Attend if you’ve heard of AR but not really seen it in person, or if you’re on the hunt for how you could potentially use AR in your own classrooms.

Iain Anderson is a video editor, animator, designer, iOS and web developer and Apple Certified Trainer based in Brisbane, Australia. He has taught privately and in tertiary institutions, and has freelanced for Microsoft and the Queensland Government. Comfortable with anything from Quartz Composer to Second Life and Final Cut Pro to Adobe Creative Cloud, he has laid out books, booklets, brochures and business cards; retouched magazine covers and product packaging, shot and edited short films and corporate videos, and animated for HD broadcast shows, film festivals and for the web.

Today, Iain is a Lead Trainer (creating video training courses) for, for whom he also regularly writes tutorial articles. An Apple Certified Trainer and Apple Consultant Network member, Iain is an active trainer and presenter, and runs the Brisbane InDesign User Group. Iain also maintains an active interest in the iOS App Store, where he has published several apps and eBooks, and a free complication for the Apple Watch, called “Roughly”.