Elizabeth Reed & Dominique Falla Part of CW17
An auto-ethnographic perspective on using technology-based devices to boost motivation when lettering by hand. To learn or to improve skills surrounding cursive handwriting, one needs to practice the letterforms. To produce these letterforms we need to build muscle memory, and the best way to do this is through repetition. The standard method of repetition is to repeat the same letterform over and over again—for example, repeating a page of A’s, then B’s, and so on. This method of learning, although useful, has the potential to lose the interest of the learner. By using multiple sensory activities and project-based learning, one can be motivated to complete the otherwise mundane act of repetition. Practice and repetition are necessary if one is seeking to improve skills when writing by hand. There are many areas, both digital and non-digital, that can be explored to improve the process of handwriting practice. No matter what the activity, if the focus is on learning the movement and the strokes of the letterforms, we can start to play and experiment with a range of different techniques. Emerging technologies using creative apps in virtual reality are an exciting development. There is something engaging about writing with ink-filled nibs across paper fibres, then switching over to virtual reality and writing the same letters on a much larger scale.
Peter Mills & Justin Carter Part of CW17
Virtual Reality (VR) is a rapidly growing field, disrupting many industries, such as video games, engineering, architecture, and medical visualization. Designing VR experiences involves the use of digital technology and rendered 3D graphics to create immersive virtual environments. While traditional user interfaces require users to view and interact with a screen, VR places the user inside a virtual environment through the use of a head-mounted display (HMD). This form of user interface has implications for how rendered graphics are perceived and interpreted. One rendering technique used extensively in the design and construction of virtual environments is Non-Photorealistic Rendering (NPR). NPR is primarily concerned with providing opportunities for a wide variety of expressive rendering styles such as toon, hatching and outline shaders.
This paper examines Non-Photorealistic Rendering techniques for virtual reality experiences, specifically focusing on strategies applied to achieve the characteristics of toon, hatching and outline shaders in virtual reality contexts. By first identifying the common features traditionally used for NPR and then reconstructing these features in a virtual reality context, the project illuminates unique considerations for practitioners implementing NPR effects in VR.
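The toon shading mentioned above is, at its core, a quantization of a standard lighting term into discrete bands. The paper's own shaders are not reproduced here, but the idea can be sketched in a few lines of Python (function and parameter names are illustrative; real implementations run per-pixel in shader code):

```python
import math

def toon_shade(normal, light_dir, bands=3):
    """Quantize the Lambertian diffuse term into discrete bands,
    the core idea behind a toon (cel) shader."""
    def norm(v):
        m = math.sqrt(sum(c * c for c in v))
        return tuple(c / m for c in v)
    n, l = norm(normal), norm(light_dir)
    # Lambertian diffuse: dot(N, L), clamped to [0, 1]
    diffuse = max(0.0, sum(a * b for a, b in zip(n, l)))
    # Snap the continuous intensity to the nearest of `bands` levels
    return round(diffuse * (bands - 1)) / (bands - 1)
```

A surface facing the light lands in the brightest band, a surface at a grazing angle in a middle band, giving the flat, stepped look characteristic of toon rendering.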
Lachlan Bunker & Reza Ryan Part of CW17
Previous research has identified several cooperative design patterns used to facilitate cooperation in games. The effect that these patterns have on player experience individually has been researched, and it has been found that closely-coupled cooperative design patterns have the greatest effect on player experience. However, no research has yet been done into the effect that combining these patterns can have on player experience. Therefore, this research investigates whether combining closely-coupled design patterns can improve player experience. Three patterns were chosen to combine: limited resources, interaction with the same object, and complementarity. A prototype game was made for each combination, and participants were asked to play the games and provide feedback on their experience. The combinations were complementarity and interaction, complementarity and limited resources, and interaction and limited resources. Based on the games used in the experiment, the results show that combining patterns had no effect on player experience.
Video game worlds are growing rapidly, creating a large amount of content that digital artists need to produce. To cope with this amount of content, game development companies would have to hire more artists and content creators, which is not economical. Therefore, procedural content generation (PCG) techniques have quickly become a key area in the development of video game worlds. These techniques can be applied to generate a wide variety of things, from entire forests to the individual leaves on a tree. Simulated real-time virtual forests are one of the more common and complex virtual environments in contemporary video games that have to be generated procedurally. In this research, we developed a system that integrates different PCG techniques to automatically generate and simulate a virtual forest in real-time. These techniques include Height Generation, Terrain Texture Generation, Detail Generation, Point Generation, Shadow Map Generation, Life Cycle Simulation and Day/Night Simulation. The implemented day/night system accurately calculates the angle of the sun throughout the day to simulate the life cycles of all flora in the environment in real-time. The optimized system can be easily integrated with any real-time game that requires a forest environment.
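The day/night simulation described above derives the sun's angle from the time of day. The authors' implementation is not given, but a minimal sketch under a simplified assumption (the sun traces a sine arc between sunrise and sunset; all names and constants are illustrative) looks like this in Python:

```python
import math

def sun_elevation(hour, sunrise=6.0, sunset=18.0):
    """Approximate the sun's elevation angle in degrees for a given
    hour of the day, assuming a simple sine arc across the sky."""
    if hour < sunrise or hour > sunset:
        return 0.0  # night: sun at or below the horizon
    t = (hour - sunrise) / (sunset - sunrise)  # 0..1 across the day
    return 90.0 * math.sin(math.pi * t)        # peaks at solar noon
```

A real system would feed this angle into the shadow map and lighting passes each frame, and drive flora life cycles from the accumulated daylight.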
Queensland Conservatorium Research Centre Part of CW17
In August 2007, New York-based composer William Duckworth and pioneering media artist Nora Farrell worked with the Queensland Conservatorium Research Centre on a Fulbright Senior Specialist Grant to create the world premiere of iOrpheus, a groundbreaking iPod Opera merging podcasts with live music, dance, installation, fire and a mobile sound garden in the South Bank Parklands. The project re-enacted the story of the mythical musician Orpheus in five acts across various locations in South Bank Parklands, with audiences shifting between environments as the immersive story unfolded through interdisciplinary installations and performance.
As the Queensland Conservatorium Research Centre embarks on a documentation project to map the impact of this innovative work over the last decade, this panel and performance reflects on the legacy of iOrpheus with a participatory sound walk on mobile phones and site-specific performance in South Bank Parklands.
iOrpheus Reflections celebrates the life and work of Nora Farrell, who sadly passed away in 2017.
To participate in the sound walk and performance, please download the free app AURALITY to your device (iOS and Android).
Dr Leah Barclay, Queensland Conservatorium Research Centre, Griffith University Part of CW17
Sound has a profound ability to make us feel present and connected to our surrounding environment. Recent years have seen a proliferation of site-specific audio works exploring the possibilities of mobile technologies and locative media in place. This means that at any given moment in an urban environment, we could be moving through a sound field of voices, music, memories and sonic art dispersed invisibly throughout the places we inhabit. While this material is available only to those with mobile devices and knowledge of the locative experiences, the advancement of new technologies and the accessibility of mobile devices mean this field presents new opportunities for exploring our social, cultural and ecological environments through sound.
In the 2007 CreateWorld keynote, pioneering media artist Nora Farrell remarked that the future of computing is in the mobile phone. She believed it was the most valuable platform on which to focus our energies as creative artists. As locative media and augmented reality audio shift into mainstream culture, she has clearly been proven correct. This presentation traces creative explorations with locative sound stretching across a decade of practice at the Queensland Conservatorium Research Centre, all inspired by the innovative work of Nora Farrell and composer William Duckworth.
Beginning with the groundbreaking work iOrpheus – an iPod Opera conceived by Duckworth and Farrell – this research explores the impact of iPods, iPhones and iPads across six interconnected projects. Ranging from the first live performance with iPads in remote Australia to spatial sound walks in Times Square and augmented reality audio on the Eiffel Tower, these creative works draw on sound walking, mobile technologies and locative media to investigate the role of sound in achieving presence and connection to place and communities. This presentation highlights the legacy of Nora Farrell’s creative and technical innovation and explores the value of mobile technologies in understanding and interrogating our relationship with places and communities through sound.
Dr. Leah Barclay is a sound artist and researcher working at the intersection of art, science and technology. She specialises in spatial audio for immersive installations and performances and leads research in augmented reality at Griffith University. Her work has been commissioned, performed and exhibited to wide acclaim internationally by the Smithsonian Museum, UNESCO, Ear to the Earth, Al Gore’s Climate Reality and the IUCN. Her augmented reality sound installations have been presented around the world from Times Square in New York City to the Eiffel Tower in Paris. She is currently a postdoctoral research fellow at the Queensland Conservatorium Research Centre Griffith University. www.leahbarclay.com
ARKit is set to be available for 400 million iOS devices this year, making it the largest immersive technology platform in the world. We’ve seen a multitude of examples demonstrating how ARKit is helping developers create some impressive augmented reality apps, from virtual tape measures and navigation to fun things like games. It’s now time we see creatives from all disciplines employing the technology to blend virtual environments with real ones. This workshop will explore the pathway to get you set up with ARKit in Unity, including plane detection, raycasting, and hit testing. Begin your immersive explorations with a taste of AR in Unity.
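The hit testing covered in the workshop is, geometrically, a ray-plane intersection: ARKit detects a plane, and a ray cast from the screen touch point is tested against it. The Unity/ARKit API calls themselves are not shown here; the underlying math can be sketched in Python (purely illustrative, not ARKit's actual code):

```python
def ray_plane_hit(ray_origin, ray_dir, plane_point, plane_normal):
    """Return where a ray intersects a plane, or None if it misses.
    This is the geometric core of an AR hit test against a detected plane."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    denom = dot(ray_dir, plane_normal)
    if abs(denom) < 1e-8:
        return None  # ray is parallel to the plane
    diff = tuple(p - o for p, o in zip(plane_point, ray_origin))
    t = dot(diff, plane_normal) / denom
    if t < 0:
        return None  # plane lies behind the ray origin
    return tuple(o + t * d for o, d in zip(ray_origin, ray_dir))
```

In practice, the frameworks do this for you: you supply a screen point, and the hit test returns the world-space position on the detected plane where virtual content can be anchored.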
Scott Roberts is a 3D Visualisation Artist and Digital Designer based on the Gold Coast. As a PhD candidate pursuing a career in Academia, Scott has had the opportunity to explore new techniques for delivering immersive experiences within VR and AR mediums. His passion is conceptualizing and generating virtual environments with a focus on user experience and the perception of realism.
Thomas Verbeek, 8i Limited Part of CW17
Holo is an augmented reality app that brings realistic, 3D holograms to the masses. It allows users to manipulate volumetric media on their mobile devices using state-of-the-art rendering technology by 8i. Enabling this experience naturally comes with a lot of challenges, ranging from mobile device resource restrictions to explorations in user experience to being ready on the App Store for the worldwide release of ARKit.
This presentation explores some of the challenges we encountered and overcame during the development and release of Holo for iOS. We discuss some of the early assumptions and opportunities that guided the product to its initial release; evaluate design challenges regarding discoverability and interaction; provide an under-the-hood look at how we incorporated ARKit into Holo; and shed insight into the future of 8i’s technology.
Thomas Verbeek is a senior iOS engineer at 8i in Wellington, New Zealand. After a few years of delivering mobile applications for companies across New Zealand, he’s returned to his roots by joining his passion for iOS with computer graphics. He has a keen interest in researching manipulation techniques for volumetric content using ARKit, rendering using Metal and architecting applications using VIPER.
Join Jason in creating audio-reactive visuals and a custom-made control panel in the procedural programming platform, TouchDesigner. Perfect for artists, developers, and all kinds of tinkerers, TouchDesigner allows for very flexible programming, from user interfaces and complex real-time geometry to data visualisation and interactive environments.
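In TouchDesigner the audio-reactive wiring is done visually with operators rather than code, but the basic pattern is simple: measure the loudness of an audio buffer and map it onto a visual parameter. A minimal sketch of that mapping in Python (names and constants are illustrative):

```python
import math

def audio_reactive_scale(samples, base=1.0, gain=2.0):
    """Map the RMS loudness of an audio buffer onto a visual scale
    parameter, the basic pattern behind audio-reactive visuals."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return base + gain * rms
```

Silence leaves the visual at its base size; louder audio pushes the parameter up, and the same idea extends to colour, rotation, or geometry deformation.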
Jason Haggerty is a Gold Coast based artist who has exhibited in galleries nationally, a projection artist who has created immersive and large scale installation, and a VJ who is currently playing music festivals across the country.
CreateWorld is back for another year from 29 November to 1 December, at Griffith University, South Bank, Brisbane.
Specifically for academics, teachers, and technical staff who use Apple technology in the education and digital arts disciplines, the conference features a wide range of academic and technical presenters from the education and industry sectors, and includes keynotes, panel sessions, hypotheticals, hands-on technical workshops, and regular presentation sessions.
Early bird and student sales are now open. Early bird sales close November 10, and all sales close 9pm AEDT, Tuesday November 21, 2017.
Learn more and register »