Rob Keko & Ross McLennan Part of CW17
Since the rising popularity and widespread commercial use of the electric guitar in the 1950s, advances in guitar amplifier design and technology have played a key role in shaping the soundscape, tonal characteristics, recording methods and production styles of contemporary music. In recent years, digital modelling techniques have created new ways of producing sought-after guitar amplifier sounds, which have changed the way producers, artists and guitar players use this technology both in the recording studio and in live performance. This in turn has affected how listeners, concert attendees and music consumers hear and experience recordings and live music. Extensive comparisons between authentic and modelled amplification have been conducted in industry magazines. However, these tend to be simplistic or overtly commercial in nature, with typical yes/no style responses. A more rigorous approach is required, one that captures both gut feeling and a more considered aesthetic response to the two technologies. This paper therefore presents a comparative study between traditional and modelled guitar technology that contextualises these amplifier sounds within fully produced music. It presents an unbiased quantitative and qualitative study of audience reaction to music, recorded using Apple's Logic Pro X software, which includes both amplification styles: authentic and modelled. The paper concludes with the results of the study and reflects upon the future of guitar amplification.
Clara Durbridge & Ross McLennan Part of CW17
Sound and music have been linked to healing since early civilisation. In modern times, studies have likewise demonstrated sound and music to be effective in reducing anxiety, elevated heart rates and blood pressure. The aim of this paper is not to prove or disprove the efficacy of sound as a healing agent, but to define and explore sound healing as a relatively new field of study, and then to incorporate its ideas, techniques and instruments into an original music composition intended to heal, through sonic metaphor, the damaged natural world. The paper documents a journey from one side of the Earth to the other, with nothing but an iPhone, to capture and record the concepts and practices of modern and ancient sound healing. The paper culminates in the incorporation of these ideas and practices into original music created within Apple's flagship music software, Logic Pro X. It is anticipated that this study and its resulting music will inspire other composers and artists who are seeking to experiment with their own creative practice and possibly incorporate aspects of sound healing into their own work.
Braiden Fenech & Justin Carter Part of CW17
Video game design and development has evolved into a profitable and widely accepted creative field that operates within ever-increasing technical capability. This improved capability has facilitated an increase in the visual fidelity achievable within real-time environments. Game artists faced with creating these environments are tasked with maximising both system resource allocation and efficiency in production time. One strategy that has been adopted by artists is to implement a modular design and construction approach when developing environmental elements. Although this approach offers many benefits for artists, the associated skills and techniques are not well defined.
Through an exploration of existing literature and reflection on current practice, this study identifies and evaluates a range of contemporary approaches to modular construction for real-time environments and in the process offers valuable insight for practitioners.
Henry Sun & Justin Carter Part of CW17
Conceptualising and communicating game design ideas amongst teams of game developers can be an enigmatic process. Designers of video games often rely on rapid prototyping and iterative approaches to creating game play experiences. Deep and meaningful experiences are not always easily expressed in words, and as a result initial design intentions are often misinterpreted or poorly communicated. This often leads to designers of games relying on serendipitous approaches as they intuitively move toward design intentions. These approaches are largely derived from traditional models of agile software development, placing little emphasis on the cognitive processes of individuals in the development team. Consequently, approaches grounded in theories of cognition are rarely considered by designers of games. One such area of this field is tangible design, which investigates links between cognitive science and the physical, tactile world. The impact that tangible approaches have on collaborative game design is yet to be thoroughly investigated.
This paper describes a practice-led study that aims to test the influence of tactile, 3D-printed video game assets on cognitive processes and design communication for teams when conceptualising game designs. This is achieved through a review of existing literature in the field, followed by an in-depth analysis of a tangible approach to game level design. Through this process the study presents a deeper understanding of the implications that tangible design strategies have on conceptualising and communicating game designs.
Joel Bennett & Chris Carter Part of CW17
Performance Capture (PCap) is the process of capturing a continuous recording of an actor's movements and emotions using motion capture technology, typically for use in a 3D virtual world. This presents a somewhat unique situation for the actor, who is challenged to imagine their virtual counterparts and a completely abstract, computer-generated world whilst delivering their performance. Central to this paper is the identification of the various factors that affect an actor's abilities during a performance, investigated through professionals' experiences of performance capture and through the exploration of its implications in the creation of a short experimental animation.
Over the past 60 years, there has been much research into the field of algorithmic composition. Techniques have been refined, and processes developed to suit a variety of needs. Recently, however, focus has turned to algorithmic composition for more emotive purposes. Affective Algorithmic Composition (AAC), the product of this research, is a rapidly developing field with many potential applications. In particular, AAC has the potential to solve one of the most prevalent issues in game audio. The research described in this paper covers the implementation of an Affective Algorithmic Composition system in a computer game. The methodology is based upon Design Science principles and takes a pragmatist theoretical perspective. Using Lindenmayer systems and Markov chain theory, a fully functional system will be developed.
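The two generative techniques the abstract names can be combined in a very small way: an L-system grows a symbolic structure, and a Markov chain turns that structure into pitches. The sketch below is illustrative only, with hypothetical rewrite rules and a hypothetical transition table; it is not the authors' system.

```python
import random

# Hypothetical L-system rewrite rules (the classic Fibonacci example).
RULES = {"A": "AB", "B": "A"}

def expand(axiom, generations):
    """Apply the rewrite rules repeatedly to grow a symbol string."""
    s = axiom
    for _ in range(generations):
        s = "".join(RULES.get(ch, ch) for ch in s)
    return s

# Hypothetical first-order Markov transitions over MIDI pitches (C major fragment).
TRANSITIONS = {60: [62, 64], 62: [60, 64, 67], 64: [62, 67], 67: [60, 64]}

def to_melody(symbols, start=60, rng=None):
    """Walk the Markov chain once per L-system symbol to pick pitches."""
    rng = rng or random.Random(0)  # seeded for repeatability
    pitch, melody = start, []
    for _ in symbols:
        melody.append(pitch)
        pitch = rng.choice(TRANSITIONS[pitch])
    return melody

print(expand("A", 4))             # → "ABAABABA"
print(to_melody(expand("A", 4)))  # eight pitches drawn from the chain
```

In an affective system, the interesting step (not shown here) is steering the rules or the transition probabilities from an emotional target such as tension or valence.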
Andy Aubun & Ross McLennan Part of CW17
Mobile devices have not only changed the way we purchase and listen to music, but are changing the way songwriters, composers and producers create. Mobile applications and laptop computers allow composers the freedom to make music anywhere, implementing a myriad of music making apps, software programs and plug-ins designed to simplify and, therefore, democratise music creation. The previously vast and complex arts of music composition and production are now accessible to everyone and, accordingly, traditional methods of music composition and production are no longer standard practice. This paper explores these new standard methods of composition and production through the creation of a commercial song. Using iPhone applications for conceptualising and refining the song, computer-based methods for recording and production using a MacBook Pro laptop, and the Internet for automated mastering, this paper illuminates and catalogues a new standard in creative practice, and redefines traditional roles such as music composer, music producer and sound engineer.
Christopher Osmond & Reza Ryan Part of CW17
The use of Artificial Intelligence for Non-Player Characters in games has increased rapidly, driven by both player expectations and the availability of new hardware and software technology (Dragert et al., 2012). Artificial Intelligence can increase a player's immersion in and experience of a game, as the player sees characters react realistically and dynamically to occurrences. However, there is a lack of generic Artificial Intelligence designs and implementations that employ more complex algorithms yet can be easily integrated and scaled. A common example of a scenario that needs a generic Artificial Intelligence system is an ecosystem of autonomous artificial animals. This research aims to design and implement such a system for a simulated virtual forest environment that resembles forest wildlife. The system will employ Utility AI theory and dynamic considerations to create an ecosystem of autonomous artificial animals. Its generic structure enables the system to be scaled up easily, adding more species to the forest with minimal changes. The design of the system will be presented, along with a walkthrough of its implementation in Unity3D.
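The core idea of Utility AI mentioned above can be sketched briefly: each possible action scores a set of "considerations" normalised to [0, 1], and the agent performs whichever action currently scores highest. The curves, action names and state fields below are hypothetical placeholders, not the paper's implementation (which targets Unity3D in C#).

```python
# Consideration curves map raw state to a normalised utility in [0, 1].
def hunger_curve(hunger):
    """Utility rises directly with hunger (clamped to [0, 1])."""
    return min(max(hunger, 0.0), 1.0)

def safety_curve(predator_near):
    """Most activities lose nearly all utility when a predator is near."""
    return 0.1 if predator_near else 1.0

# Each action multiplies its considerations into a single utility score.
ACTIONS = {
    "graze": lambda s: hunger_curve(s["hunger"]) * safety_curve(s["predator_near"]),
    "flee":  lambda s: 0.9 if s["predator_near"] else 0.0,
    "rest":  lambda s: (1.0 - s["hunger"]) * safety_curve(s["predator_near"]),
}

def choose_action(state):
    """Score every action's utility and return the highest-scoring one."""
    return max(ACTIONS, key=lambda name: ACTIONS[name](state))

print(choose_action({"hunger": 0.8, "predator_near": False}))  # → graze
print(choose_action({"hunger": 0.8, "predator_near": True}))   # → flee
```

Scaling to new species, as the abstract proposes, amounts to supplying a new table of actions and curves rather than new decision logic, which is where the generic structure pays off.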
Michael Drew & Ross McLennan Part of CW17
Traditional table-top roleplaying games offer a more agile, imaginative and physical experience than video games. The table-top roleplaying game Dungeons and Dragons has been popular since its creation in 1974 by American game designers Gary Gygax and Dave Arneson. The game involves players roleplaying characters while the Dungeon Master (or DM) describes the game's world and its narrative and controls destiny with a set of many-sided dice. In recent years, DMs have been attempting to seamlessly integrate sound effects and music into the gameplay to create a more cinematic experience for the character players. This paper explores the efficacy of these attempts and suggests an improved method for the creation and control of interactive music to enhance cinematic-style immersion during gameplay. Utilising Apple's Logic Pro software to explore conventions of film and game music composition and Audiokinetic's Wwise audio middleware for integration into game engines like Unity and Unreal, a final prototype iPhone app will be demonstrated. This prototype has the potential to greatly enhance the Dungeons and Dragons game experience, but also has the capacity to be incorporated into myriad other table-top roleplaying games that exist on the market.
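The kind of interactive music control described above, as found in middleware such as Wwise, often reduces to a game parameter driving per-layer stem volumes. The sketch below is a hypothetical illustration of that idea, not the paper's prototype: a single "tension" value set by the DM fades in successive music layers, each over its own third of the range.

```python
LAYERS = ["ambience", "percussion", "brass"]  # hypothetical stem names

def layer_volumes(tension):
    """Map tension in [0, 1] to per-layer volumes; layer i fades in
    linearly across its own slice of the tension range."""
    span = 1.0 / len(LAYERS)
    volumes = {}
    for i, layer in enumerate(LAYERS):
        start = i * span  # tension value where this layer begins fading in
        volumes[layer] = min(max((tension - start) / span, 0.0), 1.0)
    return volumes

print(layer_volumes(0.5))  # ambience full, percussion half, brass silent
print(layer_volumes(1.0))  # → {'ambience': 1.0, 'percussion': 1.0, 'brass': 1.0}
```

Because the mapping is continuous, the DM can nudge tension gradually and the mix crossfades rather than cutting, which is the essence of the cinematic-style immersion the paper targets.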
Shanice Hayes & Justin Carter Part of CW17
Virtual reality (VR) systems are increasingly utilised as a medium in which to experience video games. These systems incorporate technology that is designed to offer the user an experience of a simulated physical presence within a virtual environment. The acceptance of VR as a platform for gaming has given rise to many new challenges for designers of games. These new challenges represent a disruption in the craft of game design on a scale not experienced since the transition from 2D to 3D graphics. This paper offers insight into the challenges for designers of VR games through the examination of existing strategies and design principles. These principles are then applied in the construction of a creative work that further expounds techniques for practitioners creating VR games.