Monday 14 September 2015

Making Audio go private

Today I have been working on getting my Audio system implemented, mostly in order to keep the Activity lifecycle happy, i.e. ensuring that media is paused, resumed and released appropriately. My initial implementation was an Audio class built up of static methods, each performing a specific function and, where necessary, calling the others to reuse existing functionality. The class worked, and for a small project it is not a bad model, but I have been reading up on ways to set up a game View and it seems that the Audio class needs to be less public.

Honestly, my first reason for making the methods static was that I had read that getters and setters can be heavy on resources, but further reading has shown that for my audio needs this won't be an issue.

The root of my game framework is now a base activity which can create and switch between views. The views themselves, after a little experimentation, are now extensions of SurfaceView, and each takes a pre-initialised Audio instance as a parameter. This allows the main activity to set up the audio environment in one place, passing itself in as the context, rather than having the Audio set up at the beginning of each new "screen".

The audio environment is now also much more encapsulated: it is only ever passed to one screen, and any reference to it is lost when that screen dies. The activity itself can also ensure that the correct settings are in place for the audio stream, should the app framework be used for other app types.
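As a rough sketch of that structure (GameView, MenuView and the Audio constructor here are placeholders for my own framework classes, not anything built into Android):

    import android.app.Activity;
    import android.media.AudioManager;
    import android.os.Bundle;

    public class MainActivity extends Activity {

        private Audio audio;
        private GameView currentView;

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            // Hardware volume keys should control the music stream.
            setVolumeControlStream(AudioManager.STREAM_MUSIC);
            // One audio environment, set up in one place, with the
            // activity itself passed in as the context.
            audio = new Audio(this);
            switchView(new MenuView(this, audio));
        }

        private void switchView(GameView next) {
            currentView = next;
            setContentView(currentView); // each view extends SurfaceView
        }
    }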

The project I have been working on is a soundboard which emulates the character select screen of Mortal Kombat 3, a childhood favourite of mine. So far I have loaded the 15 game characters' names and their corresponding "introduction" speech files from the game, and on each touch event a random one is played from the list and its name displayed.
The names exist in an array organised in the order the game presents them on screen; it may be necessary at a later date to sort them alphabetically, which we will do if needed. One string array (in the strings.xml resource) holds the names, and another array holds the filenames of the audio files in the same order. The reason I have kept the two separate, instead of trying to parse one array into an audio filename, is that I can localize the name strings and they will still line up with the sound files, whose names are written in English. While the characters' names are spelled the same in most languages, that may not hold in character-based languages, and if I ever work on supporting those it would be great not to have to rewrite my audio file loading.
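A sketch of how the two parallel arrays line up (the array and class names here are just examples, not my actual identifiers):

    import android.content.Context;
    import java.util.Random;

    public class CharacterPicker {

        // res/values/strings.xml would hold two parallel arrays:
        //   <string-array name="character_names">  (localizable names)
        //   <string-array name="character_sounds"> (English audio filenames)

        private final String[] names;
        private final String[] files;
        private final Random random = new Random();

        public CharacterPicker(Context context) {
            names = context.getResources().getStringArray(R.array.character_names);
            files = context.getResources().getStringArray(R.array.character_sounds);
        }

        // Pick a random character; the same index lines up the
        // localized display name with its English sound filename.
        public int pickRandom() {
            return random.nextInt(names.length);
        }

        public String nameAt(int i) { return names[i]; }
        public String fileAt(int i) { return files[i]; } // e.g. "scorpion.ogg"
    }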

There are probably many other ways to link the names in an array to specific sound files, but for now this seems the simplest and works well for this project.

The next step will be to load in a music track to loop in the background and then I can begin working on graphics!
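Conveniently, looping is built into MediaPlayer itself, so that part should be small; a minimal sketch (R.raw.background_theme is just a placeholder resource name):

    import android.content.Context;
    import android.media.MediaPlayer;

    public static MediaPlayer startBackgroundMusic(Context context) {
        // create() prepares the player from a bundled resource for us.
        MediaPlayer music = MediaPlayer.create(context, R.raw.background_theme);
        music.setLooping(true); // restart automatically when the track ends
        music.start();
        return music;
    }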

Monday 7 September 2015

MediaPlayer experiments

When working with MediaPlayer there are some little things you need to keep an eye out for. The first is that the MediaPlayer object is quite happy to continue playing your media track after you have closed the app itself. This is an example of how important it is to release all resources and to monitor which ones persist in the device's memory! While bumming around on the Android developer website, I started looking into the billing API and found another example: the binding to the Google Play billing service that is made when an in-app purchase is attempted. When an app closes, if that binding is not released, it can use up system resources and degrade performance.

My experiments with MediaPlayer began when I found a question on the learnandroid subreddit asking about pausing media with a touch event. From my reading I could see that the poster had not re-prepared the media track after stopping it, something required after stop() but not after pause(). The poster was also a little confused about the touch event itself and where to trigger the media playing. After posting a quick suggestion, I decided to implement a music player in a test app myself, to get to grips with the little things that might confuse a beginner.
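The key difference, as I understand the MediaPlayer state diagram, is that pause() can be followed directly by start(), while stop() drops the player back into a state where it must be prepared again first:

    import android.media.MediaPlayer;
    import java.io.IOException;

    void pauseVersusStop(MediaPlayer player) throws IOException {
        // After pause(), playback can simply resume:
        player.pause();
        player.start();

        // After stop(), the player is in the Stopped state and must
        // be prepared again before start() will work:
        player.stop();
        player.prepare(); // or prepareAsync() with an OnPreparedListener
        player.start();
    }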

Following on from my blank-screen SurfaceView test, I used VLC to convert a ripped MP3 into the Ogg Vorbis format with reduced sound quality (to save on memory) and added it to a new assets folder. From my previous experiments with SoundPool (an app with Pikachu on it that says "Pikachu" when you touch the screen, and plays the computer boot-up sound effect from Pokémon FireRed on launch), I knew that sound files can be retrieved using Android's AssetManager. My first mistake was not putting the new directory in the right place in the file structure, but that didn't become apparent until everything was fully set up.

My first setup had all of the MediaPlayer code implemented in the SurfaceView class itself, which implements Runnable and is called Screen. The implementation involved instantiating an AssetManager, using it to get an AssetFileDescriptor for the file I wanted, and passing that into a new instance of MediaPlayer. The process was all in place, the data source set and the media started, but the app itself didn't work. After adding some Log notes and an exception message to the try-catch block, I found that the AssetManager wasn't retrieving the files in the folder, and looked back at my original Pikachu app to find the difference in structure.
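For reference, the asset-loading part boiled down to something like this (the class name and structure are trimmed to the essentials, and the filename is passed in rather than hard-coded):

    import android.content.Context;
    import android.content.res.AssetFileDescriptor;
    import android.media.MediaPlayer;
    import android.util.Log;
    import java.io.IOException;

    public class Screen {

        private MediaPlayer player;

        public void playFromAssets(Context context, String fileName) {
            player = new MediaPlayer();
            try {
                // The file must live under assets/ (not res/) for
                // AssetManager to find it.
                AssetFileDescriptor afd = context.getAssets().openFd(fileName);
                player.setDataSource(afd.getFileDescriptor(),
                        afd.getStartOffset(), afd.getLength());
                afd.close();
                player.prepare();
                player.start();
            } catch (IOException e) {
                Log.e("Screen", "Could not load asset: " + fileName, e);
            }
        }
    }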

Once the file was able to load and play, I ran the app and, lo and behold, the music played (Nine Inch Nails' "Satellites", for those who need to know!). I was elated. Then I closed the app and... hmm, still music.

I knew this had something to do with releasing resources and the activity lifecycle, so I started looking there for ways to control my music. I began with onStop, onPause and onResume methods inside my SurfaceView, called from the main activity, and implemented checks for whether the media player was valid, playing and so on, stopping and restarting the music as I expected. This did not work. The app was happy enough to stop the music but not to restart it when it was loaded up again. I understood that the app was kept in memory when minimized or obscured, so it was going through the onPause method, but I didn't realize that the onStop method was being called as well!

Going back, I was able to debug my app and place appropriate boolean flags to control the music playing. In the onPause method the music is paused (if it is not already), and in onResume it is started again (if not null, as it is on the initial boot-up of the app). The stop logic was moved into an if block which checks whether the activity is being closed fully (isFinishing()) and, if so, stops playback and releases the media player.
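In the activity, the wiring ended up looking roughly like this (field names are illustrative; the real code lives in the linked classes below):

    @Override
    protected void onPause() {
        super.onPause();
        if (player != null && player.isPlaying()) {
            player.pause();
            wasPlaying = true; // boolean flag: resume later
        }
    }

    @Override
    protected void onResume() {
        super.onResume();
        // player is null on the very first launch, before setup runs
        if (player != null && wasPlaying) {
            player.start();
            wasPlaying = false;
        }
    }

    @Override
    protected void onStop() {
        super.onStop();
        if (isFinishing() && player != null) {
            // Tear down only when the activity is really closing,
            // not just minimized or obscured.
            player.stop();
            player.release();
            player = null;
        }
    }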
With my experiment finally behaving the way I wanted, I was able to begin abstracting the audio into a separate class. I created an Audio class that handles music using static methods, making it accessible throughout my app. Instead of using a constructor to set up the Audio environment, I opted for static methods that only initialize variables if they are going to be used. I currently have a method that takes a context for use in obtaining the assets, but later on I will access this statically from a core class that handles the game.

The Audio class also has onPause, onResume and onStop methods which can be called to ensure that any active media, be it sound FX from the SoundPool or music from the MediaPlayer, is paused and resumed correctly, and destroyed when the app closes.
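A rough sketch of the shape that class has taken (method names are approximations of mine, and I have trimmed the SoundPool side to keep it short):

    import android.content.Context;
    import android.content.res.AssetFileDescriptor;
    import android.media.MediaPlayer;
    import java.io.IOException;

    public class Audio {

        private static Context context;
        private static MediaPlayer music;

        // Passed in once so assets can be retrieved later; eventually
        // this will come from a core game class instead.
        public static void setContext(Context c) {
            context = c;
        }

        public static void playMusic(String fileName) throws IOException {
            if (music == null) {
                music = new MediaPlayer(); // created only when needed
            }
            AssetFileDescriptor afd = context.getAssets().openFd(fileName);
            music.setDataSource(afd.getFileDescriptor(),
                    afd.getStartOffset(), afd.getLength());
            afd.close();
            music.prepare();
            music.start();
        }

        public static void onPause() {
            if (music != null && music.isPlaying()) music.pause();
        }

        public static void onResume() {
            if (music != null) music.start();
        }

        public static void onStop() {
            if (music != null) {
                music.release(); // SoundPool resources are freed here too
                music = null;
            }
        }
    }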

I am really quite happy with the progress I am making in understanding these (albeit simple) aspects of app development. I want to be able to say not only that I understand the theory, but that I can put code on paper to describe the processes and show that I understand how these systems work together on a more holistic level.

The code for my Audio and Screen app is linked below for viewing.

MainActivity Class
Audio Class
SurfaceView Class