Welcome!

This site showcases the thesis capstone projects for the Full Sail Mobile Gaming Master of Science program. Students completing the program post their end-of-program project self-evaluations here, examining what went right and what went wrong during production.

The site provides examples of all completed projects, without regard to the quality of the work; final faculty evaluation of your project is separate from your postmortem. It is a place to share student work and to start a dialogue with faculty about completed and upcoming projects.

If you are adding a postmortem for a completed project to this blog, please do your best to provide a meaningful meta-level evaluation of your project. This helps students currently in the program better understand the critical points related to independent production, game development and design, and project management. The template for the blog content, along with instructions, can be found in the first post, from July 2014.

Thank You,
MGMS Faculty

Friday, July 22, 2016

Capstone Game Post-mortem: Sensor Dev


Project Summary: A plugin for Unity3D to provide access to all sensors available on Android platforms.




Author: Jonathan Burnside

Title: Sensor Dev

Genre: Developer tool set

Platform(s): Unity3D, with Android as the target platform

Revenue model: The tool set will be sold on the Unity Asset Store for $30.00.

Development tools/Languages: SensorDev was developed using the Android Studio and Unity3D development tools, and was programmed in both Java and C#.

Project audience: The intended audience for this project is game developers and designers wishing to add sensor support that Unity3D does not currently provide to their games and products. These developers and designers may not possess the skill sets needed to access these sensors, or may simply prefer to purchase a tool set rather than develop, test, and polish the functionality themselves.

Copyright/Reference: SensorDev © 2016 Jonathan Burnside

 

Backstory:

Sound Bite

Your players have sensors; now your games can too!

Executive Summary

With this pack you will be able to easily integrate and use all Android sensors, including those not supported by default in Unity. The tool set supports all standard Android API sensors, as well as manufacturer-specific sensors not defined by the Android API. It also provides support for networking sensor values from an Android device to a build running in the Unity Editor, allowing the developer to quickly test and debug without constantly pushing new builds to a device.

Inspiration

Multiple courses in the MGMS program had developers use sensors for game-play elements. These tasks were extremely easy when using the sensors Unity provides, such as the accelerometer and gyro, but considerably more challenging when using a sensor without default support. After researching how to access these sensors, I found that doing so would require multiple development platforms and languages. I believe this additional complexity puts these sensors out of reach for the average Unity3D user, indicating that a tool set providing this support would be a viable product to sell on the Unity Asset Store.

Ideal

The original, ideal version of my product would consist of two distinct parts: the tool set, and an application demonstrating how these sensors can be used in a game setting. The tool set would provide sensor access in three forms: data direct from Android, data converted to common use cases, and Unity prefab objects that could simply be added to a scene in the editor. The demonstration portion would consist of a 3D scene using most of the sensors as part of game-play elements.
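As a rough sketch of how those three access forms might look from the developer's side (all of the names below are illustrative, not the shipped SensorDev API):

    using UnityEngine;

    // Hypothetical shapes for the three planned access layers.
    public static class SensorDevExample
    {
        // 1. Raw values exactly as reported by Android's SensorManager.
        public static float[] GetRaw(int androidSensorType) { return new float[3]; }

        // 2. The same data converted to a common use case, e.g. scaled
        //    into a fixed range.
        public static float GetNormalized(int androidSensorType) { return 0f; }
    }

    // 3. A prefab component: drop it on an object in the editor and it
    //    drives that object from a sensor with no code required.
    public class SensorDrivenBehaviour : MonoBehaviour
    {
        void Update() { /* read a sensor, apply it to this.transform */ }
    }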

 


The Critique: What went right…


Android Studio


When I started this project, Android Studio was still relatively new, and many developers still favored other editors such as Eclipse. This made me concerned that Android Studio might be missing support the project would require. I became particularly worried when I realized that Android Studio had no built-in path for creating a JAR file (Java Archive (JAR) Files), which I planned to use as the package for my Unity plugin. This did not prove to be an issue: I was able to first add support for creating the JAR file myself, and later discovered that the AAR files produced by Android Studio's built-in library packaging also work as Unity plugins (Create an Android Library).
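For context, once the JAR or AAR sits in the Unity project, C# reaches the Java side through Unity's AndroidJavaClass wrapper, roughly like this (the plugin class and method names here are hypothetical):

    using UnityEngine;

    public class PluginBridge : MonoBehaviour
    {
        void Start()
        {
    #if UNITY_ANDROID && !UNITY_EDITOR
            // Unity picks up classes from .jar/.aar files placed under
            // Assets/Plugins/Android.
            using (var bridge = new AndroidJavaClass("com.example.sensordev.SensorBridge"))
            {
                // Hypothetical static method exposed by the Java side.
                bridge.CallStatic("initialize");
            }
    #endif
        }
    }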

Unity Asset Store art

The tile set package I purchased for creating the demonstration portions of the project, titled "Village Interiors Kit", worked wonderfully. It had assets for exactly what I had in mind. What I did not expect was that it would also come with a demonstration level that was already better than I could likely create myself. All I had to do was add a roof and some optimizations for mobile platforms.

The starting point in the SensorKeep scene, made from a purchased tile set from the Unity Asset Store.

Light Mapping and Occlusion Culling

While time consuming to calculate, adding light mapping and occlusion culling to the scene I purchased was all that was needed to get a strong frame rate on even the weakest mobile devices I had available.

A visualization of the Occlusion Culling system in Unity

Feature creep

During development I realized there was no simple means of debugging the actual values of the sensors, at least not from Unity. Unity Remote would give you the values for the sensors Unity supports by default, but my tool set would be running on the local development machine instead of the mobile device, so no sensor data was available. After speaking with my adviser, we agreed that support for debugging these sensors while running a build in the Unity Editor would be very useful, and my second month of development was spent implementing such a system. The resulting system uses Unity's server/client networking to pass sensor data from a device back to a build of the project running in the Unity Editor on the developer's machine. This lets the developer not only debug the sensor values being used, but also do much more development and testing on their own computer without pushing to a device, which takes a fair amount of time from Unity.
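My system used Unity's built-in server/client networking; purely to illustrate the data flow, here is a minimal stand-in that streams a reading from the device to the machine running the Editor over a plain UDP socket (the IP address and port are assumptions):

    using System.Net;
    using System.Net.Sockets;
    using System.Text;
    using UnityEngine;

    // Device side: sends one reading per frame to the Editor machine,
    // where a matching UdpClient would receive and display it.
    public class SensorSender : MonoBehaviour
    {
        UdpClient client;
        IPEndPoint editor;

        void Start()
        {
            client = new UdpClient();
            // Address of the development machine running the Unity Editor.
            editor = new IPEndPoint(IPAddress.Parse("192.168.1.10"), 9050);
        }

        void Update()
        {
            // Input.acceleration stands in for a value read by the plugin.
            Vector3 a = Input.acceleration;
            byte[] data = Encoding.UTF8.GetBytes(a.x + "," + a.y + "," + a.z);
            client.Send(data, data.Length, editor);
        }
    }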

External responsibilities and distractions

I can't say I would suggest anyone try to complete a Master's degree while taking care of two children under the age of two and working a full-time job, but these distractions did have at least one positive effect on me: they forced me to think through my development process more than I typically would. I tend to be somewhat of a brute-force developer, preferring to have only a sparse, high-level idea of what I am going to do and figuring out most of the details along the way. The benefit of this bottom-up approach is that I do not spend time designing things that may ultimately not work out. The drawback is that the final results tend to be less modular and organized. While I am still no fan of doing a great deal of pre-production planning, spending so much time with my arms full of children forced me to think through some tasks in greater detail than I would have previously, and the result was that when I finally did get to sit down and code, things tended to work out a little better and go a little faster. While I still think most traditional wisdom on pre-production would have you plan to a level of detail that is almost guaranteed not to match what is ultimately developed, doing more planning than I have in the past likely improved my results on this project.

 




The Critique: What went wrong…


Networked Debugging system

While this system proved extremely useful once complete, and is one of the features that sets this tool set apart from similar tools, it also took quite a bit of development time. The system itself did not take long to bring to a basic working level, but the feature broke many times as other portions of the tool set were developed or adjusted. When working, the system allowed for debugging of sensor data, but I did not have good tools for working out why values were not being networked when the system broke. Because of this, I was left to debug from asserts and log files, which are not as informative as an actual debugger.
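The kind of assert-and-log fallback described here might look something like the following (an illustrative helper, not the project's actual code):

    using UnityEngine;

    public static class NetDebug
    {
        public static void LogPacket(string sensorName, float[] values)
        {
            Debug.Assert(values != null && values.Length > 0,
                         "Empty sensor packet for " + sensorName);
            // Shows up in the Editor console, or in adb logcat on device.
            Debug.Log("[SensorDev] " + sensorName + ": " +
                      string.Join(",", System.Array.ConvertAll(values, v => v.ToString())));
        }
    }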

Demonstration program 

I had many unplanned-for issues with the demonstration portion of this project. Most of my ideas at the start of development for using the sensors in generic game-play systems did not pan out. I had also assumed that more ideas for using the sensors would come to me intuitively while I developed the demonstration portion; this was not the case. There are quite a few ways these sensors can be used in games, but trying to put them all into one homogeneous system did not work out.

The Raw Sensor Data scene, while not very exciting, gives an example of how to use every sensor available on a device.

Step Counter & Step Detector

I intended to have an option for translating the player based on either the step counter or step detector sensors. Unfortunately, walking in place is not detected by these sensors, and actually walking around in the real world while staring at a non-AR scene would result in people bumping into things. The step counter and detector can be tricked into believing the user is walking by shaking the device, but this made it impossible to see the screen, rendering the approach useless. I added on-screen virtual controls for translation instead.

Proximity Sensor

I had planned to use the proximity sensor as an indication that the player was about to touch something on screen. This could be used to slow down game-play elements, making it easier to touch fast-moving objects. I quickly realized that the on-screen controls used for translation in the Sensor Keep scene would cause the proximity sensor to trigger constantly. The general concept may work for a game designed with it in mind, but it was not going to work for my intended use.
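The abandoned idea, sketched in a few lines (GetProximity() is a hypothetical stand-in for the tool set's accessor):

    using UnityEngine;

    public class ProximitySlowdown : MonoBehaviour
    {
        public float nearThresholdCm = 5f;

        void Update()
        {
            // Most proximity sensors are effectively binary: near or far.
            float distance = GetProximity(); // hypothetical plugin call
            Time.timeScale = distance < nearThresholdCm ? 0.5f : 1f;
        }

        float GetProximity() { return 100f; } // placeholder value
    }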

Light Sensor

I intended to use the light sensor output to scale the brightness of the lighting in the scene. This technique worked fine on my development machine, but mobile devices required that I bake the lighting to get a decent frame rate. I then tried adjusting the size of the particle effects used to represent lights in the scene based on the light sensor. This worked in practice, but the result was rather subtle; when I had testers try the scene, they never noticed it was happening. I considered adding a tutorial-like system that would tell the user which sensors were being used for what, but this felt like a bad path. It was after this sensor that I decided my general idea of using most of the sensors in one scene was not going to work out as intended, and that I would need to redesign or re-evaluate this portion of the program.
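The particle-scaling idea, roughly (GetLightLux() is a hypothetical stand-in for the tool set's light sensor accessor):

    using UnityEngine;

    public class LightSensorGlow : MonoBehaviour
    {
        public ParticleSystem glow; // particle effect on a light source
        public float maxLux = 1000f;

        void Update()
        {
            float lux = Mathf.Clamp(GetLightLux(), 0f, maxLux);
            var main = glow.main;
            // Darker room -> larger glow, so the effect is visible in the dark.
            main.startSizeMultiplier = Mathf.Lerp(1.5f, 0.5f, lux / maxLux);
        }

        float GetLightLux() { return 300f; } // placeholder value
    }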


External Responsibilities and Distractions

Had I been able to consistently dedicate 40 hours a week to this project, it would likely have been further along after two months than it is today. Even 20 hours a week was well beyond what I was able to dedicate most of the time. Luckily, while life did not allow me to sit at my PC and develop as often as I had hoped, it did allow time to think about the project, which, as mentioned, improved my work rate when I was actually able to put time towards development.


Testing

Getting external sources to test the demonstration portion of my project has been very easy, and very helpful. On the other hand, I have not been successful in getting anyone with development experience to try out the tool set itself. I will not be comfortable releasing this product until the tool set portion of the project has had some external testing.

 

Summary:


At the completion of my time in the MGMS program, there are still three main tasks I would like to complete, and test, before I would be comfortable releasing this tool set for purchase on the Asset Store.

The demonstration portion of the project, as previously mentioned, did not turn out as planned. In hindsight, I do not believe my original plan would even have been the best course of action for this tool, mainly because a developer is more likely to want a simple, lightweight example than a larger, hard-to-follow one. I am considering a few different paths for the demo project, but the front-running idea now is to use what I have, with a much smaller level, for the Rotation Vector based character controller. The smaller level will reduce the file size and likely make it run more smoothly. While this will not demonstrate every feature of the tool set, the raw data scene already provides a simple example of each sensor in use. Any benefit from giving more complex or interesting examples for the remaining sensors is likely overshadowed by the other two tasks left to complete.
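The core of such a Rotation Vector based controller might look like the following sketch (GetRotationVector() stands in for the tool set's accessor, and the axis remap shown is one common way to map Android's right-handed sensor space into Unity's coordinate system):

    using UnityEngine;

    public class RotationVectorLook : MonoBehaviour
    {
        void Update()
        {
            Quaternion q = GetRotationVector(); // hypothetical plugin call
            // Remap right-handed sensor axes into Unity's left-handed space.
            transform.localRotation = new Quaternion(q.x, q.y, -q.z, -q.w);
        }

        Quaternion GetRotationVector() { return Quaternion.identity; } // placeholder
    }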

In addition to the tool set giving the sensor data directly as reported from Android, I planned to provide a slightly easier access system that puts the data into more common use-case formats. I planned to implement this feature while developing the more complex sensor demonstrations, to get a better feel for what the common use cases would be. As the complex demonstrations have been scrapped, this task also was not completed in full. With more experience now, though, I believe the common use cases will mostly be the raw data, or the raw data divided by the sensor's data range, so providing a simpler coding interface to query these values should be very easy for most sensors. The one known exception will be the rotational sensors, like the Rotation Vector and Game Rotation Vector sensors, but having implemented a player controller prefab using these sensors, I have already determined what their more complex common use case will be.
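For most sensors, that common use-case layer reduces to normalizing the raw value by the sensor's reported range, along these lines (illustrative names; on the Java side the range would come from Android's Sensor.getMaximumRange()):

    using UnityEngine;

    public static class SensorNormalize
    {
        // Map a raw reading into [-1, 1] using the sensor's maximum range.
        public static float Normalize(float rawValue, float maxRange)
        {
            return Mathf.Clamp(rawValue / maxRange, -1f, 1f);
        }
    }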

The last thing the project needs is completed documentation. After speaking with my advisers and some other developers, I believe good documentation will be far more valuable than more in-game examples of the sensors' use. I am using Doxygen to automate the creation of the basic documentation, which will let the end user reference all the functionality of the system. In addition to the automated Doxygen documentation, I will need to create tutorials on getting started and on using some of the more advanced features of the system, such as the networked debugging. The last aspect of documentation I want to implement is video tutorials: one demonstrating the capabilities of the system, one walking a user through the basics of getting started, and a last one showing how to use the networked debugging in a variety of use cases.
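Doxygen understands standard C# XML comments, so the reference pages can be generated straight from the source; a documented method might look like this (illustrative signature, not the shipped API):

    public class SensorReader
    {
        float[] latest = new float[3];

        /// <summary>
        /// Returns the most recent raw values reported for the given
        /// Android sensor type.
        /// </summary>
        /// <param name="sensorType">An Android sensor type constant.</param>
        /// <returns>The latest raw values, unmodified.</returns>
        public float[] GetRawValues(int sensorType)
        {
            return latest; // illustrative; a real lookup would key on sensorType
        }
    }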

I believe these last few tasks should not take a huge effort, and with them the result will be better than what I had originally planned for the project. Some of the motivation for the project was lost when Google released a Unity package supporting its Cardboard platform with head tracking, but I do believe the project is still viable and worth finishing and releasing. The main competition for this product, GyroDroid, appears to still be getting sales, but users are complaining about bugs and a lack of support. SensorDev is already a better tool set than GyroDroid: I provide access to a number of sensors that my competitor does not, as well as a system for debugging. If I can finish the remaining tasks and provide a cheaper, more bug-free, or at least better supported, product, I should be able to make some sales. Also, simply releasing the product will demonstrate a number of abilities that could benefit my future career.

 

References


Java Archive (JAR) Files. (n.d.). Retrieved July 22, 2016, from http://docs.oracle.com/javase/6/docs/technotes/guides/jar/index.html 

Create an Android Library. (n.d.). Retrieved July 22, 2016, from https://developer.android.com/studio/projects/android-library.html

Asset Store - Village Interiors Kit. (n.d.). Retrieved July 25, 2016, from https://www.assetstore.unity3d.com/en/#!/content/17033

Asset Store - GyroDroid. (n.d.). Retrieved July 22, 2016, from https://www.assetstore.unity3d.com/en/#!/content/1701 

Doxygen: Main Page. (n.d.). Retrieved July 22, 2016, from http://www.stack.nl/~dimitri/doxygen/

Motion Sensors. (n.d.). Retrieved July 22, 2016, from https://developer.android.com/guide/topics/sensors/sensors_motion.html#sensors-motion-rotate 

