As I mentioned in my earlier post, it was great getting back into proper coding. Unfortunately, I had to put the project on hold while I made sure the rest of my studies were up to scratch. I picked it back up when I broke off from university for the Xmas break. I was confident that I could create an application within the three weeks I had before the hand-in, and while I did indeed manage it, it was a lot more difficult than I had anticipated.
Building on top of the basic window I had made previously, I began creating the other elements that I would need. The first task I set myself was to simply render a triangle. Once that was working, I began work on a base class that I could use to render models. As this wasn't the focus of the project, I used the Rastertek tutorials to help speed development along; from the tutorials, I had a simple model loader.
As well as a base model class, I created a pipeline for using shaders and textures. Each had a base class which took the name of its respective file and would generate the required resources. I was really pleased with how this turned out, although obviously there's room for improvement.
As the classes were quick and simple to use, once I'd set up a basic lit shader I was able to introduce more complex lighting models such as specular lighting and then normal-mapped lighting. While I was quite familiar with specular lighting, normal mapping was something I hadn't yet tried to implement. It was interesting to read up on and felt awesome when I managed to get it implemented.
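The core of the normal-mapping and specular work is just vector maths in the pixel shader. As a rough sketch of that maths (written here as plain C++ rather than the actual HLSL, with illustrative names and a Phong-style specular term), assuming the normal map stores a tangent-space normal in [0,1] channels:

```cpp
#include <cmath>

// Minimal 3-component vector helpers for the lighting sketch.
struct Vec3 { float x, y, z; };

static Vec3 normalize(Vec3 v) {
    float len = std::sqrt(v.x*v.x + v.y*v.y + v.z*v.z);
    return { v.x/len, v.y/len, v.z/len };
}
static float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Unpack a tangent-space normal stored in [0,1] texture channels into [-1,1].
Vec3 unpackNormal(Vec3 sample) {
    return normalize({ sample.x*2.0f - 1.0f,
                       sample.y*2.0f - 1.0f,
                       sample.z*2.0f - 1.0f });
}

// Diffuse and specular terms for one light.
// lightDir and viewDir both point AWAY from the surface.
float diffuse(Vec3 n, Vec3 lightDir) {
    float d = dot(n, lightDir);
    return d > 0.0f ? d : 0.0f;
}
float specular(Vec3 n, Vec3 lightDir, Vec3 viewDir, float power) {
    // Reflect the light direction about the normal, then compare with the view.
    float ndl = dot(n, lightDir);
    Vec3 r = normalize({ 2.0f*ndl*n.x - lightDir.x,
                         2.0f*ndl*n.y - lightDir.y,
                         2.0f*ndl*n.z - lightDir.z });
    float s = dot(r, viewDir);
    return s > 0.0f ? std::pow(s, power) : 0.0f;
}
```

The nice part is that once the unpack step exists, the rest of the lighting code is identical to the unmapped version: it just receives a perturbed normal.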
While the model loader was a great tool for the time I had, it had a large restriction: not only did the models have to be .objs, they also had to have texture UV coordinates and normals; if the file was missing either, the converter would break. The workaround, as I didn't have any modelling packages, was to use Unity. Fortunately, there is a script that exports Unity's models to an .obj with all the required data. This also allowed grouped objects to be exported as a single model. While the models it creates are obviously inefficient as well as being quite limited, it was fine for my project. The framework I had created could have used more impressive models had I an artist to hand.
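The fragility comes down to how the face lines are parsed. A Rastertek-style converter assumes every face vertex is the full "v/vt/vn" triplet, so a file with positions only (or positions and normals but no UVs) matches fewer fields than expected. A sketch of that parse, with an illustrative function name:

```cpp
#include <cstdio>

// The converter expects each face vertex as "v/vt/vn". If the UV or normal
// index is missing (e.g. "f 1 2 3" or "f 1//1 2//2 3//3"), sscanf matches
// fewer than three integers and the load has to be treated as a failure.
bool parseFaceVertex(const char* token, int& v, int& vt, int& vn) {
    return std::sscanf(token, "%d/%d/%d", &v, &vt, &vn) == 3;
}
```

A more robust loader would try the alternative layouts ("v", "v/vt", "v//vn") in turn, but for a three-week project, exporting fully specified .objs from Unity was the quicker fix.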
One thing I'm pleased with regarding my framework this time is the consistency I had throughout the code. For objects where it made sense, there was an initialisation function, an update function, a render function and then a clean-up function to make sure resources were freed correctly. Given the short time frame, the code is satisfactorily clean and clear.
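That consistent lifecycle can be expressed as a small interface. This is only a sketch of the pattern (the names here are illustrative, not the project's actual classes):

```cpp
// The consistent object lifecycle: initialise, update, render, then clean
// up so resources are freed correctly.
struct SceneObject {
    virtual ~SceneObject() {}
    virtual bool Initialise() = 0;      // acquire buffers/textures
    virtual void Update(float dt) = 0;  // per-frame logic
    virtual void Render() = 0;          // issue draw calls
    virtual void Shutdown() = 0;        // release resources
};

// Example object tracking its own lifecycle state.
struct Dummy : SceneObject {
    bool alive = false;
    int frames = 0;
    bool Initialise() override { alive = true; return true; }
    void Update(float) override { ++frames; }
    void Render() override {}
    void Shutdown() override { alive = false; }
};
```

With every object following the same four calls, the main loop stays a flat, readable sequence of Initialise / Update / Render / Shutdown passes.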
As the rendering possibilities of the framework increased, I adapted my framework pipeline. It's still not where I would like it - there's still repeated code and it's not as general and accessible as I would like - but it was such that I could quickly add new content, which greatly helped me meet the deadline. I knew I would need some sort of environment to base my scene in but was worried that modelling some form of cityscape would eat up valuable time. Fortunately, Unity came to my rescue once again. There's a terrain generator that allows the creation of a terrain mesh very quickly. Not so fortunately, Unity applies level-of-detail techniques to the mesh to allow a higher-resolution mesh to be rendered. Great if you're rendering such a mesh in-engine. Not so great if you want that mesh as an .obj. As with my need for basic .objs, I was able to find a script to export the terrain mesh to an .obj. With these tools I was able to start mocking together a scene.
It was simple, but again, this wasn't an art course, the focus was on what functionality I could provide in my framework. I was really pleased with how my framework was running with the large model loaded in.
As can be seen in the pictures, I had also procured a font-rendering engine, adapted from another Rastertek tutorial. The solution is an inefficient one, but it was simple, made sense to me, and is adaptable for future use. However, as well as the font, I wanted some form of heads-up display for the application. I was confident that I knew how the method should work in theory and so put this to the test.
I was pleased with the results, but it became immediately apparent that I needed to change the blending method used by the Rastertek tutorial. I changed it from multiplicative to additive, making the transparency work as desired.
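The difference is easy to see if you write out the two blend equations per colour channel. The blend stage computes roughly result = src × SrcFactor (op) dst × DstFactor; the sketch below shows the two combinations in plain C++ (the factor names in the comments refer to the D3D11 blend options, and the clamp stands in for the render target's saturation):

```cpp
// Multiplicative (roughly SrcBlend = DEST_COLOR, DestBlend = ZERO):
// the overlay can only darken whatever is behind it.
float blendMultiplicative(float src, float dst) {
    return src * dst;
}

// Additive (SrcBlend = SRC_ALPHA, DestBlend = ONE): the overlay brightens
// the scene in proportion to its alpha, which is what the HUD needed.
float blendAdditive(float src, float srcAlpha, float dst) {
    float r = src * srcAlpha + dst;
    return r > 1.0f ? 1.0f : r; // clamp to the displayable range
}
```

With the multiplicative form, a bright HUD element over a dark background still comes out dark; with the additive form it always reads on top of the scene.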
With all these things in place I was in a position to start creating a basic game. I was, however, still missing the main focus of the project: a particle effect. I decided to go for snow because rain is what most people trying out particle effects go for. I also thought snow would look more visually interesting and... well... prettier!
As with most new things that I want to try out, I didn't look for help but instead dove in and tried my hand at it. I very quickly had a particle rendering using the geometry shader. My work with the tessellation stages definitely helped me here, as the geometry shader is set up in a very similar (and simpler) manner. Once I had a square billboarding wherever I specified a single point, I created a particle emitter. It wasn't long before I had 5,000 particles falling like snow around the camera. I was really pleased with the results, which I promptly made a video of.

The next step, however, was to make it snow everywhere. While I could fly through the current emitted snow, it had boundaries which you could fly out of, breaking the illusion as you see a cube of snow in front of you. I was aware of earlier tutorials on particle rendering where they simply move the emitter above the player, which keeps particles spawning around the player but of course means the player tends to move beyond the effect. That's dumb in my eyes. So instead I thought about how I wanted it to work. I considered a 3-by-3 grid with the player in the centre box and the outer boxes adjusting based on the player's position, so that they were always covered but without being able to fly out of the boundaries. I adjusted my emitted particles so that they could highlight the grid, then tested my algorithm and, to my delight, it worked, as can be seen in my video, which then goes on to show how the snow looks.

But I knew I could still make the results better. I had already set up a blending state and decided to utilise it to make my snow even better. The snow pops in quite noticeably as the grid is moved about, so, given that I knew the grid element sizes, I made the particles fade out at the smallest distance possible so that you never see them pop in. I was rather pleased with the results.
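The two pieces of the grid idea are simple to state: the 3-by-3 grid is always centred on the cell the player occupies, and particle alpha fades to zero over the last stretch before the grid's outer boundary. A sketch of both, with illustrative names and parameters (not the project's tuned values):

```cpp
#include <cmath>

// The world is divided into square cells of `cellSize`; the emitter grid
// is always centred on the cell the player occupies, so there is always
// one full cell of snow in every direction and the player can never fly
// out of the effect.
struct GridCentre { int cx, cz; };

GridCentre centreCellFor(float playerX, float playerZ, float cellSize) {
    return { (int)std::floor(playerX / cellSize),
             (int)std::floor(playerZ / cellSize) };
}

// Particles near the outer boundary fade out so cells never visibly pop
// in. Alpha falls linearly to zero over the last `fadeBand` units of the
// grid's half-width (1.5 cells from the centre of the middle cell).
float edgeAlpha(float distFromCentre, float cellSize, float fadeBand) {
    float halfWidth = 1.5f * cellSize;
    if (distFromCentre >= halfWidth) return 0.0f;
    float intoBand = distFromCentre - (halfWidth - fadeBand);
    if (intoBand <= 0.0f) return 1.0f;
    return 1.0f - intoBand / fadeBand;
}
```

Whenever `centreCellFor` returns a different cell to last frame, the grid origin snaps one cell over; the fade band hides that snap because everything near the moving edge is already fully transparent.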
I had wondered whether there would be a significant impact on the framerate from doing this blending, but then realised I was already blending, and that the only added cost was calculating the alpha of each point.
The snow currently uses the orange tester texture, as can be seen in previous pictures. My initial plan was to provide an alpha-sporting snow texture for realistic snow. However, I admired the snow effect as it was. This is almost certainly due to my enjoyment of Minecraft. Getting feedback from peers, it was clear that it still reads as snow and so, given that the functionality is there, I left it as I wanted it.
As well as awesome snow, I wanted the weather to have more believable depth. I also wanted to hide the clipping of my, perhaps unnecessarily, large terrain model. For this I simply implemented fog. This is another effect that I hadn't attempted before, but of course it's pretty simple to implement basic fog. The method I used was based on distance from the camera position. This differed from other "simple" methods presented by other people, which create a fog that clearly follows the camera's straight clipping plane, so that objects faded in the centre of the view can come back out of the fog when the camera rotates. I'm not sure why people use this clearly flawed technique when the method I used has a similar performance hit but much nicer results. Such results can be seen below.
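The distinction is just which distance you feed into the fog equation: depth along the view axis (the flawed version) versus the true radial distance from the camera. A sketch of the radial, linear-falloff variant (illustrative parameter names; the project's actual implementation lives in the shader):

```cpp
#include <cmath>

// Linear fog based on true distance from the camera position, not depth
// along the view axis. Because the factor depends only on radial distance,
// it is rotation-invariant: turning the camera cannot pull an object
// "out of" the fog the way plane-based depth fog does.
float fogFactor(float camX, float camY, float camZ,
                float px, float py, float pz,
                float fogStart, float fogEnd) {
    float dx = px - camX, dy = py - camY, dz = pz - camZ;
    float dist = std::sqrt(dx*dx + dy*dy + dz*dz);
    float f = (fogEnd - dist) / (fogEnd - fogStart); // 1 = clear, 0 = full fog
    if (f < 0.0f) return 0.0f;
    if (f > 1.0f) return 1.0f;
    return f;
}
```

The final pixel colour is then a lerp between the lit colour and the fog colour by this factor; the per-pixel cost over the plane-based version is one square root.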
The issue with graphics programming is there is always something shinier that you can do. Still not satisfied, I created a day and night cycle variable which was then used to affect the conditions of the weather. As the weather got more intense, more snow particles were emitted (reaching 45,000) and the visibility was drastically reduced. Then, as the storm eased off, visibility increased and the number of snow particles decreased.
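Driving everything off a single intensity variable keeps the systems in sync: particle count and fog visibility are both just functions of it. A sketch of that mapping (the ranges here are illustrative, not the project's tuned values, apart from the 45,000 particle peak mentioned above):

```cpp
// As storm intensity rises from 0 to 1, the emitted particle count climbs
// towards its peak and the fog visibility distance drops.
int particleCount(float intensity, int minCount, int maxCount) {
    if (intensity < 0.0f) intensity = 0.0f;
    if (intensity > 1.0f) intensity = 1.0f;
    return minCount + (int)((maxCount - minCount) * intensity);
}

float visibility(float intensity, float clearDist, float stormDist) {
    if (intensity < 0.0f) intensity = 0.0f;
    if (intensity > 1.0f) intensity = 1.0f;
    return clearDist + (stormDist - clearDist) * intensity;
}
```

Because both values ease linearly with the same variable, the storm building and easing off reads as one coherent change rather than two separate effects.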
I had managed all of this with time to spare and so added sound to my game. I didn't have enough time to make a really impressive sound engine, so I simply used DirectSound along with Rastertek's tutorial to load in a wave file. I then made some crude sounds, one for the helicopter that the player controls and another for the wind, and had those sounds adjust their volume: the helicopter's for how fast the blades were spinning, and the wind's for how strong the storm was.
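One wrinkle with DirectSound is that `SetVolume` doesn't take a linear 0..1 level; it takes attenuation in hundredths of a decibel, from 0 (full volume) down to -10000 (`DSBVOLUME_MIN`). A sketch of mapping a linear level (blade speed, storm strength) onto that scale, with an illustrative function name:

```cpp
#include <cmath>

// Map a linear 0..1 level onto DirectSound's hundredths-of-a-decibel
// scale: attenuation = 20 * log10(level) dB, expressed in 1/100 dB units.
long toDirectSoundVolume(float linear) {
    if (linear <= 0.0001f) return -10000;          // effectively silent
    if (linear >= 1.0f) return 0;                  // full volume
    long v = (long)(2000.0f * std::log10(linear)); // 20*log10(x), in 1/100 dB
    return v < -10000 ? -10000 : v;
}
```

Skipping this conversion and feeding a linear value straight in makes quiet sounds drop off far too abruptly, since the scale is logarithmic.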
I really hoped I would make something I could be happy with, and I did, so for that I'm proud of myself. I laugh when I see the application at its lowest visibility, as it looks like I worked hard to render a white screen. But when you play through the game, I'm pleased with how it looks, knowing that I made the majority of it from scratch while learning new techniques, all within the space of three weeks (alongside other time-consuming work).