Tuesday, December 2, 2014

Engineering II - Assignment 11 - 2D Sprites


ZIP LINK


Write-Up
The basics of this assignment were to draw 2D sprites as an overlay in our game engine. Part of this overlay needed to be animated or altered at runtime, so we used a texture atlas to create the appropriate effect.

Finally, we included appropriate PIX events to help with debugging code issues. These were primarily focused on separating our 3D meshes from our 2D sprites in the render call.

Technical Write-Up

Sprite Creation

The basic 2D sprite consists of a left, right, top, and bottom side. I created a sprite class to handle the position, texture, vertex buffer, shaders, and UV values.

Each sprite consists of the same type of mesh, containing four vertices, and this never differs between sprites. This means we did not need any type of index buffer for the mesh; we only had to create a vertex buffer to hold the positions of the vertices.
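A minimal sketch of what this mesh setup could look like, assuming a Direct3D 9 device (the struct layout and names here are illustrative, not necessarily my engine's actual code):

    #include <d3d9.h>

    // Illustrative vertex layout for a 2D sprite
    struct sSpriteVertex
    {
        float x, y, z;   // screen-space position (z is unused for the UI overlay)
        DWORD color;     // per-vertex color
        float u, v;      // texture coordinates into the atlas
    };

    // Four vertices, no index buffer; direct3dDevice is an assumed IDirect3DDevice9*
    IDirect3DVertexBuffer9* vertexBuffer = NULL;
    const unsigned int bufferSize = 4 * sizeof( sSpriteVertex );
    direct3dDevice->CreateVertexBuffer( bufferSize, D3DUSAGE_WRITEONLY,
        0, D3DPOOL_DEFAULT, &vertexBuffer, NULL );

    // At draw time the whole sprite is a single strip of two triangles
    direct3dDevice->DrawPrimitive( D3DPT_TRIANGLESTRIP, 0, 2 );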

To set the positions of the mesh's vertices, we allow the programmer to enter values for the overall UI position on screen. The user enters an x and y value, which becomes the center of the sprite, and a width and height, which define the overall rectangular shape.

My code computes the corner points of the sprite by dividing the width and height by two and then adding those values to the x and y positions. This gives the user a defined position and UI size for the sprite when rendering. Currently, I have altered the vertex buffer so that a change in the screen's width does not alter the sprite; the vertical alteration in screen size has not been handled, as it was not required.
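One way this corner math could look is sketched below (a hypothetical helper; the aspect correction shown is just one common way to keep the width stable when the window widens, not necessarily my engine's exact math):

    // Computes the sprite's corners from a center point and a UI size.
    // Width is scaled by the screen's aspect ratio so a wider window
    // doesn't stretch the sprite; height is left alone, as required.
    void ComputeSpriteCorners( const float i_centerX, const float i_centerY,
        const float i_width, const float i_height,
        const float i_screenWidth, const float i_screenHeight,
        float& o_left, float& o_right, float& o_top, float& o_bottom )
    {
        const float halfWidth = ( i_width * 0.5f ) * ( i_screenHeight / i_screenWidth );
        const float halfHeight = i_height * 0.5f;
        o_left = i_centerX - halfWidth;
        o_right = i_centerX + halfWidth;
        o_top = i_centerY + halfHeight;
        o_bottom = i_centerY - halfHeight;
    }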

The image below shows the screen at two different resolutions, but the sprites in each are exactly the same width and height, as specified by the user:


By having the vertex buffer handle this information, the needs of the sprite class are simplified, which means the vertex and fragment shaders for sprites can be simplified as well. I created new shaders that handle the position and color and pass them correctly from the vertex shader to the fragment shader. The vertex shader performs no transformations on the sprite, as it is used as a UI overlay and will not change or distort based on the camera position.

Texture Atlas

The texture atlas is used to select a particular section of a texture for a sprite, and to alter the texture's appearance on the sprite in real time. Positions on the texture in both the x and y coordinates run from zero to one. This means that if your UVs only run from zero to one-half in x, only half of the texture will be used on the sprite.

Using this to our advantage, we can map out a complex texture and then use the texture atlas to alter the sprite's appearance in the world, or have multiple items use the same texture with different coordinates (UVs) for different looks.
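A sketch of how a sub-rectangle of the atlas could be selected (a hypothetical helper for an atlas laid out as a single row of equal cells; UVs always run from 0 to 1 across the whole texture):

    struct sUVRect { float left, right, top, bottom; };

    // Cell i of a single-row atlas covers [ i / cellCount, ( i + 1 ) / cellCount ]
    // horizontally and the full texture vertically
    sUVRect GetAtlasCell( const unsigned int i_cellIndex, const unsigned int i_cellCount )
    {
        sUVRect uvs;
        uvs.left = static_cast<float>( i_cellIndex ) / i_cellCount;
        uvs.right = static_cast<float>( i_cellIndex + 1 ) / i_cellCount;
        uvs.top = 0.0f;
        uvs.bottom = 1.0f;
        return uvs;
    }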

This is done in my game engine using the Numbers image below:



The Numbers image allows the user to press the number keys 1 through 0 on the keyboard (not the number pad keys), and the sprite's image changes to that particular number. If the 'Q' key is pressed, the entire texture is shown over the surface. You can see in the following two images that the number is initially shown as 1, because the '1' key was pressed, and is then shown as 2 when the '2' key is pressed:



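A sketch of the key handling described above, assuming Win32's GetAsyncKeyState for input (not necessarily the input system my engine actually uses):

    #include <Windows.h>

    // Returns the requested atlas cell, or -1 if no number key is down;
    // 'Q' handling (resetting to full UVs) would be a separate check
    int GetRequestedNumberCell()
    {
        for ( int key = '1'; key <= '9'; ++key )
        {
            if ( GetAsyncKeyState( key ) & 0x8000 )
                return key - '1';   // '1' through '9' map to cells 0 through 8
        }
        if ( GetAsyncKeyState( '0' ) & 0x8000 )
            return 9;               // '0' maps to the last cell
        return -1;
    }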
The other sprite I'm showing in-game is the Logo image, shown below:



This image is placed at the center of the screen, and the entire texture is used, with no texture atlas involved. If I only wanted to show part of the smiley face or the wording in the texture, I would only have to alter the UVs of the sprite's texture and it would shift appropriately.

PIX Events  

The debugging events added for PIX were used to separate the 3D mesh draw calls from the 2D sprite draw calls in my render function. This allows the user to easily see which call is being made when, and how many times it is being called. An image of this event is below:


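The events themselves are simple to add; here is a sketch of how they could wrap the draw calls, using Direct3D 9's D3DPERF functions (the event names and colors are illustrative):

    #include <d3d9.h>

    D3DPERF_BeginEvent( D3DCOLOR_XRGB( 0, 255, 0 ), L"Mesh Rendering" );
    // ... draw all 3D meshes ...
    D3DPERF_EndEvent();

    D3DPERF_BeginEvent( D3DCOLOR_XRGB( 0, 0, 255 ), L"Sprite Rendering" );
    // ... draw all 2D sprites ...
    D3DPERF_EndEvent();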
If you expand the Sprite Rendering event, you can see the moment in the draw call when the AlphaBlendable state is set to true. This allows the alpha values of the sprites to be used, making it possible to see through a texture to other textures drawn behind it. This is shown in the image below:


If the alpha state were not set to true, the image would look like the following:


This shows the importance of adjusting the alpha value appropriately for the effect we need. It also means the artist needs to make sure that alpha values are set correctly for all art content.
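For reference, a sketch of the render states an "AlphaBlendable" flag typically maps to in Direct3D 9 (my engine's own state wrapper may differ):

    // Standard alpha blending: source color weighted by its alpha,
    // destination weighted by one minus that alpha
    direct3dDevice->SetRenderState( D3DRS_ALPHABLENDENABLE, TRUE );
    direct3dDevice->SetRenderState( D3DRS_SRCBLEND, D3DBLEND_SRCALPHA );
    direct3dDevice->SetRenderState( D3DRS_DESTBLEND, D3DBLEND_INVSRCALPHA );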

Realized Learning Moments

I honestly didn't think that the PIX debugging statements would be that useful, but they helped me fix an error with my sprite class, as well as with the vertices. I also had a small error where I was using a triangle "list" instead of a triangle "strip", which was not allowing the full texture to be viewed.

I am still very confused about why the sprite's size shouldn't be altered based on the screen resolution. It seems like the game's window manager would handle a particular width or height, adjust the other dimension appropriately, and then use a higher or lower resolution version of the sprite image. Not allowing the sprite to change with screen size seems reasonable for keeping the art asset the same, but it seems unreasonable for game UI management. It's interesting to see how the engineering pipeline can be in conflict with what the game content wants it to be.

Time Used

Reading: 2 hours
Write-Up: 0.75 hours
Technical Write-Up: 1.25 hours
Coding: 7 hours


