├── GLOSSARY.md ├── README.md ├── SUMMARY.md ├── appendix-a ├── appendix-a.md ├── capture.png ├── event_browser.png ├── exec_arguments.png ├── java_process.png ├── mesh.png ├── pipeline_state.png ├── renderdoc.png ├── sample.png ├── texture_inputs.png └── texture_outputs.png ├── book.json ├── chapter-01 ├── chapter-01.md ├── hello_world.png ├── intellij.png └── maven_project.png ├── chapter-02 └── chapter-02.md ├── chapter-03 ├── 3d_cartesian_coordinate_system.png ├── alt_3d_cartesian_coordinate_system.png ├── alt_cartesian_coordinate_system.png ├── cartesian_coordinate_system.png ├── chapter-03.md ├── opengl_coordinates.png ├── rendering_pipeline.png ├── rendering_pipeline_2.png ├── righthanded_lefthanded.png ├── triangle_coordinates.png └── triangle_window.png ├── chapter-04 ├── chapter-04.md ├── colored_quad.png ├── dolphin.png └── quad_coordinates.png ├── chapter-05 ├── 2_2_matrix.png ├── chapter-05.md ├── coordinates.png ├── projection_matrix.png ├── projection_matrix_eq.png ├── rectangle.png ├── square_1.png └── square_colored.png ├── chapter-06 ├── chapter-06.md ├── cube_coords.png ├── cube_depth_test.png ├── cube_no_depth_test.png └── transformations.png ├── chapter-07 ├── chapter-07.md ├── cube_texture.png ├── cube_texture_front_face.png ├── cube_texture_top_face.png ├── cube_with_texture.png ├── texture_coordinates.png └── texture_mapping.png ├── chapter-08 ├── actual_movement.png ├── camera_movement.png ├── chapter-08.md ├── new_transf_eq.png ├── prev_transformation_eq.png └── roll_pitch_yaw.png ├── chapter-09 └── chapter-09.md ├── chapter-10 ├── chapter-10.md └── demo.png ├── chapter-11 ├── chapter-11.md ├── diffuse_calc_i.png ├── diffuse_light.png ├── diffuse_light_normals.png ├── directional_light.png ├── dot_product.png ├── light_controls.png ├── light_reflection.png ├── light_types.png ├── normals.png ├── polished_surface.png ├── specular_lightining.png ├── specular_lightining_calc.png ├── spot_light.png ├── spot_light_calc.png ├── spot_light_ii.png 
├── sun_directional_light.png ├── surface.png └── vertex_normals.png ├── chapter-12 ├── chapter-12.md └── skybox.png ├── chapter-13 ├── chapter-13.md ├── exponential_model.png ├── fog.png └── linear_model.png ├── chapter-14 ├── chapter-14.md ├── fragment_normals.png ├── normal_mapping_result.png ├── normal_mapping_result.png.png ├── rock.png ├── rock_normals.png └── surface_normals.png ├── chapter-15 ├── bones_weights.png ├── chapter-15.md ├── mesh_bones_weights_vertices.png ├── node_animations.png ├── screenshot.png └── static_vao_animation_vao.png ├── chapter-16 ├── chapter-16.md ├── listener_at_up.png └── openal_concepts.png ├── chapter-17 ├── cascade_splits.png ├── chapter-17.md ├── render_light_perspective.png ├── result_shadows.png ├── result_shadows_debug.png ├── shadow_concepts_I.png └── shadow_concepts_II.png ├── chapter-18 ├── aabb.svg ├── chapter-18.md ├── object_picking.svg ├── screen_coordinates.png └── screenshot.png ├── chapter-19 ├── buffAlbedo_texture.png ├── buffNormal_texture.png ├── buffSpecular_texture.png ├── chapter-19.md ├── depth_texture.png └── result_shadows.png ├── chapter-20 ├── chapter-20.md ├── indirect-drawing.svg └── screenshot.png ├── chapter-21 ├── chapter-21.md └── screenshot.png ├── cover.jpg ├── screenshot.png └── styles └── pdf.css /GLOSSARY.md: -------------------------------------------------------------------------------- 1 | # Glossary 2 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # 3D Game Development with LWJGL 3 2 | 3 | This online book will introduce the main concepts required to write a 3D game using the LWJGL 3 library. 4 | 5 | [LWJGL](http://www.lwjgl.org/) is a Java library that provides access to native APIs used in the development of graphics \(OpenGL\), audio \(OpenAL\) and parallel computing \(OpenCL\) applications. 
This library leverages the high performance of native OpenGL applications while using the Java language. 6 | 7 | My initial goal was to learn the techniques involved in writing a 3D game using OpenGL. All the information required was available on the internet, but it was not well organized, was sometimes very hard to find, and was occasionally incomplete or even misleading. 8 | 9 | I started to collect some materials and develop some examples, and decided to organize that information in the form of a book. 10 | 11 | [Table of contents](SUMMARY.md). 12 | 13 | You can also check my Vulkan book [here](https://github.com/lwjglgamedev/vulkanbook). 14 | 15 | ## Source Code 16 | 17 | The source code of the samples of this book is on [GitHub](https://github.com/lwjglgamedev/lwjglbook). 18 | 19 | The source code for the book itself is also published on [GitHub](https://github.com/lwjglgamedev/lwjglbook-bookcontents). 20 | 21 | ## License 22 | 23 | The book is licensed under [Attribution-ShareAlike 4.0 International \(CC BY-SA 4.0\)](http://creativecommons.org/licenses/by-sa/4.0/) 24 | 25 | The source code for the book is licensed under [Apache v2.0](https://www.apache.org/licenses/LICENSE-2.0 "Apache v2.0") 26 | 27 | ## Previous version 28 | 29 | The previous version of the book can still be accessed on GitHub. Here are the links: 30 | 31 | * [Book contents](https://github.com/lwjglgamedev/lwjglbook-bookcontents-leg) 32 | * [Source code](https://github.com/lwjglgamedev/lwjglbook-leg) 33 | 34 | **NOTE**: The old version of the book was originally published using a different URL. That site is now part of what GitBook considers legacy content. Unfortunately, I can no longer access that content, nor even delete it (according to GitBook this is not possible). Therefore, if you are accessing the book through a GitBook URL which starts with "https://lwjglgamedev.gitbooks.io/" you are accessing an out-of-sync version (not even in sync with the legacy version of the book, which is still hosted on GitHub). 
35 | 36 | 37 | ## Support 38 | 39 | If you like the book, you can become a [sponsor](https://github.com/sponsors/lwjglgamedev) 40 | 41 | ## Comments are welcome 42 | 43 | Suggestions and corrections are more than welcome \(and if you like the book, please rate it with a star\). Please send them using the discussion forum, and feel free to make whatever corrections you consider appropriate in order to improve the book. 44 | 45 | ## Author 46 | 47 | Antonio Hernández Bejarano 48 | 49 | ## Special Thanks 50 | 51 | To all the readers who have contributed with corrections, improvements and ideas. 52 | -------------------------------------------------------------------------------- /SUMMARY.md: -------------------------------------------------------------------------------- 1 | # Summary 2 | 3 | * [Introduction](README.md) 4 | * [Chapter 01 - First steps](chapter-01/chapter-01.md) 5 | * [Chapter 02 - The Game Loop](chapter-02/chapter-02.md) 6 | * [Chapter 03 - Our first triangle](chapter-03/chapter-03.md) 7 | * [Chapter 04 - Render a quad](chapter-04/chapter-04.md) 8 | * [Chapter 05 - Perspective projection](chapter-05/chapter-05.md) 9 | * [Chapter 06 - Going 3D](chapter-06/chapter-06.md) 10 | * [Chapter 07 - Textures](chapter-07/chapter-07.md) 11 | * [Chapter 08 - Camera](chapter-08/chapter-08.md) 12 | * [Chapter 09 - Loading more complex models (Assimp)](chapter-09/chapter-09.md) 13 | * [Chapter 10 - GUI (Imgui)](chapter-10/chapter-10.md) 14 | * [Chapter 11 - Lights](chapter-11/chapter-11.md) 15 | * [Chapter 12 - Sky Box](chapter-12/chapter-12.md) 16 | * [Chapter 13 - Fog](chapter-13/chapter-13.md) 17 | * [Chapter 14 - Normal Mapping](chapter-14/chapter-14.md) 18 | * [Chapter 15 - Animations](chapter-15/chapter-15.md) 19 | * [Chapter 16 - Audio](chapter-16/chapter-16.md) 20 | * [Chapter 17 - Cascade shadow maps](chapter-17/chapter-17.md) 21 | * [Chapter 18 - 3D Object Picking](chapter-18/chapter-18.md) 22 | * [Chapter 19 - Deferred Shading](chapter-19/chapter-19.md) 23 | * [Chapter 20 - Indirect drawing 
(static models)](chapter-20/chapter-20.md) 24 | * [Chapter 21 - Indirect drawing (animated models) and compute shaders](chapter-21/chapter-21.md) 25 | * [Appendix A - OpenGL Debugging](appendix-a/appendix-a.md) -------------------------------------------------------------------------------- /appendix-a/appendix-a.md: -------------------------------------------------------------------------------- 1 | # Appendix A - OpenGL Debugging 2 | 3 | Debugging an OpenGL program can be a daunting task. Most of the time you end up with a black screen and have no means of knowing what’s going on. In order to alleviate this problem we can use some existing tools that will provide more information about the rendering process. 4 | 5 | In this appendix we will describe how to use the [RenderDoc](https://renderdoc.org/ "RenderDoc") tool to debug our LWJGL programs. RenderDoc is a graphics debugging tool that can be used with Direct3D, Vulkan and OpenGL. In the case of OpenGL it only supports the core profile from 3.2 up to 4.5. 6 | 7 | So let’s get started. You need to download and install the RenderDoc version for your OS. Once installed, when you launch it you will see something similar to this. 8 | 9 | ![RenderDoc](/appendix-a/renderdoc.png) 10 | 11 | The first step is to configure RenderDoc to execute and monitor our samples. In the “Capture Executable” tab we need to set up the following parameters: 12 | 13 | * **Executable path**: In our case this should point to the JVM launcher \(For instance, “C:\Program Files\Java\jdk-XX\bin\java.exe”\). 14 | * **Working Directory**: This is the working directory that will be set up for your program. In our case it should be set to the target directory where Maven dumps the result. By setting it this way, the dependencies will be found \(For instance, "D:/Projects/booksamples/chapter-18/target"\). 15 | * **Command line arguments**: This will contain the arguments required by the JVM to execute our sample. 
In our case, we just pass the jar to be executed \(For instance, “-jar chapter-18-1.0.jar”\). 16 | 17 | ![Exec arguments](/appendix-a/exec_arguments.png) 18 | 19 | There are many other options in this tab to configure the capture options. You can consult their purpose in the [RenderDoc documentation](https://renderdoc.org/docs/index.html "RenderDoc documentation"). Once everything has been set up you can execute your program by clicking on the “Launch” button. You will see something like this: 20 | 21 | ![Sample](/appendix-a/sample.png) 22 | 23 | Once the process is launched, you will see that a new tab has been added which is named “java \[PID XXXX\]” \(where the XXXX number represents the PID, the process identifier, of the java process\). 24 | 25 | ![Java process](java_process.png) 26 | 27 | From that tab you can capture the state of your program by pressing the “Trigger capture” button. Once a capture has been generated, you will see a little snapshot in that same tab. 28 | 29 | ![Capture](capture.png) 30 | 31 | If you double click on that capture, all the data collected will be loaded and you can start inspecting it. The “Event Browser” panel will be populated with all the relevant OpenGL calls executed during one rendering cycle. 32 | 33 | ![Event browser](event_browser.png) 34 | 35 | You can see the following events: 36 | * Three depth passes for the cascade shadows. 37 | * The geometry pass. If you click on a glDrawElements event and select the “Mesh” tab, you can see the mesh that was drawn, along with its input and output for the vertex shader. 38 | * The lighting pass. 39 | 40 | You can also view the input textures used for that drawing operation \(by clicking the “Texture Viewer” tab\). 41 | 42 | ![Texture inputs](texture_inputs.png) 43 | 44 | In the center panel you can see the output, and on the right panel you can see the list of textures used as an input. You can also view the output textures one by one. 
This is very illustrative to show how deferred shading works. 45 | 46 | ![Texture outputs](texture_outputs.png) 47 | 48 | As you can see, this tool provides valuable information about what’s happening when rendering. It can save precious time while debugging rendering problems. It can even display information about the shaders used in the rendering pipeline. 49 | 50 | ![Pipeline state](pipeline_state.png) 51 | 52 | -------------------------------------------------------------------------------- /appendix-a/capture.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/lwjglgamedev/lwjglbook-bookcontents/7846ef9d8f270675839c7004475d80db13ee5f63/appendix-a/capture.png -------------------------------------------------------------------------------- /appendix-a/event_browser.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/lwjglgamedev/lwjglbook-bookcontents/7846ef9d8f270675839c7004475d80db13ee5f63/appendix-a/event_browser.png -------------------------------------------------------------------------------- /appendix-a/exec_arguments.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/lwjglgamedev/lwjglbook-bookcontents/7846ef9d8f270675839c7004475d80db13ee5f63/appendix-a/exec_arguments.png -------------------------------------------------------------------------------- /appendix-a/java_process.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/lwjglgamedev/lwjglbook-bookcontents/7846ef9d8f270675839c7004475d80db13ee5f63/appendix-a/java_process.png -------------------------------------------------------------------------------- /appendix-a/mesh.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/lwjglgamedev/lwjglbook-bookcontents/7846ef9d8f270675839c7004475d80db13ee5f63/appendix-a/mesh.png -------------------------------------------------------------------------------- /appendix-a/pipeline_state.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/lwjglgamedev/lwjglbook-bookcontents/7846ef9d8f270675839c7004475d80db13ee5f63/appendix-a/pipeline_state.png -------------------------------------------------------------------------------- /appendix-a/renderdoc.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/lwjglgamedev/lwjglbook-bookcontents/7846ef9d8f270675839c7004475d80db13ee5f63/appendix-a/renderdoc.png -------------------------------------------------------------------------------- /appendix-a/sample.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/lwjglgamedev/lwjglbook-bookcontents/7846ef9d8f270675839c7004475d80db13ee5f63/appendix-a/sample.png -------------------------------------------------------------------------------- /appendix-a/texture_inputs.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/lwjglgamedev/lwjglbook-bookcontents/7846ef9d8f270675839c7004475d80db13ee5f63/appendix-a/texture_inputs.png -------------------------------------------------------------------------------- /appendix-a/texture_outputs.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/lwjglgamedev/lwjglbook-bookcontents/7846ef9d8f270675839c7004475d80db13ee5f63/appendix-a/texture_outputs.png -------------------------------------------------------------------------------- /book.json: -------------------------------------------------------------------------------- 1 | { 2 | "plugins": [ 3 | "katex" 4 | ], 5 
| "pluginsConfig": {} 6 | } -------------------------------------------------------------------------------- /chapter-01/chapter-01.md: -------------------------------------------------------------------------------- 1 | # Chapter 01 - First steps 2 | 3 | In this book we will learn the principal techniques involved in developing 3D games using [OpenGL](https://www.opengl.org). We will develop our samples in Java and we will use the Lightweight Java Game Library [LWJGL](http://www.lwjgl.org/). The LWJGL library enables access to low-level APIs (Application Programming Interfaces) such as OpenGL from Java. 4 | 5 | LWJGL is a low-level API that acts like a wrapper around OpenGL. Therefore, if your idea is to start creating 3D games in a short period of time, maybe you should consider other alternatives like [jMonkeyEngine](https://jmonkeyengine.org) or [Unity](https://unity.com). By using this low-level API you will have to go through many concepts and write lots of lines of code before you see the results. The benefit of doing it this way is that you will get a much better understanding of 3D graphics and you will always be in control. 6 | 7 | Regarding Java, you will need at least Java 17. So the first step, in case you do not have that version installed, is to download the Java SDK. You can download the OpenJDK binaries [here](https://jdk.java.net/17/). In any case, this book assumes that you have a moderate understanding of the Java language. If this is not your case, you should first get proper knowledge of the language. 8 | 9 | The best way to work with the examples is to clone the GitHub repository. You can either download the whole repository as a zip and extract it in your desired folder or clone it by using the following command: `git clone https://github.com/lwjglgamedev/lwjglbook.git`. In both cases you will have a root folder which contains one sub-folder per chapter. 10 | 11 | You may use whichever Java IDE you want in order to run the samples. 
You can download IntelliJ IDEA, which has good support for Java. IntelliJ provides a free open source version, the Community version, which you can download from here: [https://www.jetbrains.com/idea/download/](https://www.jetbrains.com/idea/download/ "IntelliJ"). 12 | 13 | ![IntelliJ](intellij.png) 14 | 15 | When you open the source code in your IDE you can either open the root folder which contains all the chapters (the parent project) or each chapter independently. In the first case, please remember to properly set the working directory for each chapter to the root folder of that chapter. The samples will try to access files using relative paths assuming that the root folder is the chapter base folder. 16 | 17 | For building our samples we will be using [Maven](https://maven.apache.org/). Maven is already integrated in most IDEs and you can directly open the different samples inside them. Just open the folder that contains the chapter sample and IntelliJ will detect that it is a Maven project. 18 | ![Maven](maven_project.png) 19 | 20 | Maven builds projects based on an XML file named `pom.xml` \(Project Object Model\) which manages project dependencies \(the libraries you need to use\) and the steps to be performed during the build process. Maven follows the principle of convention over configuration, that is, if you stick to the standard project structure and naming conventions the configuration file does not need to explicitly say where source files are or where compiled classes should be located. 21 | 22 | This book does not intend to be a Maven tutorial, so please find the information about it on the web in case you need it. The source code root folder defines a parent project which specifies the plugins to be used and collects the versions of the libraries employed. Therefore you will find there a `pom.xml` file which defines common actions and properties for all the chapters, which are handled as sub-projects. 
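The parent / sub-project layout described above follows the standard Maven aggregator shape. As a rough, hypothetical sketch (the artifact and module names here are illustrative, check the repository for the real file), the root `pom.xml` looks something like this:

```xml
<!-- Illustrative aggregator pom; the real file also pins the plugin and
     dependency versions shared by all the chapter sub-projects -->
<project xmlns="http://maven.apache.org/POM/4.0.0">
    <modelVersion>4.0.0</modelVersion>
    <groupId>org.lwjglb</groupId>
    <artifactId>lwjglbook-root</artifactId>
    <version>1.0</version>
    <packaging>pom</packaging>
    <modules>
        <module>chapter-01</module>
        <module>chapter-02</module>
        <!-- one module per chapter -->
    </modules>
</project>
```

Running `mvn clean package` from the root builds every chapter in one go, while each chapter can still be built on its own from its sub-folder.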
23 | 24 | LWJGL 3.1 introduced some changes in the way that the project is built. Now the base code is much more modular, and we can be more selective in the packages that we want to use instead of using a giant monolithic jar file. This comes at a cost: you now need to carefully specify the dependencies one by one. But the [download](https://www.lwjgl.org/download) page includes a fancy tool that generates the pom file for you. In our case, we will first be using the GLFW and OpenGL bindings. You can check what the pom file looks like in the source code. 25 | 26 | The LWJGL platform dependency already takes care of unpacking native libraries for your platform, so there's no need to use other plugins \(such as `mavennatives`\). We just need to set up three profiles to set a property that will configure the LWJGL platform. The profiles will set up the correct values of that property for the Windows, Linux and Mac OS families. 27 | 28 | ```xml 29 | <profiles> 30 | <profile> 31 | <id>windows-profile</id> 32 | <activation> 33 | <os> 34 | <family>Windows</family> 35 | </os> 36 | </activation> 37 | <properties> 38 | <native.target>natives-windows</native.target> 39 | </properties> 40 | </profile> 41 | <profile> 42 | <id>linux-profile</id> 43 | <activation> 44 | <os> 45 | <family>Linux</family> 46 | </os> 47 | </activation> 48 | <properties> 49 | <native.target>natives-linux</native.target> 50 | </properties> 51 | </profile> 52 | <profile> 53 | <id>OSX-profile</id> 54 | <activation> 55 | <os> 56 | <family>mac</family> 57 | </os> 58 | </activation> 59 | <properties> 60 | <native.target>natives-osx</native.target> 61 | </properties> 62 | </profile> 63 | </profiles> 64 | ``` 65 | 66 | Inside each project, the LWJGL platform dependency will use the correct property established in the profile for the current platform. 67 | 68 | ```xml 69 | <dependency> 70 | <groupId>org.lwjgl</groupId> 71 | <artifactId>lwjgl-platform</artifactId> 72 | <version>${lwjgl.version}</version> 73 | <classifier>${native.target}</classifier> 74 | </dependency> 75 | ``` 76 | 77 | Besides that, every project generates a runnable jar \(one that can be executed by typing java -jar name\_of\_the\_jar.jar\). This is achieved by using the maven-jar-plugin, which creates a jar with a `MANIFEST.MF` file with the correct values. The most important attribute for that file is `Main-Class`, which sets the entry point for the program. In addition, all the dependencies are set as entries in the `Class-Path` attribute for that file. 
In order to execute it on another computer, you just need to copy the main jar file and the lib directory \(with all the jars included there\), which are located under the target directory. 78 | 79 | The jars that contain LWJGL classes also contain the native libraries. LWJGL will also take care of extracting them and adding them to the path where the JVM will look for libraries. 80 | 81 | This chapter's source code is taken directly from the getting started sample on the LWJGL site [http://www.lwjgl.org/guide](http://www.lwjgl.org/guide). Although it is very well documented, let's go through the source code and explain the most relevant parts. Since pasting the source code for each class would make it impossible to read, we will include fragments. In order for you to better understand the class to which each specific fragment belongs, we will always include the class header in each fragment. We will use three dots (`...`) to indicate that there is more code before / after the fragment. The sample is contained in a single class named `HelloWorld` which starts like this: 82 | ```java 83 | package org.lwjglb; 84 | 85 | import org.lwjgl.Version; 86 | import org.lwjgl.glfw.*; 87 | import org.lwjgl.opengl.GL; 88 | import org.lwjgl.system.MemoryStack; 89 | 90 | import java.nio.IntBuffer; 91 | 92 | import static org.lwjgl.glfw.Callbacks.glfwFreeCallbacks; 93 | import static org.lwjgl.glfw.GLFW.*; 94 | import static org.lwjgl.opengl.GL11.*; 95 | import static org.lwjgl.system.MemoryStack.stackPush; 96 | import static org.lwjgl.system.MemoryUtil.NULL; 97 | 98 | public class HelloWorld { 99 | 100 | // The window handle 101 | private long window; 102 | 103 | public static void main(String[] args) { 104 | new HelloWorld().run(); 105 | } 106 | ... 107 | } 108 | ``` 109 | 110 | The class just stores a reference to a window handle (we will see what this means later on), and in the `main` method we just call the `run` method. 
Let's start dissecting that method: 111 | ```java 112 | public class HelloWorld { 113 | ... 114 | public void run() { 115 | System.out.println("Hello LWJGL " + Version.getVersion() + "!"); 116 | 117 | init(); 118 | loop(); 119 | 120 | // Free the window callbacks and destroy the window 121 | glfwFreeCallbacks(window); 122 | glfwDestroyWindow(window); 123 | 124 | // Terminate GLFW and free the error callback 125 | glfwTerminate(); 126 | glfwSetErrorCallback(null).free(); 127 | } 128 | ... 129 | } 130 | ``` 131 | This method just calls the `init` method to initialize the application and then calls the `loop` method, which is basically an endless loop that renders to a window. When the `loop` method is finished we just need to free some resources created during initialization (the GLFW window). Let's start with the `init` method. 132 | 133 | ```java 134 | public class HelloWorld { 135 | ... 136 | private void init() { 137 | // Setup an error callback. The default implementation 138 | // will print the error message in System.err. 139 | GLFWErrorCallback.createPrint(System.err).set(); 140 | 141 | // Initialize GLFW. Most GLFW functions will not work before doing this. 142 | if (!glfwInit()) 143 | throw new IllegalStateException("Unable to initialize GLFW"); 144 | 145 | // Configure GLFW 146 | glfwDefaultWindowHints(); // optional, the current window hints are already the default 147 | glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE); // the window will stay hidden after creation 148 | glfwWindowHint(GLFW_RESIZABLE, GLFW_TRUE); // the window will be resizable 149 | 150 | // Create the window 151 | window = glfwCreateWindow(300, 300, "Hello World!", NULL, NULL); 152 | if (window == NULL) 153 | throw new RuntimeException("Failed to create the GLFW window"); 154 | 155 | // Setup a key callback. It will be called every time a key is pressed, repeated or released. 
156 | glfwSetKeyCallback(window, (window, key, scancode, action, mods) -> { 157 | if (key == GLFW_KEY_ESCAPE && action == GLFW_RELEASE) 158 | glfwSetWindowShouldClose(window, true); // We will detect this in the rendering loop 159 | }); 160 | ... 161 | } 162 | ... 163 | } 164 | ``` 165 | We start by invoking [GLFW](https://www.glfw.org/), which is a library to handle GUI components \(windows, etc.\) and events \(key presses, mouse movements, etc.\) with an OpenGL context attached in a straightforward way. Currently, you cannot use Swing or AWT directly to render with OpenGL. If you want to use AWT you can check [lwjgl3-awt 166 | ](https://github.com/LWJGLX/lwjgl3-awt), but in this book we will stick with GLFW. We start by initializing the GLFW library and setting some parameters for window initialization (such as whether it is resizable or not). The window is created by calling `glfwCreateWindow`, which receives the window's width and height and the window title. This function returns a handle, which we need to store so we can use it with any other GLFW related function. After that, we set a keyboard callback, that is, a function that will be called when a key is pressed. In this case we just want to detect if the `ESC` key is pressed to close the window. Let's continue with the `init` method: 167 | ```java 168 | public class HelloWorld { 169 | ... 170 | private void init() { 171 | ... 
172 | // Get the thread stack and push a new frame 173 | try (MemoryStack stack = stackPush()) { 174 | IntBuffer pWidth = stack.mallocInt(1); // int* 175 | IntBuffer pHeight = stack.mallocInt(1); // int* 176 | 177 | // Get the window size passed to glfwCreateWindow 178 | glfwGetWindowSize(window, pWidth, pHeight); 179 | 180 | // Get the resolution of the primary monitor 181 | GLFWVidMode vidmode = glfwGetVideoMode(glfwGetPrimaryMonitor()); 182 | 183 | // Center the window 184 | glfwSetWindowPos( 185 | window, 186 | (vidmode.width() - pWidth.get(0)) / 2, 187 | (vidmode.height() - pHeight.get(0)) / 2 188 | ); 189 | } // the stack frame is popped automatically 190 | 191 | // Make the OpenGL context current 192 | glfwMakeContextCurrent(window); 193 | // Enable v-sync 194 | glfwSwapInterval(1); 195 | 196 | // Make the window visible 197 | glfwShowWindow(window); 198 | } 199 | ... 200 | } 201 | ``` 202 | 203 | Although we will explain it in the next chapters, you will see here a key class in LWJGL, which is `MemoryStack`. As has been said before, LWJGL provides wrappers around native libraries (C-based functions). Java does not have the concept of pointers (at least thinking in C terms), so passing structures to C functions is not a straightforward task. In order to share those structures, and to have pass-by-reference parameters, such as in the example above, we need to allocate memory which can be accessed by native code. LWJGL provides the `MemoryStack` class, which allows us to allocate native-accessible memory / structures which is automatically cleaned (in fact, it is returned to a pool-like structure so it can be reused) when we go out of the scope where the `stackPush` method was called. Every native-accessible memory / structure is instantiated through this stack class. In the sample above we need to call `glfwGetWindowSize` to get the window dimensions. 
The values are returned using a pass-by-reference approach, so we need to allocate two ints (in the form of two `IntBuffer`s). With that information and the dimensions of the monitor we can center the window, set up OpenGL, enable v-sync (more on this in the next chapter) and finally show the window. 204 | 205 | Now we need an endless loop to continuously render something: 206 | ```java 207 | public class HelloWorld { 208 | ... 209 | private void loop() { 210 | // This line is critical for LWJGL's interoperation with GLFW's 211 | // OpenGL context, or any context that is managed externally. 212 | // LWJGL detects the context that is current in the current thread, 213 | // creates the GLCapabilities instance and makes the OpenGL 214 | // bindings available for use. 215 | GL.createCapabilities(); 216 | 217 | // Set the clear color 218 | glClearColor(1.0f, 0.0f, 0.0f, 0.0f); 219 | 220 | // Run the rendering loop until the user has attempted to close 221 | // the window or has pressed the ESCAPE key. 222 | while (!glfwWindowShouldClose(window)) { 223 | glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT); // clear the framebuffer 224 | 225 | glfwSwapBuffers(window); // swap the color buffers 226 | 227 | // Poll for window events. The key callback above will only be 228 | // invoked during this call. 229 | glfwPollEvents(); 230 | } 231 | } 232 | ... 233 | } 234 | ``` 235 | We first create the OpenGL context, set up the clear color and perform a clear operation (over the color and depth buffers) in each loop, polling for keyboard events to detect if the window should be closed. We will explain these concepts in detail in the next chapters. However, just for the sake of completeness, rendering is done to a target, in this case a buffer which contains color information and depth values (for 3D). After we have finished rendering to these buffers, we just need to inform GLFW that the buffer is ready for presentation by calling `glfwSwapBuffers`. 
GLFW will maintain several buffers so we can perform render operations over one buffer while the other one is presented in the window (if not, we would have flickering artifacts). 236 | 237 | If you have your environment correctly set up you should be able to execute it and see a window with a red background. 238 | 239 | ![Hello World](hello_world.png) 240 | 241 | The source code of this chapter is located [here](https://github.com/lwjglgamedev/lwjglbook/tree/main/chapter-01). 242 | 243 | [Next chapter](../chapter-02/chapter-02.md) 244 | -------------------------------------------------------------------------------- /chapter-01/hello_world.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/lwjglgamedev/lwjglbook-bookcontents/7846ef9d8f270675839c7004475d80db13ee5f63/chapter-01/hello_world.png -------------------------------------------------------------------------------- /chapter-01/intellij.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/lwjglgamedev/lwjglbook-bookcontents/7846ef9d8f270675839c7004475d80db13ee5f63/chapter-01/intellij.png -------------------------------------------------------------------------------- /chapter-01/maven_project.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/lwjglgamedev/lwjglbook-bookcontents/7846ef9d8f270675839c7004475d80db13ee5f63/chapter-01/maven_project.png -------------------------------------------------------------------------------- /chapter-02/chapter-02.md: -------------------------------------------------------------------------------- 1 | # Chapter 02 - The Game Loop 2 | 3 | In this chapter we will start developing our game engine by creating the game loop. The game loop is the core component of every game. 
It is basically an endless loop which is responsible for periodically handling user input, updating the game state and rendering to the screen.

You can find the complete source code for this chapter [here](https://github.com/lwjglgamedev/lwjglbook/tree/main/chapter-02).

## The basis

The following snippet shows the structure of a game loop:

```java
while (keepOnRunning) {
    input();
    update();
    render();
}
```

The `input` method is responsible for handling user input (key strokes, mouse movements, etc.), the `update` method is responsible for updating the game state (enemy positions, AI, etc.) and, finally, the `render` method draws the current state to the screen. So, is that all? Are we finished with game loops? Well, not yet. The above snippet has many pitfalls. First of all, the speed that the game loop runs at will be different depending on the machine it runs on. If the machine is fast enough, the user will not even be able to see what is happening in the game. Moreover, that game loop will consume all the machine's resources.

First of all, we may want to control separately the period at which the game state is updated and the period at which the game is rendered to the screen. Why do we do this? Well, updating our game state at a constant rate is more important, especially if we use some physics engine. On the contrary, if our rendering is not done in time it makes no sense to render old frames while processing our game loop. We have the flexibility to skip some frames.

## Implementation

Prior to examining the game loop, let's create the supporting classes that will form the core of the engine. We will first create an interface that will encapsulate the game logic. By doing this we will make our game engine reusable across the different chapters. This interface will have methods to initialize the game assets (`init`), handle user input (`input`), update the game state (`update`) and clean up the resources (`cleanup`).
```java
package org.lwjglb.engine;

import org.lwjglb.engine.graph.Render;
import org.lwjglb.engine.scene.Scene;

public interface IAppLogic {

    void cleanup();

    void init(Window window, Scene scene, Render render);

    void input(Window window, Scene scene, long diffTimeMillis);

    void update(Window window, Scene scene, long diffTimeMillis);
}
```

As you can see, there are some class instances which we have not defined yet (`Window`, `Scene` and `Render`) and a parameter named `diffTimeMillis` which holds the milliseconds elapsed between invocations of those methods.

Let's start with the `Window` class. In this class we will encapsulate all the invocations of the GLFW library to create and manage a window. Its structure is like this:

```java
package org.lwjglb.engine;

import org.lwjgl.glfw.GLFWVidMode;
import org.lwjgl.system.MemoryUtil;
import org.tinylog.Logger;

import java.util.concurrent.Callable;

import static org.lwjgl.glfw.Callbacks.glfwFreeCallbacks;
import static org.lwjgl.glfw.GLFW.*;
import static org.lwjgl.opengl.GL11.*;
import static org.lwjgl.system.MemoryUtil.NULL;

public class Window {

    private final long windowHandle;
    private int height;
    private Callable<Void> resizeFunc;
    private int width;
    ...
    ...
    public static class WindowOptions {
        public boolean compatibleProfile;
        public int fps;
        public int height;
        public int ups = Engine.TARGET_UPS;
        public int width;
    }
}
```

As you can see, it defines some attributes to store the window handle, its width and height, and a callback function which will be invoked any time the window is resized.
It also defines an inner class to set up some options that control window creation:

* `compatibleProfile`: Controls whether we want to use functions from previous versions (deprecated functions) or not.
* `fps`: Defines the target frames per second (FPS). A value equal to or less than zero means that we do not want to set a target, but instead use the monitor refresh rate as the target FPS. In order to do so, we will use v-sync (that is, the number of screen updates to wait from the time `glfwSwapBuffers` was called before swapping the buffers and returning).
* `height`: Desired window height.
* `width`: Desired window width.
* `ups`: Defines the target number of updates per second (initialized to a default value).

Let's examine the constructor of the `Window` class:

```java
public class Window {
    ...
    public Window(String title, WindowOptions opts, Callable<Void> resizeFunc) {
        this.resizeFunc = resizeFunc;
        if (!glfwInit()) {
            throw new IllegalStateException("Unable to initialize GLFW");
        }

        glfwDefaultWindowHints();
        glfwWindowHint(GLFW_VISIBLE, GL_FALSE);
        glfwWindowHint(GLFW_RESIZABLE, GL_TRUE);

        glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
        glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 2);
        if (opts.compatibleProfile) {
            glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_COMPAT_PROFILE);
        } else {
            glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
            glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE);
        }

        if (opts.width > 0 && opts.height > 0) {
            this.width = opts.width;
            this.height = opts.height;
        } else {
            glfwWindowHint(GLFW_MAXIMIZED, GLFW_TRUE);
            GLFWVidMode vidMode = glfwGetVideoMode(glfwGetPrimaryMonitor());
            width = vidMode.width();
            height = vidMode.height();
        }

        windowHandle = glfwCreateWindow(width, height, title, NULL, NULL);
        if (windowHandle == NULL) {
            throw new RuntimeException("Failed to create the GLFW window");
        }

        glfwSetFramebufferSizeCallback(windowHandle, (window, w, h) -> resized(w, h));

        glfwSetErrorCallback((int errorCode, long msgPtr) ->
                Logger.error("Error code [{}], msg [{}]", errorCode, MemoryUtil.memUTF8(msgPtr))
        );

        glfwSetKeyCallback(windowHandle, (window, key, scancode, action, mods) -> {
            keyCallBack(key, action);
        });

        glfwMakeContextCurrent(windowHandle);

        if (opts.fps > 0) {
            glfwSwapInterval(0);
        } else {
            glfwSwapInterval(1);
        }

        glfwShowWindow(windowHandle);

        int[] arrWidth = new int[1];
        int[] arrHeight = new int[1];
        glfwGetFramebufferSize(windowHandle, arrWidth, arrHeight);
        width = arrWidth[0];
        height = arrHeight[0];
    }
    ...
    public void keyCallBack(int key, int action) {
        if (key == GLFW_KEY_ESCAPE && action == GLFW_RELEASE) {
            glfwSetWindowShouldClose(windowHandle, true); // We will detect this in the rendering loop
        }
    }
    ...
}
```

We start by setting some window hints to hide the window and make it resizable. After that, we set the OpenGL version and select either the core or the compatibility profile depending on the window options. Then, if we have not set a preferred width and height, we get the primary monitor dimensions to set the window size. We then create the window by calling `glfwCreateWindow` and set some callbacks to react when the window is resized or to detect window termination (when the `ESC` key is pressed). If we want to manually set a target FPS, we invoke `glfwSwapInterval(0)` to disable v-sync. Finally, we show the window and get the frame buffer size (the portion of the window used to render).
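As an aside, GLFW functions such as `glfwGetFramebufferSize` return several values by writing into the arrays (or buffers) we pass in, because Java has no output parameters. The following standalone sketch mimics that pattern with a stand-in method (`getSize` is a hypothetical example, not a real GLFW call):

```java
public class OutParams {

    // Stand-in for a native call such as glfwGetFramebufferSize: the callee
    // writes its results into the single-element arrays the caller provides.
    static void getSize(int[] outWidth, int[] outHeight) {
        outWidth[0] = 1280;
        outHeight[0] = 720;
    }

    public static void main(String[] args) {
        int[] w = new int[1];
        int[] h = new int[1];
        getSize(w, h);
        // Read the "returned" values back out of the arrays
        System.out.println(w[0] + "x" + h[0]);
    }
}
```

The same idea applies to the `IntBuffer` variant used in chapter 01: the buffer is just a container the native function fills in.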
The rest of the methods of the `Window` class are for cleaning up resources, the resize callback, some getters for the window size, and methods to poll events and to check whether the window should be closed.

```java
public class Window {
    ...
    public void cleanup() {
        glfwFreeCallbacks(windowHandle);
        glfwDestroyWindow(windowHandle);
        glfwTerminate();
        GLFWErrorCallback callback = glfwSetErrorCallback(null);
        if (callback != null) {
            callback.free();
        }
    }

    public int getHeight() {
        return height;
    }

    public int getWidth() {
        return width;
    }

    public long getWindowHandle() {
        return windowHandle;
    }

    public boolean isKeyPressed(int keyCode) {
        return glfwGetKey(windowHandle, keyCode) == GLFW_PRESS;
    }

    public void pollEvents() {
        glfwPollEvents();
    }

    protected void resized(int width, int height) {
        this.width = width;
        this.height = height;
        try {
            resizeFunc.call();
        } catch (Exception excp) {
            Logger.error("Error calling resize callback", excp);
        }
    }

    public void update() {
        glfwSwapBuffers(windowHandle);
    }

    public boolean windowShouldClose() {
        return glfwWindowShouldClose(windowHandle);
    }
    ...
}
```

The `Scene` class will hold the elements of the 3D scene (models, etc.).
For now it is just an empty placeholder:

```java
package org.lwjglb.engine.scene;

public class Scene {

    public Scene() {
    }

    public void cleanup() {
        // Nothing to be done here yet
    }
}
```

For now, the `Render` class is another placeholder that simply clears the screen:

```java
package org.lwjglb.engine.graph;

import org.lwjgl.opengl.GL;
import org.lwjglb.engine.Window;
import org.lwjglb.engine.scene.Scene;

import static org.lwjgl.opengl.GL11.*;

public class Render {

    public Render() {
        GL.createCapabilities();
    }

    public void cleanup() {
        // Nothing to be done here yet
    }

    public void render(Window window, Scene scene) {
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    }
}
```

Now we can implement the game loop in a new class named `Engine` which starts like this:

```java
package org.lwjglb.engine;

import org.lwjglb.engine.graph.Render;
import org.lwjglb.engine.scene.Scene;

public class Engine {

    public static final int TARGET_UPS = 30;
    private final IAppLogic appLogic;
    private final Window window;
    private Render render;
    private boolean running;
    private Scene scene;
    private int targetFps;
    private int targetUps;

    public Engine(String windowTitle, Window.WindowOptions opts, IAppLogic appLogic) {
        window = new Window(windowTitle, opts, () -> {
            resize();
            return null;
        });
        targetFps = opts.fps;
        targetUps = opts.ups;
        this.appLogic = appLogic;
        render = new Render();
        scene = new Scene();
        appLogic.init(window, scene, render);
        running = true;
    }

    private void cleanup() {
        appLogic.cleanup();
        render.cleanup();
        scene.cleanup();
        window.cleanup();
    }

    private void resize() {
        // Nothing to be done yet
    }
    ...
}
```

The `Engine` class receives in its constructor the title of the window, the window options and a reference to the implementation of the `IAppLogic` interface. In the constructor it creates instances of the `Window`, `Render` and `Scene` classes. The `cleanup` method just invokes the `cleanup` methods of the other classes to free their resources. The game loop is defined in the `run` method, which looks like this:

```java
public class Engine {
    ...
    private void run() {
        long initialTime = System.currentTimeMillis();
        float timeU = 1000.0f / targetUps;
        float timeR = targetFps > 0 ? 1000.0f / targetFps : 0;
        float deltaUpdate = 0;
        float deltaFps = 0;

        long updateTime = initialTime;
        while (running && !window.windowShouldClose()) {
            window.pollEvents();

            long now = System.currentTimeMillis();
            deltaUpdate += (now - initialTime) / timeU;
            deltaFps += (now - initialTime) / timeR;

            if (targetFps <= 0 || deltaFps >= 1) {
                appLogic.input(window, scene, now - initialTime);
            }

            if (deltaUpdate >= 1) {
                long diffTimeMillis = now - updateTime;
                appLogic.update(window, scene, diffTimeMillis);
                updateTime = now;
                deltaUpdate--;
            }

            if (targetFps <= 0 || deltaFps >= 1) {
                render.render(window, scene);
                deltaFps--;
                window.update();
            }
            initialTime = now;
        }

        cleanup();
    }
    ...
}
```

The loop starts by calculating two parameters, `timeU` and `timeR`, which control the maximum elapsed time between updates (`timeU`) and render calls (`timeR`), in milliseconds. When those periods are consumed we need either to update the game state or to render.
In the latter case, if the target FPS is set to `0` we rely on the v-sync refresh rate, so we just set that value to `0`. The loop starts by polling the events over the window; after that, we get the current time in milliseconds and accumulate the elapsed time towards the update and render periods. If we have consumed the maximum elapsed time for a render (or rely on v-sync), we process user input by calling `appLogic.input`. If we have surpassed the maximum update elapsed time, we update the game state by calling `appLogic.update`. If we have consumed the maximum elapsed time for a render (or rely on v-sync), we trigger the render by calling `render.render`.

When the loop exits we call the `cleanup` method to free resources.

Finally, the `Engine` class is completed like this:

```java
public class Engine {
    ...
    public void start() {
        running = true;
        run();
    }

    public void stop() {
        running = false;
    }
}
```

A brief note on threading: GLFW requires being initialized from the main thread, and polling of events should also be done in that thread. Therefore, instead of creating a separate thread for the game loop, which is what you would commonly see in games, we will execute everything from the main thread. This is why we do not create a new `Thread` in the `start` method.
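The accumulator bookkeeping in `run` can be easier to grasp in isolation. The following standalone sketch (the `simulate` method is illustrative, not part of the engine) reproduces the `deltaUpdate` logic with a simulated clock to show that updates fire at roughly the target UPS no matter how often the loop iterates:

```java
public class LoopTimingDemo {

    // Simulate the deltaUpdate accumulator from Engine.run with a fake clock:
    // the loop wakes up every stepMs milliseconds for totalMs milliseconds
    // and we count how many updates would fire at the given target UPS.
    static int simulate(int targetUps, long stepMs, long totalMs) {
        float timeU = 1000.0f / targetUps; // ms that must elapse per update
        float deltaUpdate = 0;
        int updates = 0;
        long initialTime = 0;
        for (long now = stepMs; now <= totalMs; now += stepMs) {
            deltaUpdate += (now - initialTime) / timeU;
            if (deltaUpdate >= 1) {
                updates++; // appLogic.update would be invoked here
                deltaUpdate--;
            }
            initialTime = now;
        }
        return updates;
    }

    public static void main(String[] args) {
        // A loop iterating every 7 ms still produces roughly 30 updates per
        // simulated second when targetUps is 30, even though it iterates
        // about 142 times.
        System.out.println(simulate(30, 7, 1000));
    }
}
```

The leftover fraction stays in the accumulator between iterations, which is what keeps the update rate stable even when loop iterations do not line up with the update period.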
Finally, we just simplify the `Main` class to this:

```java
package org.lwjglb.game;

import org.lwjglb.engine.*;
import org.lwjglb.engine.graph.Render;
import org.lwjglb.engine.scene.Scene;

public class Main implements IAppLogic {

    public static void main(String[] args) {
        Main main = new Main();
        Engine gameEng = new Engine("chapter-02", new Window.WindowOptions(), main);
        gameEng.start();
    }

    @Override
    public void cleanup() {
        // Nothing to be done yet
    }

    @Override
    public void init(Window window, Scene scene, Render render) {
        // Nothing to be done yet
    }

    @Override
    public void input(Window window, Scene scene, long diffTimeMillis) {
        // Nothing to be done yet
    }

    @Override
    public void update(Window window, Scene scene, long diffTimeMillis) {
        // Nothing to be done yet
    }
}
```

We just create the `Engine` instance and start it in the `main` method. The `Main` class also implements the `IAppLogic` interface, which for now is empty.
[Next chapter](../chapter-03/chapter-03.md)
# Chapter 04 - More on render

In this chapter we will continue talking about how OpenGL renders things.
We will draw a quad instead of a triangle and add additional data to the `Mesh`, such as a color for each vertex.

You can find the complete source code for this chapter [here](https://github.com/lwjglgamedev/lwjglbook/tree/main/chapter-04).

# Mesh modification

As we said at the beginning, we want to draw a quad. A quad can be constructed by using two triangles, as shown in the next figure.

![Quad coordinates](quad_coordinates.png)

As you can see, each of the two triangles is composed of three vertices. The first one is formed by the vertices V1, V2 and V4 \(the orange one\) and the second one is formed by the vertices V4, V2 and V3 \(the green one\). Vertices are specified in a counter-clockwise order, so the float array to be passed will be \[V1, V2, V4, V4, V2, V3\]. Thus, the data for that shape could be:

```java
float[] positions = new float[] {
    -0.5f, 0.5f, 0.0f,
    -0.5f, -0.5f, 0.0f,
    0.5f, 0.5f, 0.0f,
    0.5f, 0.5f, 0.0f,
    -0.5f, -0.5f, 0.0f,
    0.5f, -0.5f, 0.0f,
};
```

The code above still presents some issues. We are repeating coordinates to represent the quad: we are passing the V2 and V4 coordinates twice. With this small shape it may not seem a big deal, but imagine a much more complex 3D model. We would be repeating the coordinates many times, as in the figure below \(where a vertex can be shared between six triangles\).

![Dolphin](dolphin.png)

In the end we would need much more memory because of that duplicated information. But that is not even the major problem: the biggest problem is that we would be repeating processing in our shaders for the same vertex. This is where index buffers come to the rescue. For drawing the quad we only need to specify each vertex once, this way: \(V1, V2, V3, V4\). Each vertex has a position in the array.
V1 has position 0, V2 has position 1, etc.:

| V1 | V2 | V3 | V4 |
| --- | --- | --- | --- |
| 0 | 1 | 2 | 3 |

Then we specify the order in which those vertices should be drawn by referring to their positions:

| 0 | 1 | 3 | 3 | 1 | 2 |
| --- | --- | --- | --- | --- | --- |
| V1 | V2 | V4 | V4 | V2 | V3 |

So we need to modify our `Mesh` class to accept another parameter, an array of indices, and now the number of vertices to draw will be the length of that indices array. Keep in mind also that, up to now, we were just using three floats to represent the position of a vertex, but now we also want to associate a color with each one. Therefore, we need to modify the `Mesh` class like this:

```java
public class Mesh {
    ...
    public Mesh(float[] positions, float[] colors, int[] indices) {
        numVertices = indices.length;
        ...
        // Color VBO
        vboId = glGenBuffers();
        vboIdList.add(vboId);
        FloatBuffer colorsBuffer = MemoryUtil.memCallocFloat(colors.length);
        colorsBuffer.put(0, colors);
        glBindBuffer(GL_ARRAY_BUFFER, vboId);
        glBufferData(GL_ARRAY_BUFFER, colorsBuffer, GL_STATIC_DRAW);
        glEnableVertexAttribArray(1);
        glVertexAttribPointer(1, 3, GL_FLOAT, false, 0, 0);

        // Index VBO
        vboId = glGenBuffers();
        vboIdList.add(vboId);
        IntBuffer indicesBuffer = MemoryUtil.memCallocInt(indices.length);
        indicesBuffer.put(0, indices);
        glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, vboId);
        glBufferData(GL_ELEMENT_ARRAY_BUFFER, indicesBuffer, GL_STATIC_DRAW);
        ...
        MemoryUtil.memFree(colorsBuffer);
        MemoryUtil.memFree(indicesBuffer);
    }
    ...
}
```

After we have created the VBO that stores the positions, we need to create another VBO which will hold the color data. After that, we create another one for the indices.
The process of creating this VBO is similar to the previous ones, but notice that its type is now `GL_ELEMENT_ARRAY_BUFFER`, and since we are dealing with integers we need to create an `IntBuffer` instead of a `FloatBuffer`. The VAO will now contain three VBOs: one for positions, another one for colors, and another one holding the indices that will be used for rendering.

After that, we need to change the drawing call in the `SceneRender` class to use indices:

```java
public class SceneRender {
    ...
    public void render(Scene scene) {
        ...
        scene.getMeshMap().values().forEach(mesh -> {
            glBindVertexArray(mesh.getVaoId());
            glDrawElements(GL_TRIANGLES, mesh.getNumVertices(), GL_UNSIGNED_INT, 0);
        });
        ...
    }
    ...
}
```

The parameters of the `glDrawElements` method are:

* `mode`: Specifies the primitive type for rendering, triangles in this case. No changes here.
* `count`: Specifies the number of elements to be rendered.
* `type`: Specifies the type of the values in the indices data. In this case we are using integers.
* `indices`: Specifies the offset to apply to the indices data to start rendering.

Now we can just create a new `Mesh` with the extra vertex attributes (colors) and the indices in the `Main` class:

```java
public class Main implements IAppLogic {
    ...
    public static void main(String[] args) {
        ...
        Engine gameEng = new Engine("chapter-04", new Window.WindowOptions(), main);
        ...
    }
    ...
    public void init(Window window, Scene scene, Render render) {
        float[] positions = new float[]{
                -0.5f, 0.5f, 0.0f,
                -0.5f, -0.5f, 0.0f,
                0.5f, -0.5f, 0.0f,
                0.5f, 0.5f, 0.0f,
        };
        float[] colors = new float[]{
                0.5f, 0.0f, 0.0f,
                0.0f, 0.5f, 0.0f,
                0.0f, 0.0f, 0.5f,
                0.0f, 0.5f, 0.5f,
        };
        int[] indices = new int[]{
                0, 1, 3, 3, 1, 2,
        };
        Mesh mesh = new Mesh(positions, colors, indices);
        scene.addMesh("quad", mesh);
    }
    ...
}
```

Now we need to modify the shaders, not because of the indices, but to use the color per vertex. The vertex shader (`scene.vert`) looks like this:

```glsl
#version 330

layout (location=0) in vec3 position;
layout (location=1) in vec3 color;

out vec3 outColor;

void main()
{
    gl_Position = vec4(position, 1.0);
    outColor = color;
}
```

In the input parameters, you can see that we receive a new `vec3` for the color, and we just pass it on to be used in the fragment shader (`scene.frag`), which looks like this:

```glsl
#version 330

in vec3 outColor;
out vec4 fragColor;

void main()
{
    fragColor = vec4(outColor, 1.0);
}
```

We just use the input color parameter to set the fragment color. It is important to notice that the color value will be interpolated when used in the fragment shader, so the result will be something like this.
![Colored quad](colored_quad.png)

[Next chapter](../chapter-05/chapter-05.md)

# Chapter 05 - Perspective

In this chapter, we will learn two important concepts: perspective projection (to render far away objects smaller than closer ones) and uniforms (a buffer-like structure to pass additional data to the shaders).

You can find the complete source code for this chapter [here](https://github.com/lwjglgamedev/lwjglbook/tree/main/chapter-05).
## Perspective projection

Let's get back to our nice colored quad we created in the previous chapter. If you look carefully, you will see that the quad is distorted and appears as a rectangle. You can even change the width of the window from 600 pixels to 900 and the distortion will be more evident. What's happening here?

If you revisit our vertex shader code, we are just passing our coordinates directly. That is, when we say that a vertex has a value of 0.5 for the x coordinate, we are telling OpenGL to draw it at x position 0.5 on our screen. The following figure shows the OpenGL coordinates \(just for the x and y axes\).

![Coordinates](coordinates.png)

Those coordinates are mapped, considering our window size, to window coordinates \(which have their origin at the top-left corner of the previous figure\). So, if our window has a size of 900x580, OpenGL coordinates \(1, 0\) will be mapped to coordinates \(900, 0\), creating a rectangle instead of a quad.

![Rectangle](rectangle.png)

But the problem is more serious than that. Modify the z coordinate of our quad from 0.0 to 1.0 and to -1.0. What do you see? The quad is drawn in exactly the same place no matter how it is displaced along the z axis. Why is this happening? Objects that are further away should be drawn smaller than objects that are closer. But we are drawing them with the same x and y coordinates.

But wait. Should this not be handled by the z coordinate? The answer is yes and no. The z coordinate tells OpenGL that an object is closer or farther away, but OpenGL does not know anything about the size of your object. You could have two objects of different sizes, one closer and smaller and one bigger and farther away, that could be projected correctly onto the screen with the same size \(those would have the same x and y coordinates but different z\). OpenGL just uses the coordinates we are passing, so we must take care of this.
We need to correctly project our coordinates.

Now that we have diagnosed the problem, how do we fix it? The answer is using a perspective projection matrix. The perspective projection matrix will take care of the aspect ratio \(the relation between width and height\) of our drawing area so objects won't be distorted. It will also handle the distance, so objects far away from us will be drawn smaller. The projection matrix will also consider our field of view and the maximum distance to be displayed.

For those not familiar with matrices, a matrix is a bi-dimensional array of numbers arranged in rows and columns. Each number inside a matrix is called an element. The order of a matrix is its number of rows and columns. For instance, here you can see a 2x2 matrix \(2 rows and 2 columns\).

![2x2 Matrix](2_2_matrix.png)

Matrices have a number of basic operations that can be applied to them \(such as addition, multiplication, etc.\) that you can consult in a math book. The main characteristic of matrices, related to 3D graphics, is that they are very useful for transforming points in space.

You can think of the projection matrix as a camera, which has a field of view and a minimum and maximum distance. The vision area of that camera is a truncated pyramid. The following picture shows a top view of that area.

![Projection Matrix concepts](projection_matrix.png)

A projection matrix will map 3D coordinates so they can be correctly represented on a 2D screen. The mathematical representation of that matrix is as follows \(don't be scared\).

![Projection Matrix](projection_matrix_eq.png)

Where the aspect ratio is the relation between our screen width and our screen height \($$a=width/height$$\). In order to obtain the projected coordinates of a given point we just need to multiply the projection matrix by the original coordinates.
The result will be another vector that will contain the projected version. 40 | 41 | So we need to handle a set of mathematical entities such as vectors and matrices, and the operations that can be performed on them. We could choose to write all that code on our own from scratch or use an existing library. We will choose the easy path and use a library designed for math operations with LWJGL, called JOML \(Java OpenGL Math Library\). In order to use that library we just need to add another dependency to our `pom.xml` file. 42 | 43 | ```xml 44 | <dependency> 45 |     <groupId>org.joml</groupId> 46 |     <artifactId>joml</artifactId> 47 |     <version>${joml.version}</version> 48 | </dependency> 49 | ``` 50 | 51 | Now that everything has been set up let’s define our projection matrix. We will create a new class named `Projection` which is defined like this: 52 | ```java 53 | package org.lwjglb.engine.scene; 54 | 55 | import org.joml.Matrix4f; 56 | 57 | public class Projection { 58 | 59 |     private static final float FOV = (float) Math.toRadians(60.0f); 60 |     private static final float Z_FAR = 1000.f; 61 |     private static final float Z_NEAR = 0.01f; 62 | 63 |     private Matrix4f projMatrix; 64 | 65 |     public Projection(int width, int height) { 66 |         projMatrix = new Matrix4f(); 67 |         updateProjMatrix(width, height); 68 |     } 69 | 70 |     public Matrix4f getProjMatrix() { 71 |         return projMatrix; 72 |     } 73 | 74 |     public void updateProjMatrix(int width, int height) { 75 |         projMatrix.setPerspective(FOV, (float) width / height, Z_NEAR, Z_FAR); 76 |     } 77 | } 78 | ``` 79 | 80 | As you can see, it relies on the `Matrix4f` class \(provided by the JOML library\) which provides a method to set up a perspective projection matrix named `setPerspective`. This method needs the following parameters: 81 | 82 | * Field of View: The field of view angle in radians. We just use the `FOV` constant for that. 83 | * Aspect Ratio: That is, the relationship between render width and height. 84 | * Distance to the near plane \(z-near\). 85 | * Distance to the far plane \(z-far\).
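To build some intuition about what `setPerspective` produces, here is a minimal plain-Java sketch \(no JOML, so it stands alone; the class and method names are made up for illustration\). It reproduces the two scaling terms of the standard perspective matrix and the perspective divide by w, which is what makes distant objects smaller:

```java
public class PerspectiveSketch {

    // Projects a camera-space point to normalized device coordinates.
    // y is scaled by 1/tan(fov/2); x is additionally divided by the
    // aspect ratio so the image is not stretched horizontally.
    public static float[] project(float x, float y, float z,
                                  float fovRadians, float aspect) {
        float f = (float) (1.0 / Math.tan(fovRadians / 2.0));
        float clipX = (f / aspect) * x;
        float clipY = f * y;
        // The perspective matrix writes -z into the w component;
        // dividing by it shrinks objects as they move away.
        float w = -z;
        return new float[]{clipX / w, clipY / w};
    }

    public static void main(String[] args) {
        float fov = (float) Math.toRadians(60.0);
        float aspect = 900f / 580f;
        float[] near = project(0.5f, 0.5f, -1.0f, fov, aspect);
        float[] far = project(0.5f, 0.5f, -2.0f, fov, aspect);
        // The same vertex twice as far away lands at half the offset.
        System.out.println(near[1] + " vs " + far[1]);
    }
}
```

Note how the division by the aspect ratio is exactly what removes the rectangle distortion we saw at the beginning of the chapter.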
86 | 87 | We will store a `Projection` class instance in the `Scene` class and initialize it in the constructor. In addition to that, we will need to handle window resizing, so we provide a new method in the `Scene` class, named `resize`, to recalculate the perspective projection matrix when window dimensions change. 88 | 89 | ```java 90 | public class Scene { 91 |     ... 92 |     private Projection projection; 93 | 94 |     public Scene(int width, int height) { 95 |         ... 96 |         projection = new Projection(width, height); 97 |     } 98 |     ... 99 |     public Projection getProjection() { 100 |         return projection; 101 |     } 102 | 103 |     public void resize(int width, int height) { 104 |         projection.updateProjMatrix(width, height); 105 |     } 106 | } 107 | ``` 108 | 109 | We also need to update the `Engine` class to adapt it to the new `Scene` class constructor parameters and to invoke the `resize` method: 110 | 111 | ```java 112 | public class Engine { 113 |     ... 114 |     public Engine(String windowTitle, Window.WindowOptions opts, IAppLogic appLogic) { 115 |         ... 116 |         scene = new Scene(window.getWidth(), window.getHeight()); 117 |         ... 118 |     } 119 |     ... 120 |     private void resize() { 121 |         scene.resize(window.getWidth(), window.getHeight()); 122 |     } 123 |     ... 124 | } 125 | ``` 126 | 127 | ## Uniforms 128 | 129 | Now that we have the infrastructure to calculate the perspective projection matrix, how do we use it? We need to use it in our shader, and it should be applied to all the vertices. At first, you could think of bundling it in the vertex input \(like the coordinates and the colors\). In this case we would be wasting lots of space since the projection matrix is common to every vertex. You may also think of multiplying the vertices by the matrix in the Java code. But then, our VBOs would be useless and we would not be using the processing power available in the graphics card. 130 | 131 | The answer is to use “uniforms”.
Uniforms are global GLSL variables that shaders can use and that we will employ to pass data that is common to all elements or to a model. So, let's start with how uniforms are used in shader programs. We need to modify our vertex shader code to declare a new uniform called `projectionMatrix` and use it to calculate the projected position. 132 | 133 | ```glsl 134 | #version 330 135 | 136 | layout (location=0) in vec3 position; 137 | layout (location=1) in vec3 color; 138 | 139 | out vec3 outColor; 140 | 141 | uniform mat4 projectionMatrix; 142 | 143 | void main() 144 | { 145 |     gl_Position = projectionMatrix * vec4(position, 1.0); 146 |     outColor = color; 147 | } 148 | ``` 149 | 150 | As you can see, we define our `projectionMatrix` as a 4x4 matrix and the position is obtained by multiplying it by our original coordinates. Now we need to pass the values of the projection matrix to our shader. We will create a new class named `UniformsMap` which will allow us to create references to the uniforms and set up their values. It starts like this: 151 | 152 | ```java 153 | package org.lwjglb.engine.graph; 154 | 155 | import org.joml.Matrix4f; 156 | import org.lwjgl.system.MemoryStack; 157 | 158 | import java.util.*; 159 | 160 | import static org.lwjgl.opengl.GL20.*; 161 | 162 | public class UniformsMap { 163 | 164 |     private int programId; 165 |     private Map<String, Integer> uniforms; 166 | 167 |     public UniformsMap(int programId) { 168 |         this.programId = programId; 169 |         uniforms = new HashMap<>(); 170 |     } 171 | 172 |     public void createUniform(String uniformName) { 173 |         int uniformLocation = glGetUniformLocation(programId, uniformName); 174 |         if (uniformLocation < 0) { 175 |             throw new RuntimeException("Could not find uniform [" + uniformName + "] in shader program [" + 176 |                     programId + "]"); 177 |         } 178 |         uniforms.put(uniformName, uniformLocation); 179 |     } 180 |     ...
181 | } 182 | ``` 183 | 184 | As you can see, the constructor receives the identifier of the shader program and it defines a `Map` to store the references (`Integer` instances) to uniforms which are created in the `createUniform` method. Uniform references are retrieved by calling the `glGetUniformLocation` function, which receives two parameters: 185 | 186 | * The shader program identifier. 187 | * The name of the uniform \(it should match the one defined in the shader code\). 188 | 189 | As you can see, uniform creation is independent of the data type associated with it. We will need separate methods for the different types when we want to set the data for a uniform. For now, we just need a method to load a 4x4 matrix: 190 | 191 | ```java 192 | public class UniformsMap { 193 |     ... 194 |     public void setUniform(String uniformName, Matrix4f value) { 195 |         try (MemoryStack stack = MemoryStack.stackPush()) { 196 |             Integer location = uniforms.get(uniformName); 197 |             if (location == null) { 198 |                 throw new RuntimeException("Could not find uniform [" + uniformName + "]"); 199 |             } 200 |             glUniformMatrix4fv(location.intValue(), false, value.get(stack.mallocFloat(16))); 201 |         } 202 |     } 203 | } 204 | ``` 205 | 206 | Now, we can use the code above in the `SceneRender` class: 207 | 208 | ```java 209 | public class SceneRender { 210 |     ... 211 |     private UniformsMap uniformsMap; 212 | 213 |     public SceneRender() { 214 |         ... 215 |         createUniforms(); 216 |     } 217 |     ... 218 |     private void createUniforms() { 219 |         uniformsMap = new UniformsMap(shaderProgram.getProgramId()); 220 |         uniformsMap.createUniform("projectionMatrix"); 221 |     } 222 |     ... 223 |     public void render(Scene scene) { 224 |         ... 225 |         uniformsMap.setUniform("projectionMatrix", scene.getProjection().getProjMatrix()); 226 |         ... 227 |     } 228 | } 229 | ``` 230 | 231 | We are almost done, so you can now launch your program and you will obtain a...
black background without any colored quad. What’s happening? Did we break something? Well, actually no. Remember that we are now simulating the effect of a camera looking at our scene. And we provided two distances, one to the farthest plane \(equal to 1000f\) and one to the closest plane \(equal to 0.01f\). Our coordinates were: 232 | 233 | ```java 234 | float[] positions = new float[]{ 235 |     -0.5f, 0.5f, 0.0f, 236 |     -0.5f, -0.5f, 0.0f, 237 |     0.5f, -0.5f, 0.0f, 238 |     0.5f, 0.5f, 0.0f, 239 | }; 240 | ``` 241 | 242 | That is, our z coordinates are outside the visible zone. Let’s assign them a value of `-0.05f`. Now you will see a giant square like this: 243 | 244 | ![Square 1](square_1.png) 245 | 246 | What is happening now is that we are drawing the quad too close to our camera. We are actually zooming into it. If we now assign a value of `-1.0f` to the z coordinate, we can see our colored quad. 247 | 248 | ```java 249 | public class Main implements IAppLogic { 250 |     ... 251 |     public static void main(String[] args) { 252 |         ... 253 |         Engine gameEng = new Engine("chapter-05", new Window.WindowOptions(), main); 254 |         ... 255 |     } 256 |     ... 257 |     public void init(Window window, Scene scene, Render render) { 258 |         float[] positions = new float[]{ 259 |             -0.5f, 0.5f, -1.0f, 260 |             -0.5f, -0.5f, -1.0f, 261 |             0.5f, -0.5f, -1.0f, 262 |             0.5f, 0.5f, -1.0f, 263 |         }; 264 |         ... 265 |     } 266 |     ... 267 | } 268 | ``` 269 | 270 | ![Square coloured](square_colored.png) 271 | 272 | If we continue pushing the quad backwards we will see it becoming smaller. Notice also that our quad does not appear as a rectangle anymore.
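The clipping behavior we just observed can be checked numerically. The following plain-Java sketch is hypothetical \(the class name is made up, and the constants simply mirror the `Z_NEAR`/`Z_FAR` values of our `Projection` class\); it applies the standard OpenGL depth mapping of the perspective matrix to decide whether a point at a given camera-space z survives clipping:

```java
public class DepthVisibilitySketch {

    static final float Z_NEAR = 0.01f;
    static final float Z_FAR = 1000.0f;

    // Returns true when a point at this camera-space z ends up inside
    // the [-1, 1] normalized-device depth range after projection.
    public static boolean isVisible(float z) {
        float w = -z; // the perspective matrix writes -z into w
        if (w <= 0) {
            return false; // at or behind the camera plane
        }
        // Depth terms of the standard OpenGL perspective matrix.
        float a = (Z_FAR + Z_NEAR) / (Z_NEAR - Z_FAR);
        float b = (2 * Z_FAR * Z_NEAR) / (Z_NEAR - Z_FAR);
        float ndcZ = (a * z + b) / w; // normalized device depth
        return ndcZ >= -1.0f && ndcZ <= 1.0f;
    }

    public static void main(String[] args) {
        System.out.println(isVisible(0.0f));   // original quad at z = 0: clipped
        System.out.println(isVisible(-0.05f)); // very close, but visible
        System.out.println(isVisible(-1.0f));  // comfortably visible
    }
}
```

This confirms why the quad vanished at `z = 0.0f`: that depth lies outside the `[Z_NEAR, Z_FAR]` range the projection matrix maps to the visible zone.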
273 | 274 | [Next chapter](../chapter-06/chapter-06.md) 275 |
1 | # Chapter 08 - Camera 2 | 3 | In this chapter we will learn how to move inside a rendered 3D scene. This capability is like having a camera that can travel inside the 3D world, and in fact that's the term used to refer to it. 4 | 5 | You can find the complete source code for this chapter [here](https://github.com/lwjglgamedev/lwjglbook/tree/main/chapter-08). 6 | 7 | ## Camera introduction 8 | 9 | If you try to search for specific camera functions in OpenGL you will discover that there is no camera concept; in other words, the camera is always fixed, centered at the \(0, 0, 0\) position in the center of the screen. So what we will do is a simulation that gives us the impression that we have a camera capable of moving inside the 3D scene. How do we achieve this? Well, if we cannot move the camera then we must move all the objects contained in our 3D space at once. In other words, if we cannot move a camera we will move the whole world. 10 | 11 | Hence, suppose that we would like to move the camera position along the z axis from a starting position \(Cx, Cy, Cz\) to a position \(Cx, Cy, Cz+dz\) to get closer to the object which is placed at the coordinates \(Ox, Oy, Oz\). 12 | 13 | ![Camera movement](camera_movement.png) 14 | 15 | What we will actually do is move the objects \(all the objects in our 3D space, in fact\) in the opposite direction to the camera's movement. Think about it like the objects being placed on a treadmill.
16 | 17 | ![Actual movement](actual_movement.png) 18 | 19 | A camera can be displaced along the three axes \(x, y and z\) and can also rotate around them \(roll, pitch and yaw\). 20 | 21 | ![Roll pitch and yaw](roll_pitch_yaw.png) 22 | 23 | So basically what we must do is to be able to move and rotate all of the objects of our 3D world. How are we going to do this? The answer is to apply another transformation that will translate all of the vertices of all of the objects in the opposite direction of the movement of the camera and that will rotate them according to the camera rotation. This will be done, of course, with another matrix, the so-called view matrix. This matrix will first perform the translation and then the rotation around the axes. 24 | 25 | Let's see how we can construct that matrix. If you remember from the transformations chapter our transformation equation was like this: 26 | 27 | $$ 28 | \begin{array}{lcl} 29 | Transf & = & \lbrack ProjMatrix \rbrack \cdot \lbrack TranslationMatrix \rbrack \cdot \lbrack RotationMatrix \rbrack \cdot \lbrack ScaleMatrix \rbrack \\ 30 | & = & \lbrack ProjMatrix \rbrack \cdot \lbrack WorldMatrix \rbrack 31 | \end{array} 32 | $$ 33 | 34 | The view matrix should be applied before multiplying by the projection matrix, so our equation now looks like this: 35 | 36 | $$ 37 | \begin{array}{lcl} 38 | Transf & = & \lbrack ProjMatrix \rbrack \cdot \lbrack ViewMatrix \rbrack \cdot \lbrack TranslationMatrix \rbrack \cdot \lbrack RotationMatrix \rbrack \cdot \lbrack ScaleMatrix \rbrack \\ 39 | & = & \lbrack ProjMatrix \rbrack \cdot \lbrack ViewMatrix \rbrack \cdot \lbrack WorldMatrix \rbrack 40 | \end{array} 41 | $$ 42 | 43 | ## Camera implementation 44 | 45 | So let’s start modifying our code to support a camera. First of all we will create a new class called `Camera` which will hold the position and rotation state of our camera as well as its view matrix.
The class is defined like this: 46 | ```java 47 | package org.lwjglb.engine.scene; 48 | 49 | import org.joml.*; 50 | 51 | public class Camera { 52 | 53 | private Vector3f direction; 54 | private Vector3f position; 55 | private Vector3f right; 56 | private Vector2f rotation; 57 | private Vector3f up; 58 | private Matrix4f viewMatrix; 59 | 60 | public Camera() { 61 | direction = new Vector3f(); 62 | right = new Vector3f(); 63 | up = new Vector3f(); 64 | position = new Vector3f(); 65 | viewMatrix = new Matrix4f(); 66 | rotation = new Vector2f(); 67 | } 68 | 69 | public void addRotation(float x, float y) { 70 | rotation.add(x, y); 71 | recalculate(); 72 | } 73 | 74 | public Vector3f getPosition() { 75 | return position; 76 | } 77 | 78 | public Matrix4f getViewMatrix() { 79 | return viewMatrix; 80 | } 81 | 82 | public void moveBackwards(float inc) { 83 | viewMatrix.positiveZ(direction).negate().mul(inc); 84 | position.sub(direction); 85 | recalculate(); 86 | } 87 | 88 | public void moveDown(float inc) { 89 | viewMatrix.positiveY(up).mul(inc); 90 | position.sub(up); 91 | recalculate(); 92 | } 93 | 94 | public void moveForward(float inc) { 95 | viewMatrix.positiveZ(direction).negate().mul(inc); 96 | position.add(direction); 97 | recalculate(); 98 | } 99 | 100 | public void moveLeft(float inc) { 101 | viewMatrix.positiveX(right).mul(inc); 102 | position.sub(right); 103 | recalculate(); 104 | } 105 | 106 | public void moveRight(float inc) { 107 | viewMatrix.positiveX(right).mul(inc); 108 | position.add(right); 109 | recalculate(); 110 | } 111 | 112 | public void moveUp(float inc) { 113 | viewMatrix.positiveY(up).mul(inc); 114 | position.add(up); 115 | recalculate(); 116 | } 117 | 118 | private void recalculate() { 119 | viewMatrix.identity() 120 | .rotateX(rotation.x) 121 | .rotateY(rotation.y) 122 | .translate(-position.x, -position.y, -position.z); 123 | } 124 | 125 | public void setPosition(float x, float y, float z) { 126 | position.set(x, y, z); 127 | recalculate(); 
128 |     } 129 | 130 |     public void setRotation(float x, float y) { 131 |         rotation.set(x, y); 132 |         recalculate(); 133 |     } 134 | } 135 | ``` 136 | 137 | As you can see, besides rotation and position we define some vectors for the forward, up and right directions. This is because we are implementing a free-movement camera: after rotating it, when we move forward we want to move in the direction the camera is pointing, not along a predefined axis. We need those vectors to calculate where the next position will be. Finally, the state of the camera is stored in a 4x4 matrix, the view matrix, so any time we change position or rotation we need to update it. As you can see, when updating the view matrix, we first need to do the rotation and then the translation. If we did the opposite we would not be rotating around the camera position but around the coordinate origin. 138 | 139 | The `Camera` class also provides methods to update the position when moving forward, up or to the right. In these methods, the view matrix is used to calculate where the forward, up or right directions point according to the current state, and the position is updated accordingly. We use the fantastic JOML library to do these calculations for us while keeping the code quite simple. 140 | 141 | ## Using the Camera 142 | 143 | We will store a `Camera` instance in the `Scene` class, so let's go for the changes: 144 | 145 | ```java 146 | public class Scene { 147 |     ... 148 |     private Camera camera; 149 |     ... 150 |     public Scene(int width, int height) { 151 |         ... 152 |         camera = new Camera(); 153 |     } 154 |     ... 155 |     public Camera getCamera() { 156 |         return camera; 157 |     } 158 |     ... 159 | } 160 | ``` 161 | 162 | It would be nice to control the camera with our mouse. In order to do so, we will create a new class to handle mouse events so we can use them to update the camera rotation. Here's the code for that class.
163 | 164 | ```java 165 | package org.lwjglb.engine; 166 | 167 | import org.joml.Vector2f; 168 | 169 | import static org.lwjgl.glfw.GLFW.*; 170 | 171 | public class MouseInput { 172 | 173 | private Vector2f currentPos; 174 | private Vector2f displVec; 175 | private boolean inWindow; 176 | private boolean leftButtonPressed; 177 | private Vector2f previousPos; 178 | private boolean rightButtonPressed; 179 | 180 | public MouseInput(long windowHandle) { 181 | previousPos = new Vector2f(-1, -1); 182 | currentPos = new Vector2f(); 183 | displVec = new Vector2f(); 184 | leftButtonPressed = false; 185 | rightButtonPressed = false; 186 | inWindow = false; 187 | 188 | glfwSetCursorPosCallback(windowHandle, (handle, xpos, ypos) -> { 189 | currentPos.x = (float) xpos; 190 | currentPos.y = (float) ypos; 191 | }); 192 | glfwSetCursorEnterCallback(windowHandle, (handle, entered) -> inWindow = entered); 193 | glfwSetMouseButtonCallback(windowHandle, (handle, button, action, mode) -> { 194 | leftButtonPressed = button == GLFW_MOUSE_BUTTON_1 && action == GLFW_PRESS; 195 | rightButtonPressed = button == GLFW_MOUSE_BUTTON_2 && action == GLFW_PRESS; 196 | }); 197 | } 198 | 199 | public Vector2f getCurrentPos() { 200 | return currentPos; 201 | } 202 | 203 | public Vector2f getDisplVec() { 204 | return displVec; 205 | } 206 | 207 | public void input() { 208 | displVec.x = 0; 209 | displVec.y = 0; 210 | if (previousPos.x > 0 && previousPos.y > 0 && inWindow) { 211 | double deltax = currentPos.x - previousPos.x; 212 | double deltay = currentPos.y - previousPos.y; 213 | boolean rotateX = deltax != 0; 214 | boolean rotateY = deltay != 0; 215 | if (rotateX) { 216 | displVec.y = (float) deltax; 217 | } 218 | if (rotateY) { 219 | displVec.x = (float) deltay; 220 | } 221 | } 222 | previousPos.x = currentPos.x; 223 | previousPos.y = currentPos.y; 224 | } 225 | 226 | public boolean isLeftButtonPressed() { 227 | return leftButtonPressed; 228 | } 229 | 230 | public boolean isRightButtonPressed() { 
231 |         return rightButtonPressed; 232 |     } 233 | } 234 | ``` 235 | 236 | The `MouseInput` class, in its constructor, registers a set of callbacks to process mouse events: 237 | 238 | * `glfwSetCursorPosCallback`: Registers a callback that will be invoked when the mouse is moved. 239 | * `glfwSetCursorEnterCallback`: Registers a callback that will be invoked when the mouse enters our window. We will be receiving mouse events even if the mouse is not in our window. We use this callback to track when the mouse is in our window. 240 | * `glfwSetMouseButtonCallback`: Registers a callback that will be invoked when a mouse button is pressed. 241 | 242 | The `MouseInput` class provides an `input` method which should be called when game input is processed. This method calculates the mouse displacement from the previous position and stores it into the `displVec` variable so it can be used by our game. 243 | 244 | The `MouseInput` class will be instantiated in our `Window` class, which will also provide a getter to return its instance. 245 | 246 | ```java 247 | public class Window { 248 |     ... 249 |     private MouseInput mouseInput; 250 |     ... 251 |     public Window(String title, WindowOptions opts, Callable<Void> resizeFunc) { 252 |         ... 253 |         mouseInput = new MouseInput(windowHandle); 254 |     } 255 |     ... 256 |     public MouseInput getMouseInput() { 257 |         return mouseInput; 258 |     } 259 |     ... 260 | } 261 | ``` 262 | 263 | In the `Engine` class we will consume mouse input when handling regular input: 264 | ```java 265 | public class Engine { 266 |     ... 267 |     private void run() { 268 |         ... 269 |         if (targetFps <= 0 || deltaFps >= 1) { 270 |             window.getMouseInput().input(); 271 |             appLogic.input(window, scene, now - initialTime); 272 |         } 273 |         ... 274 |     } 275 |     ... 276 | } 277 | ``` 278 | 279 | Now we can modify the vertex shader to use the `Camera`'s view matrix, which as you may guess will be passed as a uniform.
280 | 281 | ```glsl 282 | #version 330 283 | 284 | layout (location=0) in vec3 position; 285 | layout (location=1) in vec2 texCoord; 286 | 287 | out vec2 outTextCoord; 288 | 289 | uniform mat4 projectionMatrix; 290 | uniform mat4 viewMatrix; 291 | uniform mat4 modelMatrix; 292 | 293 | void main() 294 | { 295 |     gl_Position = projectionMatrix * viewMatrix * modelMatrix * vec4(position, 1.0); 296 |     outTextCoord = texCoord; 297 | } 298 | ``` 299 | 300 | So the next step is to properly create the uniform in the `SceneRender` class and update its value in each `render` call: 301 | 302 | ```java 303 | public class SceneRender { 304 |     ... 305 |     private void createUniforms() { 306 |         ... 307 |         uniformsMap.createUniform("viewMatrix"); 308 |         ... 309 |     } 310 |     ... 311 |     public void render(Scene scene) { 312 |         ... 313 |         uniformsMap.setUniform("projectionMatrix", scene.getProjection().getProjMatrix()); 314 |         uniformsMap.setUniform("viewMatrix", scene.getCamera().getViewMatrix()); 315 |         ... 316 |     } 317 | } 318 | ``` 319 | 320 | And that’s all: our base code supports the concept of a camera. Now we need to use it. We can change the way we handle the input and update the camera. We will set the following controls: 321 | 322 | * Keys “A” and “D” to move the camera to the left and right \(x axis\) respectively. 323 | * Keys “W” and “S” to move the camera forward and backwards \(z axis\) respectively. 324 | * The up and down arrow keys to move the camera up and down \(y axis\) respectively. 325 | 326 | We will use the mouse position to rotate the camera around the x and y axes when the right button of the mouse is pressed. 327 | 328 | Now we are ready to update our `Main` class to process the keyboard and mouse input. 329 | 330 | ```java 331 | 332 | public class Main implements IAppLogic { 333 | 334 |     private static final float MOUSE_SENSITIVITY = 0.1f; 335 |     private static final float MOVEMENT_SPEED = 0.005f; 336 |     ... 337 | 338 |     public static void main(String[] args) { 339 |         ...
340 | Engine gameEng = new Engine("chapter-08", new Window.WindowOptions(), main); 341 | ... 342 | } 343 | ... 344 | public void input(Window window, Scene scene, long diffTimeMillis) { 345 | float move = diffTimeMillis * MOVEMENT_SPEED; 346 | Camera camera = scene.getCamera(); 347 | if (window.isKeyPressed(GLFW_KEY_W)) { 348 | camera.moveForward(move); 349 | } else if (window.isKeyPressed(GLFW_KEY_S)) { 350 | camera.moveBackwards(move); 351 | } 352 | if (window.isKeyPressed(GLFW_KEY_A)) { 353 | camera.moveLeft(move); 354 | } else if (window.isKeyPressed(GLFW_KEY_D)) { 355 | camera.moveRight(move); 356 | } 357 | if (window.isKeyPressed(GLFW_KEY_UP)) { 358 | camera.moveUp(move); 359 | } else if (window.isKeyPressed(GLFW_KEY_DOWN)) { 360 | camera.moveDown(move); 361 | } 362 | 363 | MouseInput mouseInput = window.getMouseInput(); 364 | if (mouseInput.isRightButtonPressed()) { 365 | Vector2f displVec = mouseInput.getDisplVec(); 366 | camera.addRotation((float) Math.toRadians(-displVec.x * MOUSE_SENSITIVITY), 367 | (float) Math.toRadians(-displVec.y * MOUSE_SENSITIVITY)); 368 | } 369 | } 370 | ... 
371 |     } 372 | ``` 373 | 374 | [Next chapter](../chapter-09/chapter-09.md) 375 | 1 | # Chapter 09 - Loading more complex models: Assimp 2 | 3 | The capability of loading complex 3D models in different formats is crucial in order to write a game. The task of writing parsers for some of them would require lots of work; even just supporting a single format can be time consuming. Fortunately, the [Assimp](http://assimp.sourceforge.net/) library can already be used to parse many common 3D formats. It’s a C/C++ library which can load static and animated models in a variety of formats. LWJGL provides the bindings to use it from Java code. In this chapter, we will explain how it can be used.
4 | 5 | 6 | You can find the complete source code for this chapter [here](https://github.com/lwjglgamedev/lwjglbook/tree/main/chapter-09). 7 | 8 | ## Model loader 9 | 10 | The first thing to do is add the Assimp Maven dependencies to the project's `pom.xml`. We need to add both compile-time and runtime dependencies. 11 | 12 | ```xml 13 | <dependency> 14 |     <groupId>org.lwjgl</groupId> 15 |     <artifactId>lwjgl-assimp</artifactId> 16 |     <version>${lwjgl.version}</version> 17 | </dependency> 18 | <dependency> 19 |     <groupId>org.lwjgl</groupId> 20 |     <artifactId>lwjgl-assimp</artifactId> 21 |     <version>${lwjgl.version}</version> 22 |     <classifier>${native.target}</classifier> 23 |     <scope>runtime</scope> 24 | </dependency> 25 | ``` 26 | 27 | Once the dependencies have been set, we will create a new class named `ModelLoader` that will be used to load models with Assimp. The class defines two public static methods: 28 | 29 | ```java 30 | package org.lwjglb.engine.scene; 31 | 32 | import org.joml.Vector4f; 33 | import org.lwjgl.PointerBuffer; 34 | import org.lwjgl.assimp.*; 35 | import org.lwjgl.system.MemoryStack; 36 | import org.lwjglb.engine.graph.*; 37 | 38 | import java.io.File; 39 | import java.nio.IntBuffer; 40 | import java.util.*; 41 | 42 | import static org.lwjgl.assimp.Assimp.*; 43 | 44 | public class ModelLoader { 45 | 46 |     private ModelLoader() { 47 |         // Utility class 48 |     } 49 | 50 |     public static Model loadModel(String modelId, String modelPath, TextureCache textureCache) { 51 |         return loadModel(modelId, modelPath, textureCache, aiProcess_GenSmoothNormals | aiProcess_JoinIdenticalVertices | 52 |                 aiProcess_Triangulate | aiProcess_FixInfacingNormals | aiProcess_CalcTangentSpace | aiProcess_LimitBoneWeights | 53 |                 aiProcess_PreTransformVertices); 54 | 55 |     } 56 | 57 |     public static Model loadModel(String modelId, String modelPath, TextureCache textureCache, int flags) { 58 |         ... 59 | 60 |     } 61 |     ... 62 | } 63 | ``` 64 | 65 | Both methods have the following arguments: 66 | 67 | * `modelId`: A unique identifier for the model to be loaded. 68 | 69 | * `modelPath`: The path to the file where the model file is located.
This must be a regular file path, not a CLASSPATH-relative one, because Assimp may need to load additional files using the same base path as `modelPath` \(for instance, material files for Wavefront OBJ models\). If you embed your resources inside a JAR file, Assimp will not be able to import them, so it must be a file system path. When loading textures we will use `modelPath` to get the base directory where the model is located \(overriding whatever path is defined in the model\). We do this because some models contain absolute paths pointing to local folders of the machine where the model was created which, obviously, are not accessible. 70 | 71 | * `textureCache`: A reference to the texture cache to avoid loading the same texture multiple times. 72 | 73 | The second method has an extra argument named `flags`. This parameter allows us to tune the loading process. The first method invokes the second one, passing some values that are useful in most situations: 74 | 75 | * `aiProcess_JoinIdenticalVertices`: This flag reduces the number of vertices used by identifying those that can be reused between faces. 76 | 77 | * `aiProcess_Triangulate`: The model may use quads or other shapes to define its elements. Since we are only dealing with triangles, we must use this flag to split all the faces into triangles \(if needed\). 78 | 79 | * `aiProcess_FixInfacingNormals`: This flag tries to reverse normals that may point inwards. 80 | 81 | * `aiProcess_CalcTangentSpace`: We will use this parameter when implementing lights; it basically calculates tangents and bitangents using the normals information. 82 | 83 | * `aiProcess_LimitBoneWeights`: We will use this parameter when implementing animations; it basically limits the number of weights that affect a single vertex.
84 | 85 | * `aiProcess_PreTransformVertices`: This flag performs some transformations over the loaded data so that the model is placed at the origin and the coordinates are corrected to match the OpenGL coordinate system. If you have problems with models that appear rotated, make sure to use this flag. Important: do not use this flag if your model uses animations, since it will remove that information. 86 | 87 | There are many other flags that can be used; you can check them in the LWJGL or Assimp documentation. 88 | 89 | Let’s go back to the second method. The first thing we do is invoke the `aiImportFile` method to load the model with the selected flags. 90 | 91 | ```java 92 | public class ModelLoader { 93 | ... 94 | public static Model loadModel(String modelId, String modelPath, TextureCache textureCache, int flags) { 95 | File file = new File(modelPath); 96 | if (!file.exists()) { 97 | throw new RuntimeException("Model path does not exist [" + modelPath + "]"); 98 | } 99 | String modelDir = file.getParent(); 100 | 101 | AIScene aiScene = aiImportFile(modelPath, flags); 102 | if (aiScene == null) { 103 | throw new RuntimeException("Error loading model [modelPath: " + modelPath + "]"); 104 | } 105 | ... 106 | } 107 | ... 108 | } 109 | ``` 110 | 111 | The rest of the code of the method is as follows: 112 | 113 | ```java 114 | public class ModelLoader { 115 | ... 116 | public static Model loadModel(String modelId, String modelPath, TextureCache textureCache, int flags) { 117 | ...
118 | int numMaterials = aiScene.mNumMaterials(); 119 | List<Material> materialList = new ArrayList<>(); 120 | for (int i = 0; i < numMaterials; i++) { 121 | AIMaterial aiMaterial = AIMaterial.create(aiScene.mMaterials().get(i)); 122 | materialList.add(processMaterial(aiMaterial, modelDir, textureCache)); 123 | } 124 | 125 | int numMeshes = aiScene.mNumMeshes(); 126 | PointerBuffer aiMeshes = aiScene.mMeshes(); 127 | Material defaultMaterial = new Material(); 128 | for (int i = 0; i < numMeshes; i++) { 129 | AIMesh aiMesh = AIMesh.create(aiMeshes.get(i)); 130 | Mesh mesh = processMesh(aiMesh); 131 | int materialIdx = aiMesh.mMaterialIndex(); 132 | Material material; 133 | if (materialIdx >= 0 && materialIdx < materialList.size()) { 134 | material = materialList.get(materialIdx); 135 | } else { 136 | material = defaultMaterial; 137 | } 138 | material.getMeshList().add(mesh); 139 | } 140 | 141 | if (!defaultMaterial.getMeshList().isEmpty()) { 142 | materialList.add(defaultMaterial); 143 | } 144 | 145 | return new Model(modelId, materialList); 146 | } 147 | ... 148 | } 149 | ``` 150 | 151 | We first process the materials contained in the model. Materials define the colors and textures to be used by the meshes that compose the model. Then we process the different meshes. A model can define several meshes, and each of them can use one of the materials defined for the model. This is why we process the meshes after the materials and link them, to avoid repeating binding calls when rendering. 152 | 153 | If you examine the code above you may see that many of the calls to the Assimp library return `PointerBuffer` instances. You can think of them as C pointers: they just point to a memory region which contains data, and you need to know in advance the type of data that they hold in order to process it. In the case of materials, we iterate over that buffer creating instances of the `AIMaterial` class.
In the second case, we iterate over the buffer that holds mesh data creating instances of the `AIMesh` class. 154 | 155 | Let’s examine the `processMaterial` method. 156 | 157 | ```java 158 | public class ModelLoader { 159 | ... 160 | private static Material processMaterial(AIMaterial aiMaterial, String modelDir, TextureCache textureCache) { 161 | Material material = new Material(); 162 | try (MemoryStack stack = MemoryStack.stackPush()) { 163 | AIColor4D color = AIColor4D.create(); 164 | 165 | int result = aiGetMaterialColor(aiMaterial, AI_MATKEY_COLOR_DIFFUSE, aiTextureType_NONE, 0, 166 | color); 167 | if (result == aiReturn_SUCCESS) { 168 | material.setDiffuseColor(new Vector4f(color.r(), color.g(), color.b(), color.a())); 169 | } 170 | 171 | AIString aiTexturePath = AIString.calloc(stack); 172 | aiGetMaterialTexture(aiMaterial, aiTextureType_DIFFUSE, 0, aiTexturePath, (IntBuffer) null, 173 | null, null, null, null, null); 174 | String texturePath = aiTexturePath.dataString(); 175 | if (texturePath != null && texturePath.length() > 0) { 176 | material.setTexturePath(modelDir + File.separator + new File(texturePath).getName()); 177 | textureCache.createTexture(material.getTexturePath()); 178 | material.setDiffuseColor(Material.DEFAULT_COLOR); 179 | } 180 | 181 | return material; 182 | } 183 | } 184 | ... 185 | } 186 | ``` 187 | 188 | We first get the material color, in this case the diffuse color \(by using the `AI_MATKEY_COLOR_DIFFUSE` key\). There are many different types of colors which we will use when applying lights; for example, we have diffuse, ambient \(for ambient light\) and specular \(for the specular factor of lights\). After that, we check whether the material defines a texture or not. If so, that is, if there is a texture path, we store the texture path and delegate texture creation to the `TextureCache` class as in previous examples. In this case, if the material defines a texture we set the diffuse color to a default value, which is black.
By doing this we will be able to use both values, the diffuse color and the texture, without checking whether there is a texture or not. If the model does not define a texture, we will use a default black texture which can be combined with the material color. 189 | 190 | The `processMesh` method is defined like this. 191 | 192 | ```java 193 | public class ModelLoader { 194 | ... 195 | private static Mesh processMesh(AIMesh aiMesh) { 196 | float[] vertices = processVertices(aiMesh); 197 | float[] textCoords = processTextCoords(aiMesh); 198 | int[] indices = processIndices(aiMesh); 199 | 200 | // Texture coordinates may not have been populated. We need at least the empty slots 201 | if (textCoords.length == 0) { 202 | int numElements = (vertices.length / 3) * 2; 203 | textCoords = new float[numElements]; 204 | } 205 | 206 | return new Mesh(vertices, textCoords, indices); 207 | } 208 | ... 209 | } 210 | ``` 211 | 212 | A `Mesh` is defined by a set of vertex positions, texture coordinates and indices. Each of these elements is processed in the `processVertices`, `processTextCoords` and `processIndices` methods. After processing all that data, we check whether texture coordinates have been defined. If not, we just assign a set of texture coordinates with value 0.0f to ensure the consistency of the VAO. 213 | 214 | The `processXXX` methods are very simple; they just invoke the corresponding method over the `AIMesh` instance that returns the desired data and store it into an array: 215 | 216 | ```java 217 | public class ModelLoader { 218 | ...
219 | private static int[] processIndices(AIMesh aiMesh) { 220 | List<Integer> indices = new ArrayList<>(); 221 | int numFaces = aiMesh.mNumFaces(); 222 | AIFace.Buffer aiFaces = aiMesh.mFaces(); 223 | for (int i = 0; i < numFaces; i++) { 224 | AIFace aiFace = aiFaces.get(i); 225 | IntBuffer buffer = aiFace.mIndices(); 226 | while (buffer.remaining() > 0) { 227 | indices.add(buffer.get()); 228 | } 229 | } 230 | return indices.stream().mapToInt(Integer::intValue).toArray(); 231 | } 232 | ... 233 | private static float[] processTextCoords(AIMesh aiMesh) { 234 | AIVector3D.Buffer buffer = aiMesh.mTextureCoords(0); 235 | if (buffer == null) { 236 | return new float[]{}; 237 | } 238 | float[] data = new float[buffer.remaining() * 2]; 239 | int pos = 0; 240 | while (buffer.remaining() > 0) { 241 | AIVector3D textCoord = buffer.get(); 242 | data[pos++] = textCoord.x(); 243 | data[pos++] = 1 - textCoord.y(); 244 | } 245 | return data; 246 | } 247 | 248 | private static float[] processVertices(AIMesh aiMesh) { 249 | AIVector3D.Buffer buffer = aiMesh.mVertices(); 250 | float[] data = new float[buffer.remaining() * 3]; 251 | int pos = 0; 252 | while (buffer.remaining() > 0) { 253 | AIVector3D vertex = buffer.get(); 254 | data[pos++] = vertex.x(); 255 | data[pos++] = vertex.y(); 256 | data[pos++] = vertex.z(); 257 | } 258 | return data; 259 | } 260 | } 261 | ``` 262 | 263 | You can see that we get a buffer to the vertices by invoking the `mVertices` method. We simply process it to create an array of floats that contains the vertex positions. Since the method just returns a buffer, you could pass that information directly to the OpenGL methods that create the vertices. We do not do it that way for two reasons. The first one is to try to reduce as much as possible the modifications over the code base. The second one is that, by loading the data into an intermediate structure, you may be able to perform some post-processing tasks and even debug the loading process.
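The `processXXX` methods above all follow the same pattern: drain a buffer into a plain Java array. The pattern can be tried outside of Assimp with a plain `java.nio` buffer; the class and method names below are illustrative only and are not part of the book's code base:

```java
import java.nio.FloatBuffer;

public class BufferDrainDemo {

    // Mirrors the structure of processVertices: consume every element that
    // remains in the buffer and copy it into a flat float array.
    public static float[] drain(FloatBuffer buffer) {
        float[] data = new float[buffer.remaining()];
        int pos = 0;
        while (buffer.remaining() > 0) {
            data[pos++] = buffer.get();
        }
        return data;
    }

    public static void main(String[] args) {
        // Two (x, y, z) vertices flattened into a single buffer.
        FloatBuffer buffer = FloatBuffer.wrap(new float[]{1f, 2f, 3f, 4f, 5f, 6f});
        float[] vertices = drain(buffer);
        System.out.println(vertices.length); // 6
    }
}
```

Just as with the Assimp buffers, reading advances the buffer position, so a second call to `drain` on the same buffer would return an empty array.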
264 | 265 | If you want a sample of a much more efficient approach, that is, directly passing the buffers to OpenGL, you can check this [sample](https://github.com/LWJGL/lwjgl3-demos/blob/master/src/org/lwjgl/demo/opengl/assimp/WavefrontObjDemo.java). 266 | 267 | ## Using the models 268 | 269 | We need to modify the `Material` class to add support for the diffuse color: 270 | 271 | ```java 272 | public class Material { 273 | 274 | public static final Vector4f DEFAULT_COLOR = new Vector4f(0.0f, 0.0f, 0.0f, 1.0f); 275 | 276 | private Vector4f diffuseColor; 277 | ... 278 | public Material() { 279 | diffuseColor = DEFAULT_COLOR; 280 | ... 281 | } 282 | ... 283 | public Vector4f getDiffuseColor() { 284 | return diffuseColor; 285 | } 286 | ... 287 | public void setDiffuseColor(Vector4f diffuseColor) { 288 | this.diffuseColor = diffuseColor; 289 | } 290 | ... 291 | } 292 | ``` 293 | 294 | In the `SceneRender` class, we need to create, and properly set up while rendering, the material diffuse color uniform: 295 | ```java 296 | public class SceneRender { 297 | ... 298 | private void createUniforms() { 299 | ... 300 | uniformsMap.createUniform("material.diffuse"); 301 | } 302 | 303 | public void render(Scene scene) { 304 | ... 305 | for (Model model : models) { 306 | List<Entity> entities = model.getEntitiesList(); 307 | 308 | for (Material material : model.getMaterialList()) { 309 | uniformsMap.setUniform("material.diffuse", material.getDiffuseColor()); 310 | ... 311 | } 312 | } 313 | ... 314 | } 315 | ... 316 | } 317 | ``` 318 | 319 | As you can see, we are using an unusual name for the uniform, with a `.` in it. This is because we will use structures in the shader. With structures we can group several types into a single combined one.
You can see this in the fragment shader: 320 | 321 | ```glsl 322 | #version 330 323 | 324 | in vec2 outTextCoord; 325 | 326 | out vec4 fragColor; 327 | 328 | struct Material 329 | { 330 | vec4 diffuse; 331 | }; 332 | 333 | uniform sampler2D txtSampler; 334 | uniform Material material; 335 | 336 | void main() 337 | { 338 | fragColor = texture(txtSampler, outTextCoord) + material.diffuse; 339 | } 340 | ``` 341 | 342 | We will also need to add a new method to the `UniformsMap` class to support passing `Vector4f` values: 343 | ```java 344 | public class UniformsMap { 345 | ... 346 | public void setUniform(String uniformName, Vector4f value) { 347 | glUniform4f(getUniformLocation(uniformName), value.x, value.y, value.z, value.w); 348 | } 349 | } 350 | ``` 351 | 352 | Finally, we need to modify the `Main` class to use the `ModelLoader` class to load models: 353 | ```java 354 | public class Main implements IAppLogic { 355 | ... 356 | public static void main(String[] args) { 357 | ... 358 | Engine gameEng = new Engine("chapter-09", new Window.WindowOptions(), main); 359 | ... 360 | } 361 | ... 362 | public void init(Window window, Scene scene, Render render) { 363 | Model cubeModel = ModelLoader.loadModel("cube-model", "resources/models/cube/cube.obj", 364 | scene.getTextureCache()); 365 | scene.addModel(cubeModel); 366 | 367 | cubeEntity = new Entity("cube-entity", cubeModel.getId()); 368 | cubeEntity.setPosition(0, 0, -2); 369 | scene.addEntity(cubeEntity); 370 | } 371 | ... 372 | } 373 | ``` 374 | 375 | As you can see, the `init` method has been simplified a lot; there is no more model data embedded in the code. Now we are using a cube model in Wavefront format. You can find the model files in the `resources/models/cube` folder, which contains the following files: 376 | 377 | * `cube.obj`: The main model file.
It is in fact a text-based format, so you can open it and see how vertices, indices and texture coordinates are defined and glued together by defining faces. It also contains a reference to a material file. 378 | 379 | * `cube.mtl`: The material file, which defines colors and textures. 380 | 381 | * `cube.png`: The texture file of the model. 382 | 383 | Finally, we will add another feature to optimize the rendering. We will reduce the amount of data that is rendered by applying face culling. As you well know, a cube is made of six faces, yet we are rendering all six of them even when they are not visible. You can check this if you zoom inside a cube: you will see its interior. 384 | 385 | Faces that cannot be seen should be discarded immediately, and this is what face culling does. In fact, for a cube you can only see three faces at the same time, so we can discard half of the faces just by applying face culling \(this will only be valid if your game does not require you to dive into the inner side of a model\). 386 | 387 | For every triangle, face culling checks if it's facing towards us and discards the ones that are not. But how do we know if a triangle is facing towards us or not? The way OpenGL does this is by the winding order of the vertices that compose the triangle. 388 | 389 | Remember from the first chapters that we may define the vertices of a triangle in clockwise or counter-clockwise order. In OpenGL, by default, triangles whose vertices are defined in counter-clockwise order are facing towards the viewer and triangles in clockwise order are facing backwards. The key thing here is that this order is checked while rendering, taking into account the point of view. So a triangle that has been defined in counter-clockwise order can be interpreted, at rendering time, as being defined clockwise because of the point of view. 390 | 391 | We will enable face culling in the `Render` class: 392 | 393 | ```java 394 | public class Render { 395 | ...
396 | public Render() { 397 | ... 398 | glEnable(GL_CULL_FACE); 399 | glCullFace(GL_BACK); 400 | ... 401 | } 402 | ... 403 | } 404 | ``` 405 | 406 | The first line enables face culling and the second one states that faces that are facing backwards should be culled \(removed\). 407 | 408 | 409 | If you run the sample you will see the same result as in the previous chapter; however, if you zoom into the cube, the inner faces will not be rendered. You can modify this sample to load more complex models. 410 | 411 | [Next chapter](../chapter-10/chapter-10.md) 412 | -------------------------------------------------------------------------------- /chapter-10/demo.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/lwjglgamedev/lwjglbook-bookcontents/7846ef9d8f270675839c7004475d80db13ee5f63/chapter-10/demo.png -------------------------------------------------------------------------------- /chapter-11/diffuse_calc_i.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/lwjglgamedev/lwjglbook-bookcontents/7846ef9d8f270675839c7004475d80db13ee5f63/chapter-11/diffuse_calc_i.png -------------------------------------------------------------------------------- /chapter-11/diffuse_light.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/lwjglgamedev/lwjglbook-bookcontents/7846ef9d8f270675839c7004475d80db13ee5f63/chapter-11/diffuse_light.png -------------------------------------------------------------------------------- /chapter-11/diffuse_light_normals.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/lwjglgamedev/lwjglbook-bookcontents/7846ef9d8f270675839c7004475d80db13ee5f63/chapter-11/diffuse_light_normals.png --------------------------------------------------------------------------------
/chapter-11/directional_light.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/lwjglgamedev/lwjglbook-bookcontents/7846ef9d8f270675839c7004475d80db13ee5f63/chapter-11/directional_light.png -------------------------------------------------------------------------------- /chapter-11/dot_product.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/lwjglgamedev/lwjglbook-bookcontents/7846ef9d8f270675839c7004475d80db13ee5f63/chapter-11/dot_product.png -------------------------------------------------------------------------------- /chapter-11/light_controls.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/lwjglgamedev/lwjglbook-bookcontents/7846ef9d8f270675839c7004475d80db13ee5f63/chapter-11/light_controls.png -------------------------------------------------------------------------------- /chapter-11/light_reflection.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/lwjglgamedev/lwjglbook-bookcontents/7846ef9d8f270675839c7004475d80db13ee5f63/chapter-11/light_reflection.png -------------------------------------------------------------------------------- /chapter-11/light_types.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/lwjglgamedev/lwjglbook-bookcontents/7846ef9d8f270675839c7004475d80db13ee5f63/chapter-11/light_types.png -------------------------------------------------------------------------------- /chapter-11/normals.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/lwjglgamedev/lwjglbook-bookcontents/7846ef9d8f270675839c7004475d80db13ee5f63/chapter-11/normals.png -------------------------------------------------------------------------------- 
/chapter-11/polished_surface.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/lwjglgamedev/lwjglbook-bookcontents/7846ef9d8f270675839c7004475d80db13ee5f63/chapter-11/polished_surface.png -------------------------------------------------------------------------------- /chapter-11/specular_lightining.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/lwjglgamedev/lwjglbook-bookcontents/7846ef9d8f270675839c7004475d80db13ee5f63/chapter-11/specular_lightining.png -------------------------------------------------------------------------------- /chapter-11/specular_lightining_calc.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/lwjglgamedev/lwjglbook-bookcontents/7846ef9d8f270675839c7004475d80db13ee5f63/chapter-11/specular_lightining_calc.png -------------------------------------------------------------------------------- /chapter-11/spot_light.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/lwjglgamedev/lwjglbook-bookcontents/7846ef9d8f270675839c7004475d80db13ee5f63/chapter-11/spot_light.png -------------------------------------------------------------------------------- /chapter-11/spot_light_calc.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/lwjglgamedev/lwjglbook-bookcontents/7846ef9d8f270675839c7004475d80db13ee5f63/chapter-11/spot_light_calc.png -------------------------------------------------------------------------------- /chapter-11/spot_light_ii.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/lwjglgamedev/lwjglbook-bookcontents/7846ef9d8f270675839c7004475d80db13ee5f63/chapter-11/spot_light_ii.png 
-------------------------------------------------------------------------------- /chapter-11/sun_directional_light.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/lwjglgamedev/lwjglbook-bookcontents/7846ef9d8f270675839c7004475d80db13ee5f63/chapter-11/sun_directional_light.png -------------------------------------------------------------------------------- /chapter-11/surface.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/lwjglgamedev/lwjglbook-bookcontents/7846ef9d8f270675839c7004475d80db13ee5f63/chapter-11/surface.png -------------------------------------------------------------------------------- /chapter-11/vertex_normals.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/lwjglgamedev/lwjglbook-bookcontents/7846ef9d8f270675839c7004475d80db13ee5f63/chapter-11/vertex_normals.png -------------------------------------------------------------------------------- /chapter-12/chapter-12.md: -------------------------------------------------------------------------------- 1 | # Chapter 12 - Sky Box 2 | 3 | In this chapter we will see how to create a sky box. A skybox will allow us to set a background to give the illusion that our 3D world is bigger. That background is wrapped around the camera position and covers the whole space. The technique that we are going to use here is to construct a big cube that will be displayed around the 3D scene, that is, the centre of the camera position will be the centre of the cube. The sides of that cube will be wrapped with a texture with hills, a blue sky and clouds that will be mapped in a way that the image appears to be a continuous landscape. 4 | 5 | You can find the complete source code for this chapter [here](https://github.com/lwjglgamedev/lwjglbook/tree/main/chapter-12). 
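Before going into the implementation, it is worth previewing the core trick that keeps a skybox glued to the viewer, which is applied later in this chapter when the view matrix is loaded: the translation part of the view matrix is zeroed so that camera rotation still affects the cube but camera movement does not. The sketch below is a hypothetical helper, not part of the book's code base; it uses a raw column-major `float[16]` instead of JOML's `Matrix4f` just to make the affected components explicit:

```java
public class SkyBoxViewMatrix {

    // In a column-major 4x4 matrix (the layout JOML uses), the translation
    // components occupy indices 12, 13 and 14 (JOML's m30, m31 and m32).
    // Zeroing them keeps the rotation but cancels the camera displacement,
    // so the skybox cube always stays centered on the viewer.
    public static float[] stripTranslation(float[] viewMatrix) {
        float[] result = viewMatrix.clone();
        result[12] = 0.0f;
        result[13] = 0.0f;
        result[14] = 0.0f;
        return result;
    }

    public static void main(String[] args) {
        // Identity rotation plus a translation of (5, 2, -3).
        float[] view = {
                1, 0, 0, 0,
                0, 1, 0, 0,
                0, 0, 1, 0,
                5, 2, -3, 1
        };
        float[] fixed = stripTranslation(view);
        System.out.println(fixed[12] + " " + fixed[13] + " " + fixed[14]); // 0.0 0.0 0.0
    }
}
```

With JOML, the same effect is achieved by calling `m30(0)`, `m31(0)` and `m32(0)` on a copy of the view matrix, as the chapter's `SkyBoxRender` code does.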
6 | 7 | ## Sky Box 8 | 9 | The following picture depicts the skybox concept. 10 | 11 | ![Sky Box](skybox.png) 12 | 13 | The process of creating a sky box can be summarized in the following steps: 14 | 15 | * Create a big cube. 16 | * Apply a texture to it that provides the illusion that we are seeing a giant landscape with no edges. 17 | * Render the cube so its sides are at a far distance and its origin is located at the centre of the camera. 18 | 19 | We will start by creating a new class named `SkyBox` with a constructor that receives the path to the 3D model which contains the sky box cube (with its texture) and a reference to the texture cache. This class will load that model and will create an `Entity` instance associated to that model. The definition of the `SkyBox` class is as follows. 20 | 21 | ```java 22 | package org.lwjglb.engine.scene; 23 | 24 | import org.lwjglb.engine.graph.*; 25 | 26 | public class SkyBox { 27 | 28 | private Entity skyBoxEntity; 29 | private Model skyBoxModel; 30 | 31 | public SkyBox(String skyBoxModelPath, TextureCache textureCache) { 32 | skyBoxModel = ModelLoader.loadModel("skybox-model", skyBoxModelPath, textureCache); 33 | skyBoxEntity = new Entity("skyBoxEntity-entity", skyBoxModel.getId()); 34 | } 35 | 36 | public Entity getSkyBoxEntity() { 37 | return skyBoxEntity; 38 | } 39 | 40 | public Model getSkyBoxModel() { 41 | return skyBoxModel; 42 | } 43 | } 44 | ``` 45 | 46 | We will store a reference to the `SkyBox` class in the `Scene` class: 47 | 48 | ```java 49 | public class Scene { 50 | ... 51 | private SkyBox skyBox; 52 | ... 53 | public SkyBox getSkyBox() { 54 | return skyBox; 55 | } 56 | ... 57 | public void setSkyBox(SkyBox skyBox) { 58 | this.skyBox = skyBox; 59 | } 60 | ... 61 | } 62 | ``` 63 | 64 | The next step is to create another set of vertex and fragment shaders for the sky box. But, why not reuse the scene shaders that we already have? 
The answer is that the shaders we need here are actually a simplified version of the scene shaders; for example, we will not be applying lights to the sky box. Below you can see the sky box vertex shader (`skybox.vert`). 65 | 66 | ```glsl 67 | #version 330 68 | 69 | layout (location=0) in vec3 position; 70 | layout (location=1) in vec3 normal; 71 | layout (location=2) in vec2 texCoord; 72 | 73 | out vec2 outTextCoord; 74 | 75 | uniform mat4 projectionMatrix; 76 | uniform mat4 viewMatrix; 77 | uniform mat4 modelMatrix; 78 | 79 | void main() 80 | { 81 | gl_Position = projectionMatrix * viewMatrix * modelMatrix * vec4(position, 1.0); 82 | outTextCoord = texCoord; 83 | } 84 | ``` 85 | 86 | You can see that we still use the model matrix: since we will scale the skybox, we need it. You may see other implementations that increase the size of the cube that models the sky box at start-up time and therefore do not need to multiply by the model matrix. We have chosen this approach because it’s more flexible and allows us to change the size of the skybox at runtime, but you can easily switch to the other approach if you want. 87 | 88 | The fragment shader (`skybox.frag`) is also very simple: we just get the color from a texture or from a diffuse color. 89 | 90 | ```glsl 91 | #version 330 92 | 93 | in vec2 outTextCoord; 94 | out vec4 fragColor; 95 | 96 | uniform vec4 diffuse; 97 | uniform sampler2D txtSampler; 98 | uniform int hasTexture; 99 | 100 | void main() 101 | { 102 | if (hasTexture == 1) { 103 | fragColor = texture(txtSampler, outTextCoord); 104 | } else { 105 | fragColor = diffuse; 106 | } 107 | } 108 | ``` 109 | 110 | We will create a new class named `SkyBoxRender` to use those shaders and perform the rendering. The class starts by creating the shader program and setting up the required uniforms.
111 | 112 | ```java 113 | package org.lwjglb.engine.graph; 114 | 115 | import org.joml.Matrix4f; 116 | import org.lwjglb.engine.scene.*; 117 | 118 | import java.util.*; 119 | 120 | import static org.lwjgl.opengl.GL20.*; 121 | import static org.lwjgl.opengl.GL30.glBindVertexArray; 122 | 123 | public class SkyBoxRender { 124 | 125 | private ShaderProgram shaderProgram; 126 | 127 | private UniformsMap uniformsMap; 128 | 129 | private Matrix4f viewMatrix; 130 | 131 | public SkyBoxRender() { 132 | List<ShaderProgram.ShaderModuleData> shaderModuleDataList = new ArrayList<>(); 133 | shaderModuleDataList.add(new ShaderProgram.ShaderModuleData("resources/shaders/skybox.vert", GL_VERTEX_SHADER)); 134 | shaderModuleDataList.add(new ShaderProgram.ShaderModuleData("resources/shaders/skybox.frag", GL_FRAGMENT_SHADER)); 135 | shaderProgram = new ShaderProgram(shaderModuleDataList); 136 | viewMatrix = new Matrix4f(); 137 | createUniforms(); 138 | } 139 | ... 140 | } 141 | ``` 142 | 143 | The `createUniforms` method is defined like this: 144 | 145 | ```java 146 | public class SkyBoxRender { 147 | ... 148 | private void createUniforms() { 149 | uniformsMap = new UniformsMap(shaderProgram.getProgramId()); 150 | uniformsMap.createUniform("projectionMatrix"); 151 | uniformsMap.createUniform("viewMatrix"); 152 | uniformsMap.createUniform("modelMatrix"); 153 | uniformsMap.createUniform("diffuse"); 154 | uniformsMap.createUniform("txtSampler"); 155 | uniformsMap.createUniform("hasTexture"); 156 | } 157 | ... 158 | } 159 | ``` 160 | 161 | We just create the uniforms that hold the data we will need while rendering. The next step is to create a new render method for the skybox that will be invoked from the global render method. 162 | 163 | ```java 164 | public class SkyBoxRender { 165 | ...
166 | public void render(Scene scene) { 167 | SkyBox skyBox = scene.getSkyBox(); 168 | if (skyBox == null) { 169 | return; 170 | } 171 | shaderProgram.bind(); 172 | 173 | uniformsMap.setUniform("projectionMatrix", scene.getProjection().getProjMatrix()); 174 | viewMatrix.set(scene.getCamera().getViewMatrix()); 175 | viewMatrix.m30(0); 176 | viewMatrix.m31(0); 177 | viewMatrix.m32(0); 178 | uniformsMap.setUniform("viewMatrix", viewMatrix); 179 | uniformsMap.setUniform("txtSampler", 0); 180 | 181 | Model skyBoxModel = skyBox.getSkyBoxModel(); 182 | Entity skyBoxEntity = skyBox.getSkyBoxEntity(); 183 | TextureCache textureCache = scene.getTextureCache(); 184 | for (Material material : skyBoxModel.getMaterialList()) { 185 | Texture texture = textureCache.getTexture(material.getTexturePath()); 186 | glActiveTexture(GL_TEXTURE0); 187 | texture.bind(); 188 | 189 | uniformsMap.setUniform("diffuse", material.getDiffuseColor()); 190 | uniformsMap.setUniform("hasTexture", texture.getTexturePath().equals(TextureCache.DEFAULT_TEXTURE) ? 0 : 1); 191 | 192 | for (Mesh mesh : material.getMeshList()) { 193 | glBindVertexArray(mesh.getVaoId()); 194 | 195 | uniformsMap.setUniform("modelMatrix", skyBoxEntity.getModelMatrix()); 196 | glDrawElements(GL_TRIANGLES, mesh.getNumVertices(), GL_UNSIGNED_INT, 0); 197 | } 198 | } 199 | 200 | glBindVertexArray(0); 201 | 202 | shaderProgram.unbind(); 203 | } 204 | } 205 | ``` 206 | 207 | You will see that we modify the view matrix prior to loading it into the associated uniform. Remember that when we move the camera, what we are actually doing is moving the whole world. So, if we just multiplied by the view matrix as it is, the skybox would be displaced when the camera moves. But we do not want this: we want it to stay fixed at the origin \(0, 0, 0\). This is achieved by setting to 0 the parts of the view matrix that contain the translation increments \(the `m30`, `m31` and `m32` components\).
You may think that you could avoid using the view matrix at all since the sky box must be fixed at the origin. In that case, you would see that the skybox does not rotate with the camera, which is not what we want. We need it to rotate but not translate. To render the skybox, we just set up the uniforms and render the cube associated to the sky box. 208 | 209 | Finally, we define a `cleanup` method to properly free resources: 210 | 211 | ```java 212 | public class SkyBoxRender { 213 | ... 214 | public void cleanup() { 215 | shaderProgram.cleanup(); 216 | } 217 | ... 218 | } 219 | ``` 220 | 221 | In the `Render` class we just need to instantiate the `SkyBoxRender` class and invoke the render method: 222 | 223 | ```java 224 | public class Render { 225 | ... 226 | private SkyBoxRender skyBoxRender; 227 | ... 228 | 229 | public Render(Window window) { 230 | ... 231 | skyBoxRender = new SkyBoxRender(); 232 | } 233 | 234 | public void render(Window window, Scene scene) { 235 | glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT); 236 | glViewport(0, 0, window.getWidth(), window.getHeight()); 237 | 238 | skyBoxRender.render(scene); 239 | sceneRender.render(scene); 240 | guiRender.render(scene); 241 | } 242 | ... 243 | } 244 | ``` 245 | 246 | You can see that we render the sky box first. This is due to the fact that if we have 3D models in the scene with transparencies we want them to be blended with the skybox (not with a black background). 247 | 248 | Finally, in the `Main` class we just set up the sky box in the scene and create a set of tiles to give the illusion of an infinite terrain. We set up a chunk of tiles that move along with the camera position to always be shown. 249 | 250 | ```java 251 | public class Main implements IAppLogic { 252 | ... 253 | private static final int NUM_CHUNKS = 4; 254 | 255 | private Entity[][] terrainEntities; 256 | 257 | public static void main(String[] args) { 258 | ... 
259 | Engine gameEng = new Engine("chapter-12", new Window.WindowOptions(), main); 260 | ... 261 | } 262 | ... 263 | 264 | @Override 265 | public void init(Window window, Scene scene, Render render) { 266 | String quadModelId = "quad-model"; 267 | Model quadModel = ModelLoader.loadModel("quad-model", "resources/models/quad/quad.obj", 268 | scene.getTextureCache()); 269 | scene.addModel(quadModel); 270 | 271 | int numRows = NUM_CHUNKS * 2 + 1; 272 | int numCols = numRows; 273 | terrainEntities = new Entity[numRows][numCols]; 274 | for (int j = 0; j < numRows; j++) { 275 | for (int i = 0; i < numCols; i++) { 276 | Entity entity = new Entity("TERRAIN_" + j + "_" + i, quadModelId); 277 | terrainEntities[j][i] = entity; 278 | scene.addEntity(entity); 279 | } 280 | } 281 | 282 | SceneLights sceneLights = new SceneLights(); 283 | sceneLights.getAmbientLight().setIntensity(0.2f); 284 | scene.setSceneLights(sceneLights); 285 | 286 | SkyBox skyBox = new SkyBox("resources/models/skybox/skybox.obj", scene.getTextureCache()); 287 | skyBox.getSkyBoxEntity().setScale(50); 288 | scene.setSkyBox(skyBox); 289 | 290 | scene.getCamera().moveUp(0.1f); 291 | 292 | updateTerrain(scene); 293 | } 294 | 295 | @Override 296 | public void input(Window window, Scene scene, long diffTimeMillis, boolean inputConsumed) { 297 | float move = diffTimeMillis * MOVEMENT_SPEED; 298 | Camera camera = scene.getCamera(); 299 | if (window.isKeyPressed(GLFW_KEY_W)) { 300 | camera.moveForward(move); 301 | } else if (window.isKeyPressed(GLFW_KEY_S)) { 302 | camera.moveBackwards(move); 303 | } 304 | if (window.isKeyPressed(GLFW_KEY_A)) { 305 | camera.moveLeft(move); 306 | } else if (window.isKeyPressed(GLFW_KEY_D)) { 307 | camera.moveRight(move); 308 | } 309 | 310 | MouseInput mouseInput = window.getMouseInput(); 311 | if (mouseInput.isRightButtonPressed()) { 312 | Vector2f displVec = mouseInput.getDisplVec(); 313 | camera.addRotation((float) Math.toRadians(-displVec.x * MOUSE_SENSITIVITY), (float) 
Math.toRadians(-displVec.y * MOUSE_SENSITIVITY));
        }
    }

    @Override
    public void update(Window window, Scene scene, long diffTimeMillis) {
        updateTerrain(scene);
    }

    public void updateTerrain(Scene scene) {
        int cellSize = 10;
        Camera camera = scene.getCamera();
        Vector3f cameraPos = camera.getPosition();
        int cellCol = (int) (cameraPos.x / cellSize);
        int cellRow = (int) (cameraPos.z / cellSize);

        int numRows = NUM_CHUNKS * 2 + 1;
        int numCols = numRows;
        int zOffset = -NUM_CHUNKS;
        float scale = cellSize / 2.0f;
        for (int j = 0; j < numRows; j++) {
            int xOffset = -NUM_CHUNKS;
            for (int i = 0; i < numCols; i++) {
                Entity entity = terrainEntities[j][i];
                entity.setScale(scale);
                entity.setPosition((cellCol + xOffset) * 2.0f, 0, (cellRow + zOffset) * 2.0f);
                entity.getModelMatrix().identity().scale(scale).translate(entity.getPosition());
                xOffset++;
            }
            zOffset++;
        }
    }
}
```

[Next chapter](../chapter-13/chapter-13.md)
--------------------------------------------------------------------------------
/chapter-12/skybox.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/lwjglgamedev/lwjglbook-bookcontents/7846ef9d8f270675839c7004475d80db13ee5f63/chapter-12/skybox.png
--------------------------------------------------------------------------------
/chapter-13/chapter-13.md:
--------------------------------------------------------------------------------
# Chapter 13 - Fog

In this chapter we will review how to create a fog effect in our game engine. With that effect, we will simulate how distant objects get dimmed and seem to vanish into a dense fog.

You can find the complete source code for this chapter [here](https://github.com/lwjglgamedev/lwjglbook/tree/main/chapter-13).
## Concepts

Let us first examine the attributes that define fog. The first one is the fog color. In the real world fog tends to have a gray color, but we can use this effect to simulate wide areas invaded by fog of different colors. The second attribute is the fog's density.

Thus, in order to apply the fog effect, we need to find a way to fade our 3D scene objects into the fog color as they get farther away from the camera. Objects that are close to the camera will not be affected by the fog, but objects that are far away will not be distinguishable. So we need to be able to calculate a factor that can be used to blend the fog color and each fragment color in order to simulate that effect. That factor will need to be dependent on the distance to the camera.

Let’s call that factor $$fogFactor$$, and set its range from 0 to 1. When the $$fogFactor$$ is 1, it means that the object will not be affected by fog, that is, it’s a nearby object. When the $$fogFactor$$ takes the 0 value, it means that the object will be completely hidden in the fog.

Therefore, the equation needed to calculate the fog color is:

$$finalColor = (1 - fogFactor) \cdot fogColor + fogFactor \cdot fragmentColor$$

* $$finalColor$$ is the color that results from applying the fog effect.
* $$fogFactor$$ is the parameter that controls how the fog color and the fragment color are blended. It basically controls the object's visibility.
* $$fogColor$$ is the color of the fog.
* $$fragmentColor$$ is the color of the fragment without any fog effect applied to it.

Now we need to find a way to calculate $$fogFactor$$ depending on the distance. We can choose different models, and the first one could be a linear model: one that, given a distance, changes the $$fogFactor$$ value in a linear way.
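The blending equation can be checked numerically before touching any shader code. A minimal stand-alone Java sketch (the class name and the color values are made up for illustration; this is not engine code):

```java
public class FogBlendDemo {

    // Blend a fragment color with the fog color, per component.
    // fogFactor = 1 means no fog, fogFactor = 0 means fully fogged.
    static float[] blend(float[] fogColor, float[] fragColor, float fogFactor) {
        float[] out = new float[3];
        for (int i = 0; i < 3; i++) {
            out[i] = (1 - fogFactor) * fogColor[i] + fogFactor * fragColor[i];
        }
        return out;
    }

    public static void main(String[] args) {
        float[] gray = {0.5f, 0.5f, 0.5f}; // fog color
        float[] red  = {1.0f, 0.0f, 0.0f}; // fragment color
        // Halfway into the fog the red fades toward gray:
        float[] half = blend(gray, red, 0.5f);
        System.out.println(half[0] + " " + half[1] + " " + half[2]); // 0.75 0.25 0.25
    }
}
```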
The linear model can be defined by the following parameters:

* $$fogStart$$: The distance at which the fog effect starts to be applied.
* $$fogFinish$$: The distance at which the fog effect reaches its maximum value.
* $$distance$$: Distance to the camera.

With those parameters, the equation to be applied is:

$$\displaystyle fogFactor = \frac{(fogFinish - distance)}{(fogFinish - fogStart)}$$

For objects at a distance lower than $$fogStart$$ we simply set the $$fogFactor$$ to $$1$$. The following graph shows how the $$fogFactor$$ changes with the distance.

![Linear model](linear_model.png)

The linear model is easy to calculate but it is not very realistic and it does not take into consideration the fog density. In reality, fog tends to grow in a smoother way. So the next suitable model is an exponential one. The equation for that model is as follows:

$$\displaystyle fogFactor = e^{-(distance \cdot fogDensity)^{exponent}} = \frac{1}{e^{(distance \cdot fogDensity)^{exponent}}}$$

The new variables that come into play are:

* $$fogDensity$$ which models the thickness or density of the fog.
* $$exponent$$ which is used to control how fast the fog increases with distance.

The following picture shows two graphs for the equation above for different values of the exponent \($$2$$ for the blue line and $$4$$ for the red one\).

![Exponential model](exponential_model.png)

In our code, we will use a formula that sets a value of two for the exponent \(you can easily modify the example to use different values\).

## Implementation

Now that the theory has been explained we can put it into practice. We will implement the effect in the scene fragment shader (`scene.frag`), since we have all the variables we need there. We will start by defining a struct that models the fog attributes.

```glsl
...
struct Fog
{
    int activeFog;
    vec3 color;
    float density;
};
...
```

The `activeFog` attribute will be used to activate or deactivate the fog effect. The fog data will be passed to the shader through another uniform named `fog`.

```glsl
...
uniform Fog fog;
...
```

We will create a function named `calcFog`, which is defined like this:

```glsl
...
vec4 calcFog(vec3 pos, vec4 color, Fog fog, vec3 ambientLight, DirLight dirLight) {
    vec3 fogColor = fog.color * (ambientLight + dirLight.color * dirLight.intensity);
    float distance = length(pos);
    float fogFactor = 1.0 / exp((distance * fog.density) * (distance * fog.density));
    fogFactor = clamp(fogFactor, 0.0, 1.0);

    vec3 resultColor = mix(fogColor, color.xyz, fogFactor);
    return vec4(resultColor.xyz, color.w);
}
...
```

As you can see, we first calculate the distance to the fragment. The view-space coordinates are available in the `pos` variable, so we just need to calculate their length. Then we calculate the fog factor using the exponential model with an exponent of two \(which is equivalent to multiplying the term by itself\). We clamp the `fogFactor` to a range between $$0$$ and $$1$$ and use the `mix` function. In GLSL, the `mix` function blends the fog color and the fragment color \(defined by the variable `color`\). It is equivalent to applying this equation:

$$resultColor = (1 - fogFactor) \cdot fog.color + fogFactor \cdot color$$

We also preserve the w component, the transparency, of the original color. We don't want this component to be affected, as the fragment should maintain its transparency level.

At the end of the fragment shader, after applying all the light effects, we simply assign the returned value to the fragment color if the fog is active.

```glsl
...
if (fog.activeFog == 1) {
    fragColor = calcFog(outPosition, fragColor, fog, ambientLight.color, dirLight);
}
...
```

We will also create a new class named `Fog`, another POJO \(Plain Old Java Object\), which contains the fog attributes.

```java
package org.lwjglb.engine.scene;

import org.joml.Vector3f;

public class Fog {

    private boolean active;
    private Vector3f color;
    private float density;

    public Fog() {
        active = false;
        color = new Vector3f();
    }

    public Fog(boolean active, Vector3f color, float density) {
        this.color = color;
        this.density = density;
        this.active = active;
    }

    public Vector3f getColor() {
        return color;
    }

    public float getDensity() {
        return density;
    }

    public boolean isActive() {
        return active;
    }

    public void setActive(boolean active) {
        this.active = active;
    }

    public void setColor(Vector3f color) {
        this.color = color;
    }

    public void setDensity(float density) {
        this.density = density;
    }
}
```

We will add a `Fog` instance in the `Scene` class.

```java
public class Scene {
    ...
    private Fog fog;
    ...
    public Scene(int width, int height) {
        ...
        fog = new Fog();
    }
    ...
    public Fog getFog() {
        return fog;
    }
    ...
    public void setFog(Fog fog) {
        this.fog = fog;
    }
    ...
}
```

Now we need to set up all these elements in the `SceneRender` class. We start by creating the uniforms for the `Fog` structure:

```java
public class SceneRender {
    ...
    private void createUniforms() {
        ...
        uniformsMap.createUniform("fog.activeFog");
        uniformsMap.createUniform("fog.color");
        uniformsMap.createUniform("fog.density");
    }
    ...
}
```

In the `render` method we first need to enable blending and then populate the `Fog` uniform:

```java
public class SceneRender {
    ...
    public void render(Scene scene) {
        glEnable(GL_BLEND);
        glBlendEquation(GL_FUNC_ADD);
        glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
        shaderProgram.bind();
        ...
        Fog fog = scene.getFog();
        uniformsMap.setUniform("fog.activeFog", fog.isActive() ? 1 : 0);
        uniformsMap.setUniform("fog.color", fog.getColor());
        uniformsMap.setUniform("fog.density", fog.getDensity());
        ...
        shaderProgram.unbind();
        glDisable(GL_BLEND);
    }
    ...
}
```

Finally, we will modify the `Main` class to set up the fog and use a single quad, scaled up to act as the terrain, to show the effect of the fog.

```java
public class Main implements IAppLogic {
    ...
    public static void main(String[] args) {
        ...
        Engine gameEng = new Engine("chapter-13", new Window.WindowOptions(), main);
        ...
    }
    ...
    public void init(Window window, Scene scene, Render render) {
        String terrainModelId = "terrain";
        Model terrainModel = ModelLoader.loadModel(terrainModelId, "resources/models/terrain/terrain.obj",
                scene.getTextureCache());
        scene.addModel(terrainModel);
        Entity terrainEntity = new Entity("terrainEntity", terrainModelId);
        terrainEntity.setScale(100.0f);
        terrainEntity.updateModelMatrix();
        scene.addEntity(terrainEntity);

        SceneLights sceneLights = new SceneLights();
        AmbientLight ambientLight = sceneLights.getAmbientLight();
        ambientLight.setIntensity(0.5f);
        ambientLight.setColor(0.3f, 0.3f, 0.3f);

        DirLight dirLight = sceneLights.getDirLight();
        dirLight.setPosition(0, 1, 0);
        dirLight.setIntensity(1.0f);
        scene.setSceneLights(sceneLights);

        SkyBox skyBox = new SkyBox("resources/models/skybox/skybox.obj", scene.getTextureCache());
        skyBox.getSkyBoxEntity().setScale(50);
        scene.setSkyBox(skyBox);

        scene.setFog(new Fog(true, new Vector3f(0.5f, 0.5f, 0.5f), 0.95f));

        scene.getCamera().moveUp(0.1f);
    }
    ...
    public void update(Window window, Scene scene, long diffTimeMillis) {
        // Nothing to be done here
    }
}
```

One important thing to highlight is that we must choose the fog color wisely. This is even more important when there is no skybox but a fixed-color background: in that case we should set the fog color to be exactly equal to the clear color. If you uncomment the code that renders the skybox and rerun the example you will get something like this.
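Before looking at the result, you can get a feel for the `0.95` density value passed to the `Fog` constructor by evaluating the exponent-two falloff on the CPU. A small stand-alone sketch (illustrative only, not engine code):

```java
public class FogFactorDemo {

    // Exponential fog with an exponent of two, mirroring the math in calcFog:
    // fogFactor = exp(-(distance * density)^2). For distance >= 0 the result
    // is already in [0, 1], so no clamping is needed here.
    static float fogFactor(float distance, float density) {
        float x = distance * density;
        return (float) Math.exp(-(x * x));
    }

    public static void main(String[] args) {
        float density = 0.95f;
        // Round to three decimals for display (avoids locale issues with printf).
        System.out.println(Math.round(fogFactor(0.0f, density) * 1000) / 1000.0); // 1.0
        System.out.println(Math.round(fogFactor(1.0f, density) * 1000) / 1000.0); // 0.406
        System.out.println(Math.round(fogFactor(4.0f, density) * 1000) / 1000.0); // 0.0
    }
}
```

With this density the fog bites very quickly: a fragment one unit away already keeps only about 40% of its own color, which is why such a thick value works with the heavily scaled terrain quad above.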
You should be able to see something like this:

![Fog](fog.png)

[Next chapter](../chapter-14/chapter-14.md)
--------------------------------------------------------------------------------
/chapter-13/exponential_model.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/lwjglgamedev/lwjglbook-bookcontents/7846ef9d8f270675839c7004475d80db13ee5f63/chapter-13/exponential_model.png
--------------------------------------------------------------------------------
/chapter-13/fog.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/lwjglgamedev/lwjglbook-bookcontents/7846ef9d8f270675839c7004475d80db13ee5f63/chapter-13/fog.png
--------------------------------------------------------------------------------
/chapter-13/linear_model.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/lwjglgamedev/lwjglbook-bookcontents/7846ef9d8f270675839c7004475d80db13ee5f63/chapter-13/linear_model.png
--------------------------------------------------------------------------------
/chapter-14/chapter-14.md:
--------------------------------------------------------------------------------
# Chapter 14 - Normal Mapping

In this chapter, we will explain a technique that will dramatically improve how our 3D models look. By now, we are able to apply textures to complex 3D models, but we are still far away from what real objects look like. Surfaces in the real world are not perfectly flat; they have imperfections that our 3D models currently do not have.

In order to render more realistic scenes, we are going to use normal maps. If you look at a flat surface in the real world you will see that those imperfections can be seen even at a distance by the way that the light reflects on it.
In a 3D scene, a flat surface has no such imperfections: we can apply a texture to it, but we cannot change the way that light reflects on it. That is what makes the difference.

We may think of increasing the detail of our models by increasing the number of triangles to reflect those imperfections, but performance would degrade. What we need is a way to change how light reflects on surfaces to increase the realism. This is achieved with the normal mapping technique.

You can find the complete source code for this chapter [here](https://github.com/lwjglgamedev/lwjglbook/tree/main/chapter-14).

## Concepts

Let’s go back to the flat surface example: a plane can be defined by two triangles which form a quad. If you remember from the lighting chapters, the element that models how light reflects is the surface normal. In this case, we have a single normal for the whole surface, and each fragment of the surface uses that same normal when calculating how light affects it. This is shown in the next figure.

![Surface Normals](surface_normals.png)

If we could change the normal for each fragment of the surface we could model surface imperfections and render them in a more realistic way. This is shown in the next figure.

![Fragment Normals](fragment_normals.png)

The way we are going to achieve this is by loading another texture that stores the normals for the surface. Each pixel of the normal texture will contain the values of the $$x$$, $$y$$ and $$z$$ coordinates of the normal stored as an RGB value.

Let’s use the following texture to draw a quad.

![Texture](rock.png)

An example of a normal map texture for the image above may be the following.

![Normal map texture](rock_normals.png)

As you can see, it's as if we had applied a color transformation to the original texture. Each pixel stores normal information using color components.
One thing that you will usually notice when viewing normal maps is that the dominant colors tend toward blue. This is due to the fact that normals point towards the positive $$z$$ axis. For flat surfaces, the $$z$$ component will usually have a much higher value than the $$x$$ and $$y$$ ones, as the normal points out of the surface. Since the $$x$$, $$y$$, $$z$$ coordinates are mapped to RGB, the blue component will also have a higher value.

So, to render an object using normal maps we just need an extra texture and sample it while rendering fragments to get the appropriate normal value.

## Implementation

Usually, normal maps are not defined in that way; they are usually defined in the so-called tangent space. The tangent space is a coordinate system that is local to each triangle of the model. In that coordinate space the $$z$$ axis always points out of the surface. This is the reason why a normal map is usually bluish, even for complex models with opposing faces. In order to work with tangent space, we need normal, tangent and bitangent vectors. We already have the normal vector; the tangent and bitangent vectors are perpendicular to it. We need these vectors to calculate the `TBN` matrix, which allows us to transform data expressed in tangent space into the coordinate system we are using in our shaders.

You can check a great tutorial on this aspect [here](https://learnopengl.com/Advanced-Lighting/Normal-Mapping).

Therefore, the first step is to add support for loading normal maps in the `ModelLoader` class, including tangent and bitangent information. If you recall, when setting the model loading flags for assimp, we included this one: `aiProcess_CalcTangentSpace`. This flag makes assimp automatically calculate tangent and bitangent data.

In the `processMaterial` method we will first query for the presence of a normal map texture.
If so, we load that texture and associate that texture path to the material: 44 | 45 | ```java 46 | public class ModelLoader { 47 | ... 48 | private static Material processMaterial(AIMaterial aiMaterial, String modelDir, TextureCache textureCache) { 49 | ... 50 | try (MemoryStack stack = MemoryStack.stackPush()) { 51 | ... 52 | AIString aiNormalMapPath = AIString.calloc(stack); 53 | Assimp.aiGetMaterialTexture(aiMaterial, aiTextureType_NORMALS, 0, aiNormalMapPath, (IntBuffer) null, 54 | null, null, null, null, null); 55 | String normalMapPath = aiNormalMapPath.dataString(); 56 | if (normalMapPath != null && normalMapPath.length() > 0) { 57 | material.setNormalMapPath(modelDir + File.separator + new File(normalMapPath).getName()); 58 | textureCache.createTexture(material.getNormalMapPath()); 59 | } 60 | return material; 61 | } 62 | } 63 | ... 64 | } 65 | ``` 66 | 67 | In the `processMesh` method we need to load also data for tangents and bitangents: 68 | 69 | ```java 70 | public class ModelLoader { 71 | ... 72 | private static Mesh processMesh(AIMesh aiMesh) { 73 | ... 74 | float[] tangents = processTangents(aiMesh, normals); 75 | float[] bitangents = processBitangents(aiMesh, normals); 76 | ... 77 | return new Mesh(vertices, normals, tangents, bitangents, textCoords, indices); 78 | } 79 | ... 80 | } 81 | ``` 82 | 83 | The `processTangents` and `processBitangents` methods are quite similar to the one that loads normals: 84 | 85 | ```java 86 | public class ModelLoader { 87 | ... 
88 | private static float[] processBitangents(AIMesh aiMesh, float[] normals) { 89 | 90 | AIVector3D.Buffer buffer = aiMesh.mBitangents(); 91 | float[] data = new float[buffer.remaining() * 3]; 92 | int pos = 0; 93 | while (buffer.remaining() > 0) { 94 | AIVector3D aiBitangent = buffer.get(); 95 | data[pos++] = aiBitangent.x(); 96 | data[pos++] = aiBitangent.y(); 97 | data[pos++] = aiBitangent.z(); 98 | } 99 | 100 | // Assimp may not calculate tangents with models that do not have texture coordinates. Just create empty values 101 | if (data.length == 0) { 102 | data = new float[normals.length]; 103 | } 104 | return data; 105 | } 106 | ... 107 | private static float[] processTangents(AIMesh aiMesh, float[] normals) { 108 | 109 | AIVector3D.Buffer buffer = aiMesh.mTangents(); 110 | float[] data = new float[buffer.remaining() * 3]; 111 | int pos = 0; 112 | while (buffer.remaining() > 0) { 113 | AIVector3D aiTangent = buffer.get(); 114 | data[pos++] = aiTangent.x(); 115 | data[pos++] = aiTangent.y(); 116 | data[pos++] = aiTangent.z(); 117 | } 118 | 119 | // Assimp may not calculate tangents with models that do not have texture coordinates. Just create empty values 120 | if (data.length == 0) { 121 | data = new float[normals.length]; 122 | } 123 | return data; 124 | } 125 | ... 126 | } 127 | ``` 128 | 129 | As you can see, we need to modify also `Mesh` and `Material` classes to hold the new data. Let's start by the `Mesh` class: 130 | 131 | ```java 132 | public class Mesh { 133 | ... 134 | public Mesh(float[] positions, float[] normals, float[] tangents, float[] bitangents, float[] textCoords, int[] indices) { 135 | ... 
136 | // Tangents VBO 137 | vboId = glGenBuffers(); 138 | vboIdList.add(vboId); 139 | FloatBuffer tangentsBuffer = MemoryUtil.memCallocFloat(tangents.length); 140 | tangentsBuffer.put(0, tangents); 141 | glBindBuffer(GL_ARRAY_BUFFER, vboId); 142 | glBufferData(GL_ARRAY_BUFFER, tangentsBuffer, GL_STATIC_DRAW); 143 | glEnableVertexAttribArray(2); 144 | glVertexAttribPointer(2, 3, GL_FLOAT, false, 0, 0); 145 | 146 | // Bitangents VBO 147 | vboId = glGenBuffers(); 148 | vboIdList.add(vboId); 149 | FloatBuffer bitangentsBuffer = MemoryUtil.memCallocFloat(bitangents.length); 150 | bitangentsBuffer.put(0, bitangents); 151 | glBindBuffer(GL_ARRAY_BUFFER, vboId); 152 | glBufferData(GL_ARRAY_BUFFER, bitangentsBuffer, GL_STATIC_DRAW); 153 | glEnableVertexAttribArray(3); 154 | glVertexAttribPointer(3, 3, GL_FLOAT, false, 0, 0); 155 | 156 | // Texture coordinates VBO 157 | ... 158 | glEnableVertexAttribArray(4); 159 | glVertexAttribPointer(4, 2, GL_FLOAT, false, 0, 0); 160 | ... 161 | MemoryUtil.memFree(tangentsBuffer); 162 | MemoryUtil.memFree(bitangentsBuffer); 163 | ... 164 | } 165 | ... 166 | } 167 | ``` 168 | 169 | We need to create two new VBOs for tangent and bitangent data (which follow a structure similar to the normals data) and therefore update the position of the texture coordinates VBO. 170 | 171 | In the `Material` class we need to include the path to the normal mapping texture path: 172 | 173 | ```java 174 | public class Material { 175 | ... 176 | private String normalMapPath; 177 | ... 178 | public String getNormalMapPath() { 179 | return normalMapPath; 180 | } 181 | ... 182 | public void setNormalMapPath(String normalMapPath) { 183 | this.normalMapPath = normalMapPath; 184 | } 185 | ... 
}
```

Now we need to modify the shaders, starting with the scene vertex shader (`scene.vert`):

```glsl
#version 330

layout (location=0) in vec3 position;
layout (location=1) in vec3 normal;
layout (location=2) in vec3 tangent;
layout (location=3) in vec3 bitangent;
layout (location=4) in vec2 texCoord;

out vec3 outPosition;
out vec3 outNormal;
out vec3 outTangent;
out vec3 outBitangent;
out vec2 outTextCoord;

uniform mat4 projectionMatrix;
uniform mat4 viewMatrix;
uniform mat4 modelMatrix;

void main()
{
    mat4 modelViewMatrix = viewMatrix * modelMatrix;
    vec4 mvPosition = modelViewMatrix * vec4(position, 1.0);
    gl_Position = projectionMatrix * mvPosition;
    outPosition = mvPosition.xyz;
    outNormal = normalize(modelViewMatrix * vec4(normal, 0.0)).xyz;
    outTangent = normalize(modelViewMatrix * vec4(tangent, 0)).xyz;
    outBitangent = normalize(modelViewMatrix * vec4(bitangent, 0)).xyz;
    outTextCoord = texCoord;
}
```

As you can see, we need to define the new input data associated with the tangent and bitangent. We transform those elements in the same way that we handle the normal and pass that data as an input to the fragment shader (`scene.frag`):

```glsl
#version 330
...
in vec3 outTangent;
in vec3 outBitangent;
...
struct Material
{
    vec4 ambient;
    vec4 diffuse;
    vec4 specular;
    float reflectance;
    int hasNormalMap;
};
...
uniform sampler2D normalSampler;
...
```

We start by defining the new inputs from the vertex shader, including an additional element for the `Material` struct which signals if there is a normal map available or not (`hasNormalMap`). We also add a new uniform for the normal map texture (`normalSampler`).
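The tangent-space math that the shader will perform can be sketched on the CPU first. A minimal stand-alone Java example (plain arrays instead of GLSL types; the class and helper names are made up for illustration): it decodes a normal-map texel from \[0, 1\] to \[-1, 1\] and multiplies it by a TBN matrix whose columns are the tangent, bitangent and normal.

```java
public class TbnDemo {

    // Decode an RGB-stored normal component from [0, 1] back to [-1, 1].
    static float decode(float c) {
        return c * 2.0f - 1.0f;
    }

    // Multiply the TBN matrix (columns: tangent, bitangent, normal) by v.
    static float[] tbnTransform(float[] t, float[] b, float[] n, float[] v) {
        float[] r = new float[3];
        for (int i = 0; i < 3; i++) {
            r[i] = t[i] * v[0] + b[i] * v[1] + n[i] * v[2];
        }
        return r;
    }

    public static void main(String[] args) {
        // A "flat" normal-map texel is (0.5, 0.5, 1.0): it decodes to (0, 0, 1).
        float[] texel = {decode(0.5f), decode(0.5f), decode(1.0f)};
        // Pick a surface whose normal points along +x in view space.
        float[] tangent   = {0, 1, 0};
        float[] bitangent = {0, 0, 1};
        float[] normal    = {1, 0, 0};
        // The decoded (0, 0, 1) maps onto the surface normal (1, 0, 0).
        float[] out = tbnTransform(tangent, bitangent, normal, texel);
        System.out.println(out[0] + " " + out[1] + " " + out[2]); // 1.0 0.0 0.0
    }
}
```

This is exactly what the GLSL `calcNormal` function below does with `mat3(tangent, bitangent, normal)`: a texel that points "straight out" in tangent space ends up pointing along the interpolated surface normal in view space.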
The next step is to define a function that updates the normal using the normal map texture:

```glsl
...
vec3 calcNormal(vec3 normal, vec3 tangent, vec3 bitangent, vec2 textCoords) {
    mat3 TBN = mat3(tangent, bitangent, normal);
    vec3 newNormal = texture(normalSampler, textCoords).rgb;
    newNormal = normalize(newNormal * 2.0 - 1.0);
    newNormal = normalize(TBN * newNormal);
    return newNormal;
}

void main() {
    vec4 text_color = texture(txtSampler, outTextCoord);
    vec4 ambient = calcAmbient(ambientLight, text_color + material.ambient);
    vec4 diffuse = text_color + material.diffuse;
    vec4 specular = text_color + material.specular;

    vec3 normal = outNormal;
    if (material.hasNormalMap > 0) {
        normal = calcNormal(outNormal, outTangent, outBitangent, outTextCoord);
    }

    vec4 diffuseSpecularComp = calcDirLight(diffuse, specular, dirLight, outPosition, normal);

    for (int i=0; i<MAX_POINT_LIGHTS; i++) {
        if (pointLights[i].intensity > 0) {
            diffuseSpecularComp += calcPointLight(diffuse, specular, pointLights[i], outPosition, normal);
        }
    }

    for (int i=0; i<MAX_SPOT_LIGHTS; i++) {
        if (spotLights[i].pl.intensity > 0) {
            diffuseSpecularComp += calcSpotLight(diffuse, specular, spotLights[i], outPosition, normal);
        }
    }
    fragColor = ambient + diffuseSpecularComp;

    if (fog.activeFog == 1) {
        fragColor = calcFog(outPosition, fragColor, fog, ambientLight.color, dirLight);
    }
}
```

The `calcNormal` function takes the following parameters:

* The vertex normal.
* The vertex tangent.
* The vertex bitangent.
* The texture coordinates.

The first thing we do in that function is calculate the TBN matrix. After that, we get the normal value from the normal map texture and use the TBN matrix to go from tangent space to view space.
Remember that the color we sample contains the normal coordinates, but since they are stored as RGB values they are contained in the range \[0, 1\]. We need to transform them to the range \[-1, 1\], so we just multiply by two and subtract 1.

Finally, we use that function only if the material defines a normal map texture.

We also need to modify the `SceneRender` class to create and use the new uniforms required by the shaders:

```java
public class SceneRender {
    ...
    private void createUniforms() {
        ...
        uniformsMap.createUniform("normalSampler");
        ...
        uniformsMap.createUniform("material.hasNormalMap");
        ...
    }

    public void render(Scene scene) {
        ...
        uniformsMap.setUniform("normalSampler", 1);
        ...
        for (Model model : models) {
            ...
            for (Material material : model.getMaterialList()) {
                ...
                String normalMapPath = material.getNormalMapPath();
                boolean hasNormalMapPath = normalMapPath != null;
                uniformsMap.setUniform("material.hasNormalMap", hasNormalMapPath ? 1 : 0);
                ...
                if (hasNormalMapPath) {
                    Texture normalMapTexture = textureCache.getTexture(normalMapPath);
                    glActiveTexture(GL_TEXTURE1);
                    normalMapTexture.bind();
                }
                ...
            }
        }
        ...
    }
    ...
}
```

We also need to update the skybox vertex shader, because there are now new attributes between the normal data and the texture coordinates \(the texture coordinates attribute moved to location 4\):

```glsl
#version 330

layout (location=0) in vec3 position;
layout (location=1) in vec3 normal;
layout (location=4) in vec2 texCoord;
...
```

The last step is to update the `Main` class to show this effect. We will load two quads, one with a normal map associated to it and one without. We will also use the left and right arrow keys to control the light angle and show the effect.
349 | 350 | ```java 351 | public class Main implements IAppLogic { 352 | ... 353 | public static void main(String[] args) { 354 | ... 355 | Engine gameEng = new Engine("chapter-14", new Window.WindowOptions(), main); 356 | ... 357 | } 358 | ... 359 | public void init(Window window, Scene scene, Render render) { 360 | String wallNoNormalsModelId = "quad-no-normals-model"; 361 | Model quadModelNoNormals = ModelLoader.loadModel(wallNoNormalsModelId, "resources/models/wall/wall_nonormals.obj", 362 | scene.getTextureCache()); 363 | scene.addModel(quadModelNoNormals); 364 | 365 | Entity wallLeftEntity = new Entity("wallLeftEntity", wallNoNormalsModelId); 366 | wallLeftEntity.setPosition(-3f, 0, 0); 367 | wallLeftEntity.setScale(2.0f); 368 | wallLeftEntity.updateModelMatrix(); 369 | scene.addEntity(wallLeftEntity); 370 | 371 | String wallModelId = "quad-model"; 372 | Model quadModel = ModelLoader.loadModel(wallModelId, "resources/models/wall/wall.obj", 373 | scene.getTextureCache()); 374 | scene.addModel(quadModel); 375 | 376 | Entity wallRightEntity = new Entity("wallRightEntity", wallModelId); 377 | wallRightEntity.setPosition(3f, 0, 0); 378 | wallRightEntity.setScale(2.0f); 379 | wallRightEntity.updateModelMatrix(); 380 | scene.addEntity(wallRightEntity); 381 | 382 | SceneLights sceneLights = new SceneLights(); 383 | sceneLights.getAmbientLight().setIntensity(0.2f); 384 | DirLight dirLight = sceneLights.getDirLight(); 385 | dirLight.setPosition(1, 1, 0); 386 | dirLight.setIntensity(1.0f); 387 | scene.setSceneLights(sceneLights); 388 | 389 | Camera camera = scene.getCamera(); 390 | camera.moveUp(5.0f); 391 | camera.addRotation((float) Math.toRadians(90), 0); 392 | 393 | lightAngle = -35; 394 | } 395 | ... 
    public void input(Window window, Scene scene, long diffTimeMillis, boolean inputConsumed) {
        if (inputConsumed) {
            return;
        }
        float move = diffTimeMillis * MOVEMENT_SPEED;
        Camera camera = scene.getCamera();
        if (window.isKeyPressed(GLFW_KEY_W)) {
            camera.moveForward(move);
        } else if (window.isKeyPressed(GLFW_KEY_S)) {
            camera.moveBackwards(move);
        }
        if (window.isKeyPressed(GLFW_KEY_A)) {
            camera.moveLeft(move);
        } else if (window.isKeyPressed(GLFW_KEY_D)) {
            camera.moveRight(move);
        }
        if (window.isKeyPressed(GLFW_KEY_LEFT)) {
            lightAngle -= 2.5f;
            if (lightAngle < -90) {
                lightAngle = -90;
            }
        } else if (window.isKeyPressed(GLFW_KEY_RIGHT)) {
            lightAngle += 2.5f;
            if (lightAngle > 90) {
                lightAngle = 90;
            }
        }

        MouseInput mouseInput = window.getMouseInput();
        if (mouseInput.isRightButtonPressed()) {
            Vector2f displVec = mouseInput.getDisplVec();
            camera.addRotation((float) Math.toRadians(-displVec.x * MOUSE_SENSITIVITY), (float) Math.toRadians(-displVec.y * MOUSE_SENSITIVITY));
        }

        SceneLights sceneLights = scene.getSceneLights();
        DirLight dirLight = sceneLights.getDirLight();
        double angRad = Math.toRadians(lightAngle);
        dirLight.getDirection().x = (float) Math.sin(angRad);
        dirLight.getDirection().y = (float) Math.cos(angRad);
    }
    ...
}
```

The result is shown in the next figure.

![Normal mapping result](normal_mapping_result.png)

As you can see, the quad that has a normal map applied gives the impression of having more volume. Although it is, in essence, a flat surface just like the other quad, you can see how the light reflects off it differently.
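The \[0, 1\] to \[-1, 1\] unpacking used for normal map texels can also be illustrated outside the shader. The following is a hypothetical Java helper (not part of the book's engine) that mirrors the "multiply by two and subtract one" step done in the fragment shader:

```java
public final class NormalUnpack {

    // Unpack a normal stored as an RGB value in [0, 1] back into the [-1, 1] range,
    // mirroring the step performed in the fragment shader.
    public static float[] unpack(float r, float g, float b) {
        return new float[]{r * 2.0f - 1.0f, g * 2.0f - 1.0f, b * 2.0f - 1.0f};
    }

    public static void main(String[] args) {
        // A "flat" normal map texel (0.5, 0.5, 1.0) maps back to the +Z normal (0, 0, 1),
        // which is why unperturbed normal maps look mostly blue.
        float[] n = unpack(0.5f, 0.5f, 1.0f);
        System.out.printf("%.1f %.1f %.1f%n", n[0], n[1], n[2]); // prints 0.0 0.0 1.0
    }
}
```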
[Next chapter](../chapter-15/chapter-15.md)
# Chapter 18 - 3D Object Picking

One of the key aspects of every game is the ability to interact with the environment. This capability requires being able to select objects in the 3D scene. In this chapter we will explore how this can be achieved.

You can find the complete source code for this chapter [here](https://github.com/lwjglgamedev/lwjglbook/tree/main/chapter-18).

## Concepts

We will add the capability to select entities by clicking the mouse on the screen. In order to do so, we will cast a ray from the camera position (our origin), using as its direction the point where we have clicked with the mouse (transformed from mouse coordinates to world coordinates).
With that ray we will check if it intersects the bounding boxes associated with each entity (a bounding box is a box that encloses the model associated to an entity).

![Object Picking](object_picking.svg)

We need to implement the following steps:

* Associate a bounding box to each model (to each mesh of the model, to be precise).
* Transform mouse coordinates to world space coordinates to cast a ray from the camera position.
* For each entity, iterate over its associated meshes and check if they intersect the ray.
* We will select the entity that intersects at the closest distance along the ray.
* If we have a selected entity we will highlight it in the fragment shader.

## Code preparation

We will start by calculating the bounding box for each mesh of the models we load. We will let [assimp](https://github.com/assimp/assimp) do this work for us by adding an additional flag when loading the models: `aiProcess_GenBoundingBoxes`. This flag will automatically calculate a bounding box for each mesh. That box will enclose all the vertices of the mesh and will be axis aligned. You may see the acronym "AABB" used for this, which means Axis Aligned Bounding Box. Why axis-aligned boxes? Because they simplify intersection calculations a lot. By using that flag, [assimp](https://github.com/assimp/assimp) will perform those calculations, and the results will be available as the corners of the bounding box (the minimum and maximum coordinates). The following figure shows how it would look for a cube.

![AABB](aabb.svg)

Once the calculation is enabled, we need to retrieve that information when processing the meshes:

```java
public class ModelLoader {
    ...
    public static Model loadModel(String modelId, String modelPath, TextureCache textureCache, boolean animation) {
        return loadModel(modelId, modelPath, textureCache, aiProcess_GenSmoothNormals | aiProcess_JoinIdenticalVertices |
                aiProcess_Triangulate | aiProcess_FixInfacingNormals | aiProcess_CalcTangentSpace | aiProcess_LimitBoneWeights |
                aiProcess_GenBoundingBoxes | (animation ? 0 : aiProcess_PreTransformVertices));
    }
    ...
    private static Mesh processMesh(AIMesh aiMesh, List<Bone> boneList) {
        ...
        AIAABB aabb = aiMesh.mAABB();
        Vector3f aabbMin = new Vector3f(aabb.mMin().x(), aabb.mMin().y(), aabb.mMin().z());
        Vector3f aabbMax = new Vector3f(aabb.mMax().x(), aabb.mMax().y(), aabb.mMax().z());

        return new Mesh(vertices, normals, tangents, bitangents, textCoords, indices, animMeshData.boneIds,
                animMeshData.weights, aabbMin, aabbMax);
    }
    ...
}
```

We need to store that information in the `Mesh` class:

```java
public class Mesh {
    ...
    private Vector3f aabbMax;
    private Vector3f aabbMin;
    ...
    public Mesh(float[] positions, float[] normals, float[] tangents, float[] bitangents, float[] textCoords, int[] indices) {
        this(positions, normals, tangents, bitangents, textCoords, indices,
                new int[Mesh.MAX_WEIGHTS * positions.length / 3], new float[Mesh.MAX_WEIGHTS * positions.length / 3],
                new Vector3f(), new Vector3f());
    }

    public Mesh(float[] positions, float[] normals, float[] tangents, float[] bitangents, float[] textCoords, int[] indices,
                int[] boneIndices, float[] weights, Vector3f aabbMin, Vector3f aabbMax) {
        try (MemoryStack stack = MemoryStack.stackPush()) {
            this.aabbMin = aabbMin;
            this.aabbMax = aabbMax;
            ...
        }
    }
    ...
    public Vector3f getAabbMax() {
        return aabbMax;
    }

    public Vector3f getAabbMin() {
        return aabbMin;
    }
    ...
}
```

While performing the ray intersection calculations we will need the inverse view and projection matrices in order to transform from screen space to world space coordinates. Therefore, we will modify the `Camera` and `Projection` classes to automatically calculate the inverse of their respective matrices whenever they are updated:

```java
public class Camera {
    ...
    private Matrix4f invViewMatrix;
    ...
    public Camera() {
        ...
        invViewMatrix = new Matrix4f();
        ...
    }
    ...
    public Matrix4f getInvViewMatrix() {
        return invViewMatrix;
    }
    ...
    private void recalculate() {
        viewMatrix.identity()
                .rotateX(rotation.x)
                .rotateY(rotation.y)
                .translate(-position.x, -position.y, -position.z);
        invViewMatrix.set(viewMatrix).invert();
    }
    ...
}
```

```java
public class Projection {
    ...
    private Matrix4f invProjMatrix;
    ...
    public Projection(int width, int height) {
        ...
        invProjMatrix = new Matrix4f();
        ...
    }

    public Matrix4f getInvProjMatrix() {
        return invProjMatrix;
    }
    ...
    public void updateProjMatrix(int width, int height) {
        projMatrix.setPerspective(FOV, (float) width / height, Z_NEAR, Z_FAR);
        invProjMatrix.set(projMatrix).invert();
    }
}
```

We will also need to store the selected `Entity` once we have done the calculations. We will do this in the `Scene` class:

```java
public class Scene {
    ...
    private Entity selectedEntity;
    ...
    public Entity getSelectedEntity() {
        return selectedEntity;
    }
    ...
    public void setSelectedEntity(Entity selectedEntity) {
        this.selectedEntity = selectedEntity;
    }
    ...
}
```

Finally, we will create a new uniform, used while rendering the scene, that will be activated if we are rendering a selected `Entity`:

```java
public class SceneRender {
    ...
    private void createUniforms() {
        ...
        uniformsMap.createUniform("selected");
    }

    public void render(Scene scene, ShadowRender shadowRender) {
        ...
        Entity selectedEntity = scene.getSelectedEntity();
        for (Model model : models) {
            List<Entity> entities = model.getEntitiesList();

            for (Material material : model.getMaterialList()) {
                ...
                for (Mesh mesh : material.getMeshList()) {
                    glBindVertexArray(mesh.getVaoId());
                    for (Entity entity : entities) {
                        uniformsMap.setUniform("selected",
                                selectedEntity != null && selectedEntity.getId().equals(entity.getId()) ? 1 : 0);
                        ...
                    }
                    ...
                }
                ...
            }
        }
        ...
    }
    ...
}
```

In the fragment shader (`scene.frag`), we will just modify the blue component of the fragments that belong to a selected entity:

```glsl
#version 330
...
uniform int selected;
...
void main() {
    ...
    if (selected > 0) {
        fragColor = vec4(fragColor.x, fragColor.y, 1, 1);
    }
}
```

## Entity selection

We can now proceed with the code that determines whether an `Entity` must be selected. In the `Main` class, in the `input` method, we will check if the left mouse button has been pressed. If so, we will invoke a new method (`selectEntity`) where we will do the calculations:

```java
public class Main implements IAppLogic {
    ...
    public void input(Window window, Scene scene, long diffTimeMillis, boolean inputConsumed) {
        ...
        if (mouseInput.isLeftButtonPressed()) {
            selectEntity(window, scene, mouseInput.getCurrentPos());
        }
        ...
    }
    ...
}
```

The `selectEntity` method starts like this:

```java
public class Main implements IAppLogic {
    ...
    private void selectEntity(Window window, Scene scene, Vector2f mousePos) {
        int wdwWidth = window.getWidth();
        int wdwHeight = window.getHeight();

        float x = (2 * mousePos.x) / wdwWidth - 1.0f;
        float y = 1.0f - (2 * mousePos.y) / wdwHeight;
        float z = -1.0f;

        Matrix4f invProjMatrix = scene.getProjection().getInvProjMatrix();
        Vector4f mouseDir = new Vector4f(x, y, z, 1.0f);
        mouseDir.mul(invProjMatrix);
        mouseDir.z = -1.0f;
        mouseDir.w = 0.0f;

        Matrix4f invViewMatrix = scene.getCamera().getInvViewMatrix();
        mouseDir.mul(invViewMatrix);
        ...
    }
    ...
}
```

We need to calculate the direction vector using the click coordinates. But how do we pass from $$(x,y)$$ coordinates in viewport space to world space? Let’s review how we pass from model space coordinates to screen space. The different coordinate transformations applied in order to achieve that are:

* We pass from model coordinates to world coordinates using the model matrix.
* We pass from world coordinates to view space coordinates using the view matrix (which provides the camera effect).
* We pass from view coordinates to homogeneous clip space by applying the perspective projection matrix.
* Final screen coordinates are calculated automatically by OpenGL for us. Before doing that, it passes to normalized device space (by dividing the $$x, y, z$$ coordinates by the $$w$$ component) and then to $$x, y$$ screen coordinates.

So we just need to traverse the inverse path to get from screen coordinates $$(x,y)$$ to world coordinates.

The first step is to transform from screen coordinates to normalized device space.
The $$(x, y)$$ coordinates in viewport space are in the range $$[0, screenwidth]$$ and $$[0, screenheight]$$. The upper left corner of the screen has the coordinates $$(0, 0)$$. We need to transform them into coordinates in the range $$[-1, 1]$$.

![Screen coordinates to normalized device space](screen_coordinates.png)

The maths are simple:

$$x = 2 \cdot screen_x / screenwidth - 1$$

$$y = 1 - 2 \cdot screen_y / screenheight$$

But how do we calculate the $$z$$ component? The answer is simple: we just assign it the value $$-1$$, so that the ray points to the farthest visible distance (remember that in OpenGL, $$-1$$ points into the screen). Now we have the coordinates in normalized device space.

In order to continue with the transformations we need to convert them to homogeneous clip space. We need to have a $$w$$ component, that is, to use homogeneous coordinates. Although this concept was presented in previous chapters, let’s get back to it. In order to represent a 3D point we just need the $$x$$, $$y$$ and $$z$$ components, but we are continuously working with an additional component, the $$w$$ component. We need this extra component in order to use matrices to perform the different transformations. Some transformations do not need that extra component but others do. For instance, the translation matrix does not work if we only have $$x$$, $$y$$ and $$z$$ components. Thus, we have added the $$w$$ component and assigned it a value of $$1$$ so we can work with 4 by 4 matrices.

Besides that, most transformations, or to be more precise, most transformation matrices, do not alter the $$w$$ component. An exception to this is the projection matrix. This matrix changes the $$w$$ value to be proportional to the $$z$$ component.
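The screen-to-NDC formulas above can be captured in a tiny helper. This is a hypothetical sketch purely to illustrate the math, not code from the book's engine:

```java
public final class NdcConvert {

    // Convert viewport coordinates (origin at the top-left corner, y growing
    // downwards) to normalized device coordinates in the [-1, 1] range.
    public static float[] toNdc(float screenX, float screenY, int width, int height) {
        float x = 2.0f * screenX / width - 1.0f;
        float y = 1.0f - 2.0f * screenY / height;
        return new float[]{x, y};
    }

    public static void main(String[] args) {
        // The centre of an 800x600 viewport maps to the NDC origin.
        float[] ndc = toNdc(400, 300, 800, 600);
        System.out.println(ndc[0] + ", " + ndc[1]); // prints 0.0, 0.0
    }
}
```

Note how the $$y$$ axis is flipped: viewport coordinates grow downwards, while NDC coordinates grow upwards.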
Transforming from homogeneous clip space to normalized device coordinates is achieved by dividing the $$x$$, $$y$$ and $$z$$ components by $$w$$. As this component is proportional to the $$z$$ component, this implies that distant objects are drawn smaller. In our case we need to do the reverse: we need to unproject. But since what we are calculating is a ray, we can simply ignore that step, set the $$w$$ component to $$1$$ and leave the rest of the components at their original values.

Now we need to go back to view space. This is easy: we just need to calculate the inverse of the projection matrix and multiply it by our 4-component vector. Once we have done that, we need to transform the result to world space. Again, we just need to take the view matrix, calculate its inverse and multiply it by our vector.

Remember that we are only interested in directions, so in this case we set the $$w$$ component to $$0$$. We can also set the $$z$$ component again to $$-1$$, since we want the ray to point towards the screen. Once we have done that and applied the inverse view matrix, we have our vector in world space.

The next step is to iterate over the entities and their associated meshes and check if their bounding boxes intersect with the ray which starts at the camera position:
```java
public class Main implements IAppLogic {
    ...
    private void selectEntity(Window window, Scene scene, Vector2f mousePos) {
        ...
        Vector4f min = new Vector4f(0.0f, 0.0f, 0.0f, 1.0f);
        Vector4f max = new Vector4f(0.0f, 0.0f, 0.0f, 1.0f);
        Vector2f nearFar = new Vector2f();

        Entity selectedEntity = null;
        float closestDistance = Float.POSITIVE_INFINITY;
        Vector3f center = scene.getCamera().getPosition();

        Collection<Model> models = scene.getModelMap().values();
        Matrix4f modelMatrix = new Matrix4f();
        for (Model model : models) {
            List<Entity> entities = model.getEntitiesList();
            for (Entity entity : entities) {
                modelMatrix.translate(entity.getPosition()).scale(entity.getScale());
                for (Material material : model.getMaterialList()) {
                    for (Mesh mesh : material.getMeshList()) {
                        Vector3f aabbMin = mesh.getAabbMin();
                        min.set(aabbMin.x, aabbMin.y, aabbMin.z, 1.0f);
                        min.mul(modelMatrix);
                        Vector3f aabMax = mesh.getAabbMax();
                        max.set(aabMax.x, aabMax.y, aabMax.z, 1.0f);
                        max.mul(modelMatrix);
                        if (Intersectionf.intersectRayAab(center.x, center.y, center.z, mouseDir.x, mouseDir.y, mouseDir.z,
                                min.x, min.y, min.z, max.x, max.y, max.z, nearFar) && nearFar.x < closestDistance) {
                            closestDistance = nearFar.x;
                            selectedEntity = entity;
                        }
                    }
                }
                modelMatrix.identity();
            }
        }
    }
    ...
}
```

We define a variable named `closestDistance` which holds the closest intersection distance found so far. For the entities that intersect, the distance from the camera to the intersection point is calculated. If it is lower than the value stored in `closestDistance`, that entity becomes the new candidate. We need to translate and scale the bounding box of each mesh. We cannot use the entity's model matrix as it is, because it also takes the rotation into account (we do not want that, since we want the box to stay axis aligned). This is why we apply just translation and scaling, using the entity's data, to construct a model matrix.
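Because only translation and uniform scaling are applied, transforming an AABB corner boils down to a multiply and an add per axis. A minimal sketch with a hypothetical helper (the book's code achieves the same effect through the translate-and-scale model matrix):

```java
public final class AabbTransform {

    // Apply only translation and uniform scaling to an AABB corner, which is all
    // we can do while keeping the box axis aligned (rotation would break that).
    public static float[] transformCorner(float[] corner, float scale, float[] position) {
        return new float[]{
                corner[0] * scale + position[0],
                corner[1] * scale + position[1],
                corner[2] * scale + position[2]
        };
    }

    public static void main(String[] args) {
        // A unit cube corner (-1, -1, -1), scaled by 2 and moved to x = 3.
        float[] min = transformCorner(new float[]{-1, -1, -1}, 2.0f, new float[]{3, 0, 0});
        System.out.println(min[0] + " " + min[1] + " " + min[2]); // prints 1.0 -2.0 -2.0
    }
}
```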
But how do we calculate the intersection? This is where the glorious [JOML](https://github.com/JOML-CI/JOML "JOML") library comes to the rescue. We are using [JOML](https://github.com/JOML-CI/JOML "JOML")’s `Intersectionf` class, which provides several methods to calculate intersections in 2D and 3D. Specifically, we are using the `intersectRayAab` method.

This method implements an algorithm that tests intersections against axis-aligned boxes. You can check the details, as pointed out in the JOML documentation, [here](http://people.csail.mit.edu/amy/papers/box-jgt.pdf "here").

The method tests if a ray, defined by an origin and a direction, intersects a box, defined by its minimum and maximum corners. As said before, this algorithm is valid because our boxes are axis aligned; if they were rotated, the method would not work. In addition to that, when using animations you may need different bounding boxes per animation frame (assimp calculates the bounding box for the binding pose). The `intersectRayAab` method receives the following parameters:

* An origin: in our case, this will be our camera position.
* A direction: the ray that points to the mouse coordinates (in world space).
* The minimum corner of the box.
* The maximum corner of the box.
* A result vector: this will contain the near and far distances of the intersection points.

The method returns true if there is an intersection. In that case, we compare the closest intersection distance with the best one found so far, update it if needed, and store a reference to the selected candidate.

Obviously, the method presented here is far from optimal, but it will give you the basics to develop more sophisticated approaches on your own. Some parts of the scene could easily be discarded, like objects behind the camera, since they are not going to be intersected.
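For reference, the slab test on which `intersectRayAab` is based can be sketched in plain Java. This is a simplified illustration, not JOML's actual implementation, and it assumes the ray origin does not lie exactly on a slab plane:

```java
public final class RaySlab {

    // Slab-based ray/AABB test: intersect the ray against each pair of parallel
    // planes and keep the overlapping parameter interval [tNear, tFar].
    public static boolean intersect(float[] origin, float[] dir, float[] min, float[] max) {
        float tNear = Float.NEGATIVE_INFINITY;
        float tFar = Float.POSITIVE_INFINITY;
        for (int i = 0; i < 3; i++) {
            float inv = 1.0f / dir[i];
            float t1 = (min[i] - origin[i]) * inv;
            float t2 = (max[i] - origin[i]) * inv;
            // The smaller value is the entry distance for this axis, the larger the exit.
            tNear = Math.max(tNear, Math.min(t1, t2));
            tFar = Math.min(tFar, Math.max(t1, t2));
        }
        // There is a hit if the per-axis intervals overlap in front of the ray origin.
        return tNear <= tFar && tFar >= 0;
    }

    public static void main(String[] args) {
        // A ray from z = 5 pointing down the negative z axis hits the unit box.
        float[] origin = {0, 0, 5};
        float[] dir = {0, 0, -1};
        System.out.println(intersect(origin, dir, new float[]{-1, -1, -1}, new float[]{1, 1, 1})); // prints true
    }
}
```

The `nearFar.x` value used in the book's code corresponds to `tNear` here: the distance to the entry point, which is what we compare against `closestDistance`.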
Besides that, you may want to order your items according to their distance to the camera to speed up the calculations.

We will modify the `Main` class to show two spinning cubes to illustrate the technique:

```java
public class Main implements IAppLogic {
    ...
    private Entity cubeEntity1;
    private Entity cubeEntity2;
    ...
    private float rotation;

    public static void main(String[] args) {
        ...
        Engine gameEng = new Engine("chapter-18", opts, main);
        ...
    }
    ...
    public void init(Window window, Scene scene, Render render) {
        ...
        Model cubeModel = ModelLoader.loadModel("cube-model", "resources/models/cube/cube.obj",
                scene.getTextureCache(), false);
        scene.addModel(cubeModel);
        cubeEntity1 = new Entity("cube-entity-1", cubeModel.getId());
        cubeEntity1.setPosition(0, 2, -1);
        scene.addEntity(cubeEntity1);

        cubeEntity2 = new Entity("cube-entity-2", cubeModel.getId());
        cubeEntity2.setPosition(-2, 2, -1);
        scene.addEntity(cubeEntity2);
        ...
    }
    ...
    public void update(Window window, Scene scene, long diffTimeMillis) {
        rotation += 1.5;
        if (rotation > 360) {
            rotation = 0;
        }
        cubeEntity1.setRotation(1, 1, 1, (float) Math.toRadians(rotation));
        cubeEntity1.updateModelMatrix();

        cubeEntity2.setRotation(1, 1, 1, (float) Math.toRadians(360 - rotation));
        cubeEntity2.updateModelMatrix();
    }
}
```

You will be able to see how the cubes are rendered in blue when clicked with the mouse:

![Selected cube](screenshot.png)

[Next chapter](../chapter-19/chapter-19.md)