
Commit 3a2eb77 (1 parent: 693ce00)

Refactor of changes due to chapter 10

File tree

37 files changed: +304, -180 lines


bookcontents/README.md

Lines changed: 1 addition & 1 deletion

@@ -12,4 +12,4 @@ The book is structured in the following chapters:
 - [Chapter 07](chapter-07/chapter-07.md): In this chapter we go 3D by implementing depth testing and add windows resizing support.
 - [Chapter 08](chapter-08/chapter-08.md): In this chapter we add support for loading complex 3D models using Assimp and textures.
 - [Chapter 09](chapter-09/chapter-09.md): We will automatically generate mipmaps, add support for transparent objects, add a camera to move around the scene and use dynamic uniform objects.
-- [Chapter 10](chapter-09/chapter-10.md): Deferred rendering (Draft code, synchronization needs to be reviewed).
+- [Chapter 10](chapter-09/chapter-10.md): Deferred rendering.

bookcontents/chapter-04/chapter-04.md

Lines changed: 9 additions & 1 deletion

@@ -142,7 +142,15 @@ public class SwapChain {
 }
 ```

-The first thing we do is retrieve the number of formats our surface supports by calling the `vkGetPhysicalDeviceSurfaceFormatsKHR` Vulkan function. As with many other Vulkan samples, we first call that function to get the total number of formats supported and then we create a buffer of structures, `VkSurfaceFormatKHR` in this case, to retrieve the data by calling the same function again. Once we have all that data, we iterate over the formats trying to check if `VK_FORMAT_B8G8R8A8_UNORM`(8 bits for RGBA channels normalized) and SRGB non linear color space are supported. `SurfaceFormat` is just a `record` which stores the image format and the color space.
+The first thing we do is retrieve the number of formats our surface supports by calling the `vkGetPhysicalDeviceSurfaceFormatsKHR` Vulkan function. As with many other Vulkan samples, we first call that function to get the total number of supported formats, and then we create a buffer of structures, `VkSurfaceFormatKHR` in this case, to retrieve the data by calling the same function again. Once we have all that data, we iterate over the formats, checking whether `VK_FORMAT_B8G8R8A8_UNORM` (8 bits per RGBA channel, normalized) and the sRGB non-linear color space are supported. `SurfaceFormat` is just a `record` which stores the image format and the color space:
+```java
+public class SwapChain {
+    ...
+    public record SurfaceFormat(int imageFormat, int colorSpace) {
+    }
+    ...
+}
+```

 It is now time to go back to the constructor. We need to calculate the extent of the images of the swap chain:
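Since the selection loop itself is not shown in this diff, here is a rough, dependency-free sketch of the logic the paragraph above describes. It uses plain `int` pairs instead of LWJGL's `VkSurfaceFormatKHR` buffer; the class and method names are illustrative stand-ins (not the book's actual API), while the constant values match Vulkan's `VK_FORMAT_B8G8R8A8_UNORM` (44) and `VK_COLOR_SPACE_SRGB_NONLINEAR_KHR` (0):

```java
// Sketch of the surface-format selection described above: prefer
// B8G8R8A8_UNORM with the sRGB non-linear color space, otherwise
// fall back to the first format the surface reports.
public class SurfaceFormatSelector {

    public static final int VK_FORMAT_B8G8R8A8_UNORM = 44;
    public static final int VK_COLOR_SPACE_SRGB_NONLINEAR_KHR = 0;

    public record SurfaceFormat(int imageFormat, int colorSpace) {
    }

    public static SurfaceFormat select(SurfaceFormat[] supported) {
        SurfaceFormat result = supported[0]; // fallback: first reported format
        for (SurfaceFormat fmt : supported) {
            if (fmt.imageFormat() == VK_FORMAT_B8G8R8A8_UNORM
                    && fmt.colorSpace() == VK_COLOR_SPACE_SRGB_NONLINEAR_KHR) {
                result = fmt;
                break;
            }
        }
        return result;
    }
}
```

A real implementation would fill the `supported` array from the second `vkGetPhysicalDeviceSurfaceFormatsKHR` call, as the paragraph explains.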

bookcontents/chapter-05/chapter-05.md

Lines changed: 8 additions & 10 deletions

@@ -541,17 +541,15 @@ public class SwapChain {
 ```

 We create two types of semaphores:
-
-- `imgAcquisitionSemaphores`: They will be used to signal image acquisition.
-
-- `renderCompleteSemaphores`: They will be used to signal that the command submitted have been completed.
+- `imgAcquisitionSemaphore`: It will be used to signal image acquisition.
+- `renderCompleteSemaphore`: It will be used to signal that the commands submitted have been completed.

 These semaphores are stored together under a record:

 ```java
 public class SwapChain {
     ...
-    public record SyncSemaphores(Semaphore imgAcquisitionSemaphores, Semaphore renderCompleteSemaphores) {
+    public record SyncSemaphores(Semaphore imgAcquisitionSemaphore, Semaphore renderCompleteSemaphore) {
     }
     ...
 }
@@ -567,7 +565,7 @@ public class SwapChain {
 try (MemoryStack stack = MemoryStack.stackPush()) {
     IntBuffer ip = stack.mallocInt(1);
     int err = KHRSwapchain.vkAcquireNextImageKHR(device.getVkDevice(), vkSwapChain, ~0L,
-            syncSemaphoresList[currentFrame].imgAcquisitionSemaphores().getVkSemaphore(), MemoryUtil.NULL, ip);
+            syncSemaphoresList[currentFrame].imgAcquisitionSemaphore().getVkSemaphore(), MemoryUtil.NULL, ip);
     if (err == KHRSwapchain.VK_ERROR_OUT_OF_DATE_KHR) {
         resize = true;
     } else if (err == KHRSwapchain.VK_SUBOPTIMAL_KHR) {
@@ -604,7 +602,7 @@ public class SwapChain {
 VkPresentInfoKHR present = VkPresentInfoKHR.callocStack(stack)
         .sType(KHRSwapchain.VK_STRUCTURE_TYPE_PRESENT_INFO_KHR)
         .pWaitSemaphores(stack.longs(
-                syncSemaphoresList[currentFrame].renderCompleteSemaphores().getVkSemaphore()))
+                syncSemaphoresList[currentFrame].renderCompleteSemaphore().getVkSemaphore()))
         .swapchainCount(1)
         .pSwapchains(stack.longs(vkSwapChain))
         .pImageIndices(stack.ints(currentFrame));
@@ -740,17 +738,17 @@ public class ForwardRenderActivity {
 currentFence.reset();
 SwapChain.SyncSemaphores syncSemaphores = swapChain.getSyncSemaphoresList()[idx];
 queue.submit(stack.pointers(commandBuffer.getVkCommandBuffer()),
-        stack.longs(syncSemaphores.imgAcquisitionSemaphores().getVkSemaphore()),
+        stack.longs(syncSemaphores.imgAcquisitionSemaphore().getVkSemaphore()),
         stack.ints(VK_PIPELINE_STAGE_COLOR_ATTACHMENT_OUTPUT_BIT),
-        stack.longs(syncSemaphores.renderCompleteSemaphores().getVkSemaphore()), currentFence);
+        stack.longs(syncSemaphores.renderCompleteSemaphore().getVkSemaphore()), currentFence);

 }
 }
 ...
 }
 ```

-This method, gets the `CommandBuffer` instance that should be used for the frame that we are in (for the image that we are rendering to). It also gets the `Fence`associated to that `CommandBuffer`. We invoke the `fenceAit` and the `reset` one to prevent submitting a `CommandBuffer` which is still been used. Finally, we submit the command to the queue, retrieving also the semaphore that has been used to signal the acquisition of the current swap chain image. This is done by invoking a new method in the `Queue` class, named `submit`. The meaning of the arguments of this method will be explained when we analyzed its definition:
+This method gets the `CommandBuffer` instance that should be used for the frame that we are in (for the image that we are rendering to). It also gets the `Fence` associated to that `CommandBuffer`. We invoke `fenceWait` and then `reset` to prevent submitting a `CommandBuffer` which is still being used. Finally, we submit the command to the queue, also passing the semaphore that has been used to signal the acquisition of the current swap chain image. This is done by invoking a new method in the `Queue` class, named `submit`. The meaning of the arguments of this method will be explained when we analyze its definition:

 ```java
 public class Queue {
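The acquire/submit/present flow above relies on one pair of semaphores per swap chain image, with a frame index that wraps around. As a rough, Vulkan-free sketch of that bookkeeping (the class, stand-in `long` handles and method names are illustrative, not the book's API):

```java
// Illustrative sketch of per-frame synchronization bookkeeping: each frame
// owns an image-acquisition semaphore and a render-complete semaphore, and
// the current frame index wraps over the number of swap chain images.
public class FrameSync {

    public record SyncSemaphores(long imgAcquisitionSemaphore, long renderCompleteSemaphore) {
    }

    private final SyncSemaphores[] syncSemaphoresList;
    private int currentFrame;

    public FrameSync(int numImages) {
        syncSemaphoresList = new SyncSemaphores[numImages];
        for (int i = 0; i < numImages; i++) {
            // Stand-in handles; real code would call vkCreateSemaphore twice per image
            syncSemaphoresList[i] = new SyncSemaphores(i * 2L, i * 2L + 1);
        }
    }

    public SyncSemaphores current() {
        return syncSemaphoresList[currentFrame];
    }

    // Advance after presenting, wrapping over the number of swap chain images
    public int next() {
        currentFrame = (currentFrame + 1) % syncSemaphoresList.length;
        return currentFrame;
    }
}
```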

bookcontents/chapter-06/chapter-06.md

Lines changed: 26 additions & 12 deletions

@@ -159,14 +159,33 @@ public class VulkanBuffer {

 ## Vertex description

-We have now created the buffers required to hold the data for vertices, the next step is to describe to Vulkan the format of that data. In order to do that, we will create a new class named `VertexBufferStructure` which will be used by Vulkan to know hot to extract that data from the underlying buffer. The class starts like this:
+We have now created the buffers required to hold the data for vertices; the next step is to describe to Vulkan the format of that data. As you can guess, depending on the specific case, the structure of that data may change: we may have just position coordinates, or positions with texture coordinates and normals, etc. Some of the Vulkan elements that we will define later on will need a handle to that structure. In order to support this, we will create an abstract class named `VertexInputStateInfo`, which just stores the handle to a `VkPipelineVertexInputStateCreateInfo` structure:
+```java
+package org.vulkanb.eng.graph.vk;
+
+import org.lwjgl.vulkan.VkPipelineVertexInputStateCreateInfo;
+
+public abstract class VertexInputStateInfo {
+
+    protected VkPipelineVertexInputStateCreateInfo vi;
+
+    public void cleanup() {
+        vi.free();
+    }
+
+    public VkPipelineVertexInputStateCreateInfo getVi() {
+        return vi;
+    }
+}
+```
+
+Now we can extend from that class to define specific vertex formats. We will create a new class named `VertexBufferStructure` which will be used by Vulkan to know how to extract that data from the underlying buffer. The class starts like this:

 ```java
-public class VertexBufferStructure {
+public class VertexBufferStructure extends VertexInputStateInfo {

     private static final int NUMBER_OF_ATTRIBUTES = 1;
     private static final int POSITION_COMPONENTS = 3;
-    private VkPipelineVertexInputStateCreateInfo vi;
     private VkVertexInputAttributeDescription.Buffer viAttrs;
     private VkVertexInputBindingDescription.Buffer viBindings;

@@ -259,15 +278,10 @@ The rest of the methods are the usual suspects, the `cleanup` one to free the r
 public class VertexBufferStructure {
 ...
 public void cleanup() {
-    vi.free();
+    super.cleanup();
     viBindings.free();
     viAttrs.free();
 }
-
-public VkPipelineVertexInputStateCreateInfo getVi() {
-    return vi;
-}
-
 ...
 }
 ```

@@ -750,9 +764,9 @@ Now it is the turn to create the pipeline, which will be encapsulated in a new c
 ```java
 public class Pipeline {
 ...
-public record PipeLineCreationInfo(long vkRenderPass, ShaderProgram shaderProgram, int numColorAttachments, VertexBufferStructure vertexBufferStructure) {
+public record PipeLineCreationInfo(long vkRenderPass, ShaderProgram shaderProgram, int numColorAttachments, VertexInputStateInfo viInputStateInfo) {
     public void cleanup() {
-        vertexBufferStructure.cleanup();
+        viInputStateInfo.cleanup();
     }
 }
 ...
@@ -945,7 +959,7 @@ public class Pipeline {
 VkGraphicsPipelineCreateInfo.Buffer pipeline = VkGraphicsPipelineCreateInfo.callocStack(1, stack)
         .sType(VK_STRUCTURE_TYPE_GRAPHICS_PIPELINE_CREATE_INFO)
         .pStages(shaderStages)
-        .pVertexInputState(pipeLineCreationInfo.vertexBufferStructure().getVi())
+        .pVertexInputState(pipeLineCreationInfo.viInputStateInfo().getVi())
         .pInputAssemblyState(vkPipelineInputAssemblyStateCreateInfo)
         .pViewportState(vkPipelineViewportStateCreateInfo)
         .pRasterizationState(vkPipelineRasterizationStateCreateInfo)
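The refactor above moves the `vi` handle, `getVi` and part of `cleanup` into the base class, with the subclass releasing its own buffers and delegating the rest via `super.cleanup()`. A minimal, Vulkan-free sketch of that ownership split (class and field names here are simplified stand-ins, with booleans in place of the native handles):

```java
// Sketch of the base-class/subclass cleanup split used by VertexInputStateInfo:
// the base frees the shared resource, the subclass frees its extras and
// delegates to super.cleanup().
public class CleanupDemo {

    public static abstract class VertexInputStateInfo {
        public boolean viFreed; // stands in for the VkPipelineVertexInputStateCreateInfo handle

        public void cleanup() {
            viFreed = true; // real code: vi.free()
        }
    }

    public static class VertexBufferStructure extends VertexInputStateInfo {
        public boolean bindingsFreed;
        public boolean attrsFreed;

        @Override
        public void cleanup() {
            super.cleanup();      // frees the shared vi handle
            bindingsFreed = true; // real code: viBindings.free()
            attrsFreed = true;    // real code: viAttrs.free()
        }
    }
}
```

The design benefit mirrored here is the one the diff exploits: `Pipeline` can hold any `VertexInputStateInfo` and call `cleanup`/`getVi` without knowing the concrete vertex format.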

bookcontents/chapter-07/chapter-07.md

Lines changed: 11 additions & 4 deletions

@@ -178,6 +178,7 @@ import static org.lwjgl.vulkan.VK11.*;

 public class Attachment {

+    private boolean depthAttachment;
     private Image image;
     private ImageView imageView;

@@ -187,9 +188,11 @@ public class Attachment {
 int aspectMask = 0;
 if ((usage & VK_IMAGE_USAGE_COLOR_ATTACHMENT_BIT) > 0) {
     aspectMask = VK_IMAGE_ASPECT_COLOR_BIT;
+    depthAttachment = false;
 }
 if ((usage & VK_IMAGE_USAGE_DEPTH_STENCIL_ATTACHMENT_BIT) > 0) {
     aspectMask = VK_IMAGE_ASPECT_DEPTH_BIT;
+    depthAttachment = true;
 }

 imageView = new ImageView(device, image.getVkImage(), image.getFormat(), aspectMask, 1);
@@ -207,16 +210,20 @@ public class Attachment {
     public ImageView getImageView() {
         return imageView;
     }
+
+    public boolean isDepthAttachment() {
+        return depthAttachment;
+    }
 }
 ```
-We just create and image and the associated image view. Depending on the type of image (color or depth image), we setup the aspect mask accordingly.
+We just create an image and the associated image view. Depending on the type of image (color or depth), we set up the aspect mask accordingly. We have also defined a `boolean` attribute, named `depthAttachment`, to identify whether it is a depth attachment or not.

 ## Changing vertices structure

 In the previous chapter, we defined the structure of our vertices, which basically stated that our vertices were composed of x, y and z positions. Therefore, we would not need anything more to display 3D models. However, displaying a 3D model using just a single color (without shadows or light effects) makes it difficult to verify whether the model is being loaded properly. So, we will add extra components that we will reuse in the next chapters: texture coordinates. Although we will not be handling textures in this chapter, we can use those components to pass some color information (at least for two color channels). We need to modify the `VertexBufferStructure` in this way:

 ```java
-public class VertexBufferStructure {
+public class VertexBufferStructure extends VertexInputStateInfo {

     public static final int TEXT_COORD_COMPONENTS = 2;
     private static final int NUMBER_OF_ATTRIBUTES = 2;
@@ -568,9 +575,9 @@ public class Pipeline {
 ...
 public record PipeLineCreationInfo(long vkRenderPass, ShaderProgram shaderProgram, int numColorAttachments,
                                    boolean hasDepthAttachment, int pushConstantsSize,
-                                   VertexBufferStructure vertexBufferStructure) {
+                                   VertexInputStateInfo viInputStateInfo) {
     public void cleanup() {
-        vertexBufferStructure.cleanup();
+        viInputStateInfo.cleanup();
     }
 }
 ...
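The usage-bit test in `Attachment` is plain bit masking. Here is a self-contained sketch of that decision, using the real Vulkan constant values (`VK_IMAGE_USAGE_COLOR_ATTACHMENT_BIT` = 0x10, `VK_IMAGE_USAGE_DEPTH_STENCIL_ATTACHMENT_BIT` = 0x20, aspect bits 0x1 and 0x2); the helper class and method are illustrative, not the book's API:

```java
// Sketch of Attachment's aspect-mask selection: the image usage flags decide
// both the aspect mask and whether this is a depth attachment.
public class AspectMaskDemo {

    public static final int VK_IMAGE_USAGE_COLOR_ATTACHMENT_BIT = 0x10;
    public static final int VK_IMAGE_USAGE_DEPTH_STENCIL_ATTACHMENT_BIT = 0x20;
    public static final int VK_IMAGE_ASPECT_COLOR_BIT = 0x1;
    public static final int VK_IMAGE_ASPECT_DEPTH_BIT = 0x2;

    public record AspectInfo(int aspectMask, boolean depthAttachment) {
    }

    public static AspectInfo fromUsage(int usage) {
        int aspectMask = 0;
        boolean depthAttachment = false;
        if ((usage & VK_IMAGE_USAGE_COLOR_ATTACHMENT_BIT) > 0) {
            aspectMask = VK_IMAGE_ASPECT_COLOR_BIT;
        }
        if ((usage & VK_IMAGE_USAGE_DEPTH_STENCIL_ATTACHMENT_BIT) > 0) {
            aspectMask = VK_IMAGE_ASPECT_DEPTH_BIT;
            depthAttachment = true;
        }
        return new AspectInfo(aspectMask, depthAttachment);
    }
}
```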

bookcontents/chapter-08/chapter-08.md

Lines changed: 17 additions & 12 deletions

@@ -1028,7 +1028,7 @@ public class MatrixDescriptorSetLayout extends DescriptorSetLayout {

 private static final Logger LOGGER = LogManager.getLogger();

-public MatrixDescriptorSetLayout(Device device, int binding) {
+public MatrixDescriptorSetLayout(Device device, int binding, int stage) {
     super(device);

     LOGGER.debug("Creating matrix descriptor set layout");
@@ -1039,7 +1039,7 @@ public class MatrixDescriptorSetLayout extends DescriptorSetLayout {
         .binding(binding)
         .descriptorType(VK_DESCRIPTOR_TYPE_UNIFORM_BUFFER)
         .descriptorCount(1)
-        .stageFlags(VK_SHADER_STAGE_VERTEX_BIT);
+        .stageFlags(stage);

 VkDescriptorSetLayoutCreateInfo layoutInfo = VkDescriptorSetLayoutCreateInfo.callocStack(stack)
         .sType(VK_STRUCTURE_TYPE_DESCRIPTOR_SET_LAYOUT_CREATE_INFO)
@@ -1054,7 +1054,7 @@ public class MatrixDescriptorSetLayout extends DescriptorSetLayout {
 }
 ```

-The code is quite similar to the one used in textures, but in this case we are using a different descriptor type: `VK_DESCRIPTOR_TYPE_UNIFORM_BUFFER` to state that this descriptor will be associated directly to a buffer. We are also stating that this will be used in a vertex shader by using the `VK_SHADER_STAGE_VERTEX_BIT` flag.
+The code is quite similar to the one used for textures, but in this case we are using a different descriptor type, `VK_DESCRIPTOR_TYPE_UNIFORM_BUFFER`, to state that this descriptor will be associated directly to a buffer. The class also receives a stage parameter which states in which pipeline stage the descriptor sets will be used. In our case, as we will see later on, we will use it in a vertex shader, so we will pass the `VK_SHADER_STAGE_VERTEX_BIT` flag.

 We also need to create the descriptor set associated to the uniform that will hold the projection matrix, in a new class named `MatrixDescriptorSet`:

@@ -1125,10 +1125,10 @@ public class Pipeline {
 ...
 public record PipeLineCreationInfo(long vkRenderPass, ShaderProgram shaderProgram, int numColorAttachments,
                                    boolean hasDepthAttachment, int pushConstantsSize,
-                                   VertexBufferStructure vertexBufferStructure,
+                                   VertexInputStateInfo viInputStateInfo,
                                    DescriptorSetLayout[] descriptorSetLayouts) {
     public void cleanup() {
-        vertexBufferStructure.cleanup();
+        viInputStateInfo.cleanup();
     }
 }
 }
@@ -1185,7 +1185,7 @@ We create the descriptor layouts for the textures and the uniform that will hold
 public class ForwardRenderActivity {
 ...
 private void createDescriptorSets() {
-    matrixDescriptorSetLayout = new MatrixDescriptorSetLayout(device, 0);
+    matrixDescriptorSetLayout = new MatrixDescriptorSetLayout(device, 0, VK_SHADER_STAGE_VERTEX_BIT);
     textureDescriptorSetLayout = new TextureDescriptorSetLayout(device, 0);
     descriptorSetLayouts = new DescriptorSetLayout[]{
             matrixDescriptorSetLayout,
@@ -1208,16 +1208,21 @@ public class ForwardRenderActivity {

 First we create the descriptor set layouts. Once we have those, we can create the descriptor pool. In this case we will create just one descriptor for a single texture and a single descriptor for the projection matrix. We also create a texture sampler. Warning note: if the uniform could be updated in each frame, we would need as many descriptors as swap chain images we have. If not, we could be updating the descriptor set contents while they are still being used to render another frame. We also create a map that we will use for the textures. We will store the descriptors associated to each texture, indexed by the file used to load it.

-Going back to the `ForwardRenderActivity` constructor, the projection matrix only will be updated when resizing and when that occurs we will not be drawing anything, so it is safe to have just one. We initialize the buffer associated to the projection uniform by calling the `copyMatrixToBuffer` which is defined like this:
-
+Going back to the `ForwardRenderActivity` constructor, the projection matrix will only be updated when resizing, and when that occurs we will not be drawing anything, so it is safe to have just one. We initialize the buffer associated to the projection uniform by calling the `copyMatrixToBuffer` method from the `VulkanUtils` class:
 ```java
 public class ForwardRenderActivity {
 ...
 public ForwardRenderActivity(SwapChain swapChain, CommandPool commandPool, PipelineCache pipelineCache, Scene scene) {
     ...
-    copyMatrixToBuffer(projMatrixUniform, scene.getPerspective().getPerspectiveMatrix());
+    VulkanUtils.copyMatrixToBuffer(device, projMatrixUniform, scene.getPerspective().getPerspectiveMatrix());
 }
-
+...
+}
+```
+The `copyMatrixToBuffer` method is defined like this:
+```java
+public class VulkanUtils {
+...
 private void copyMatrixToBuffer(VulkanBuffer vulkanBuffer, Matrix4f matrix) {
     try (MemoryStack stack = MemoryStack.stackPush()) {
         PointerBuffer pointerBuffer = stack.mallocPointer(1);
@@ -1321,13 +1326,13 @@ public class ForwardRenderActivity {
 }
 ```

-The `resize` method needs also to be modified ti update the buffer that will back the projection matrix uniform:
+The `resize` method also needs to be modified to update the buffer that will back the projection matrix uniform:

 ```java
 public class ForwardRenderActivity {
 ...
 public void resize(SwapChain swapChain, Scene scene) {
-    copyMatrixToBuffer(projMatrixUniform, scene.getPerspective().getPerspectiveMatrix());
+    VulkanUtils.copyMatrixToBuffer(device, projMatrixUniform, scene.getPerspective().getPerspectiveMatrix());
     ...
 }
 ...
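The gist of the `MatrixDescriptorSetLayout` change is that the shader-stage flag becomes a constructor parameter instead of being hard-coded to the vertex stage. A dependency-free sketch of that refactor (the record and helper are simplified stand-ins, not the book's classes; the constant values match Vulkan's `VK_SHADER_STAGE_VERTEX_BIT` = 0x1, `VK_SHADER_STAGE_FRAGMENT_BIT` = 0x10 and `VK_DESCRIPTOR_TYPE_UNIFORM_BUFFER` = 6):

```java
// Sketch of the stage-flag refactor: the caller now chooses which pipeline
// stage the uniform-buffer binding is visible to, rather than it being
// fixed to the vertex stage.
public class DescriptorLayoutDemo {

    public static final int VK_SHADER_STAGE_VERTEX_BIT = 0x1;
    public static final int VK_SHADER_STAGE_FRAGMENT_BIT = 0x10;
    public static final int VK_DESCRIPTOR_TYPE_UNIFORM_BUFFER = 6;

    public record LayoutBinding(int binding, int descriptorType, int descriptorCount, int stageFlags) {
    }

    // Was: stageFlags fixed to VK_SHADER_STAGE_VERTEX_BIT; now the caller decides.
    public static LayoutBinding matrixLayoutBinding(int binding, int stage) {
        return new LayoutBinding(binding, VK_DESCRIPTOR_TYPE_UNIFORM_BUFFER, 1, stage);
    }
}
```

This is what lets later chapters reuse the same layout class for uniforms consumed in other stages, simply by passing a different flag.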
