16.) Rendering To A Texture

Welcome to the 16th OpenGL 3.3 tutorial. This time we're going to learn how to render to a texture. What's this good for? Imagine a situation where you have security cameras somewhere in the scene, and in another part of the scene there's a terminal where you want to see the camera image. How would you do that? Exactly! You must look at the scene from the camera's point of view, render the scene somewhere, and then copy the final image to a texture. Then you can apply that texture to the terminal's camera screen. This is probably the most common use, and it's called rendering to a texture.

Significant change in code now

This tutorial also has some significant changes compared to all previous ones - my coding style opinions have changed recently and I renamed all functions to begin with a capital letter. You know, it just makes sense to differentiate between functions and variables, and a good way to do that is to have variable names starting with a lower-case letter and function names starting with a capital letter. Another small change is that I unified the deleting / releasing function names. Now they all begin with Delete, just like in OpenGL. It's because some classes had releasing functions beginning with Delete and some with Release, and it was inconsistent.

In this tutorial, we'll create SpongeBob watching The Avengers. Well, not the actual Avengers movie, but a rotating and moving Thor model on the TV screen. So let's get familiar with some new terms.

Framebuffers and renderbuffers

If you haven't been introduced to framebuffers and renderbuffers yet, this tutorial should help you. Framebuffers and renderbuffers are other types of OpenGL objects (so they're created the traditional way, with functions starting with glGen) that allow us to do off-screen rendering. That means you render, but not onto the screen - onto a virtual screen instead, a.k.a. a framebuffer. After that, you can read the framebuffer contents, most commonly the final 2D image, and create a texture from it, which you can apply anywhere. So what's a renderbuffer then? The thing is that a framebuffer consists of multiple renderbuffers, and you already know some of them. The default framebuffer (with name 0), which is used for normal on-screen rendering, has a color buffer (storing RGBA), a depth buffer (storing pixel depths), an optional stencil buffer and maybe some other buffers. All these sub-buffers are called renderbuffers. So that's how it is - nothing difficult.

Working with framebuffer

Now we're getting to the part where you'll see how to use these objects. We can summarize using FBOs for the purpose of this tutorial in these 6 steps:

  • 1.) Create the FBO
  • 2.) Create a texture for the FBO to copy the 2D image into
  • 3.) Add a depth buffer (a renderbuffer) to it
  • 4.) Bind the FBO
  • 5.) Render the scene off-screen, copying it to the texture
  • 6.) Unbind the FBO, use the created texture anywhere!
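Put together, a per-frame usage of these steps with the CFramebuffer class from this tutorial might look like this sketch (the scene-drawing functions RenderTVScene and RenderMainScene are hypothetical placeholders, not part of the tutorial's code):

```cpp
CFramebuffer fbo;

// Steps 1-3: done once at initialization
fbo.CreateFramebufferWithTexture(512, 256); // FBO + empty target texture
fbo.AddDepthBuffer();                       // attach a depth renderbuffer

// Steps 4-5: every frame
fbo.BindFramebuffer();  // all rendering now goes off-screen
RenderTVScene();        // hypothetical - draw the scene meant for the texture

// Step 6: back to on-screen rendering, then use the texture
glBindFramebuffer(GL_FRAMEBUFFER, 0);
fbo.BindFrameBufferTexture(0, true); // bind as texture, regenerate mipmaps
RenderMainScene();                   // hypothetical - draw the scene with the TV screen
```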

These are all important steps we need to do. We'll go through them one by one. But first, the framebuffer class to handle everything nicely in one place:

class CFramebuffer
{
public:
	bool CreateFramebufferWithTexture(int a_iWidth, int a_iHeight);

	bool AddDepthBuffer();
	void BindFramebuffer(bool bSetFullViewport = true);
	
	void SetFramebufferTextureFiltering(int a_tfMagnification, int a_tfMinification);
	void BindFrameBufferTexture(int iTextureUnit = 0, bool bRegenMipMaps = false);

	glm::mat4 CalculateProjectionMatrix(float fFOV, float fNear, float fFar);
	glm::mat4 CalculateOrthoMatrix();

	void DeleteFramebuffer();

	int GetWidth();
	int GetHeight();

	CFramebuffer();
private:
	int iWidth, iHeight;
	UINT uiFramebuffer;
	UINT uiDepthRenderbuffer;
	CTexture tFramebufferTex;
};

Let's go through the main functions. CreateFramebufferWithTexture does the first two steps. It just calls the glGenFramebuffers function, and creates an initially empty texture for the FBO (an empty texture is created using the normal glTexImage2D function, but with a NULL data pointer). An important thing to notice are the parameters width and height. Before using an FBO, you must specify its dimensions. They don't have to be powers of 2 - any numbers will work as well. I chose the framebuffer in this tutorial to have 512x256 dimensions. The important OpenGL call in this function is:

glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, tFramebufferTex.GetTextureID(), 0);

This attaches the texture to the framebuffer. The first parameter must be GL_FRAMEBUFFER, the second tells which part of the framebuffer we want this texture to store. GL_COLOR_ATTACHMENT0 is the framebuffer's color buffer. An FBO can have multiple color attachments, but GL_COLOR_ATTACHMENT0 is the default, and the rendered image is stored there. If you want to have 2 color attachments, for example one for a normal RGB image and one for, let's say, a grayscale image, you can do this, but in the fragment shader, where you specify the output colors, you would have to write them like this:

out vec4 outputColor;                      // Normal output, GL_COLOR_ATTACHMENT0, no need for layout keyword
layout(location = 1) out float fGrayscale; // One float per pixel for grayscale value, GL_COLOR_ATTACHMENT1

Notice that our texture is only RGB, but we actually output a vec4. It doesn't matter - OpenGL seems to be intelligent enough to copy only the RGB values. It worked on both nVidia and AMD cards, so I won't go into the deepest details of how compatible these outputs and textures must be, as long as it works.
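Putting the above together, the body of CreateFramebufferWithTexture might look roughly like this sketch (the tutorial's actual code manages the texture through its CTexture class; a raw GLuint is used here for brevity):

```cpp
bool CFramebuffer::CreateFramebufferWithTexture(int a_iWidth, int a_iHeight)
{
	iWidth = a_iWidth;
	iHeight = a_iHeight;

	// Step 1: create and bind the FBO
	glGenFramebuffers(1, &uiFramebuffer);
	glBindFramebuffer(GL_FRAMEBUFFER, uiFramebuffer);

	// Step 2: create an empty texture (NULL data pointer) matching the FBO's size
	GLuint uiTexture; // sketch only - the real class stores a CTexture instead
	glGenTextures(1, &uiTexture);
	glBindTexture(GL_TEXTURE_2D, uiTexture);
	glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, iWidth, iHeight, 0, GL_RGB, GL_UNSIGNED_BYTE, NULL);

	// Attach the texture as color attachment 0
	glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, uiTexture, 0);

	return glCheckFramebufferStatus(GL_FRAMEBUFFER) == GL_FRAMEBUFFER_COMPLETE;
}
```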

Let's get to step 3 - adding a depth buffer. It's all in the function AddDepthBuffer. A depth buffer isn't in a newly created FBO by default, so we must add it there. Therefore, we create one renderbuffer using the function glGenRenderbuffers. After that, we bind it (as with all OpenGL objects we're about to work with) and initialize its size and type with:

glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT24, iWidth, iHeight);

The first parameter must be GL_RENDERBUFFER, the second is the renderbuffer type - for the depth buffer I used GL_DEPTH_COMPONENT24, which is a depth buffer with 24-bit precision (3 bytes per pixel) - and the last two important parameters are the renderbuffer's width and height. These two must match the dimensions of the FBO we want to attach the renderbuffer to. The final step is the actual attachment of the renderbuffer to the FBO using the function:

glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, uiDepthRenderbuffer);

The first parameter must be GL_FRAMEBUFFER, the second specifies that we're attaching a depth buffer, the third must be GL_RENDERBUFFER and the last is the ID of the previously generated renderbuffer.
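The whole of AddDepthBuffer might then look like this sketch (binding the FBO first is my assumption, in case a different FBO is currently bound):

```cpp
bool CFramebuffer::AddDepthBuffer()
{
	// Make sure our FBO is the one we attach to (assumption - may already be bound)
	glBindFramebuffer(GL_FRAMEBUFFER, uiFramebuffer);

	// Step 3: create a depth renderbuffer matching the FBO's dimensions
	glGenRenderbuffers(1, &uiDepthRenderbuffer);
	glBindRenderbuffer(GL_RENDERBUFFER, uiDepthRenderbuffer);
	glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT24, iWidth, iHeight);

	// ...and attach it to the FBO's depth attachment point
	glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, uiDepthRenderbuffer);

	return glCheckFramebufferStatus(GL_FRAMEBUFFER) == GL_FRAMEBUFFER_COMPLETE;
}
```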

Now the FBO is ready to be used. Let's have a look into the RenderScene function. Before we render the real on-screen scene, we're going to render The Avengers scene into our FBO. That's why we bind our FBO with a call to glBindFramebuffer, which has two parameters - the first is always GL_FRAMEBUFFER, and the second is the FBO ID. This is wrapped in the BindFramebuffer function of our class, and it's step 4.
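The BindFramebuffer wrapper can be as simple as this sketch; setting the viewport to the FBO's size when binding is what I'd expect the bSetFullViewport parameter to be for, since the viewport must match the off-screen render target's dimensions:

```cpp
void CFramebuffer::BindFramebuffer(bool bSetFullViewport)
{
	// Step 4: all subsequent draw calls now go into the FBO
	glBindFramebuffer(GL_FRAMEBUFFER, uiFramebuffer);

	if (bSetFullViewport)
		glViewport(0, 0, iWidth, iHeight); // viewport must match the FBO's dimensions
}
```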

Now we're good to proceed with rendering our Avengers scene. We render the normal way, just like rendering on-screen, but the results are stored in the FBO and its associated texture. This is step 5, and after all the rendering is done, our texture is ready - the image is written directly into the texture.

The last step is unbinding the FBO and returning to normal on-screen rendering. This is done using the glBindFramebuffer function with FBO ID 0. Our texture is now ready to be mapped anywhere. Notice, however, that if you want to use filtering with mipmaps, you must recalculate them every frame. That's why the BindFrameBufferTexture function of our class takes 2 parameters - the first is the texture unit, and the second is whether the mipmaps should be recalculated. I selected mipmap filtering, even trilinear, so the mipmaps must definitely be recalculated.
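The mipmap regeneration can be done with glGenerateMipmap, roughly like this sketch (the BindTexture method on CTexture is an assumption based on the texture class from the earlier tutorials):

```cpp
void CFramebuffer::BindFrameBufferTexture(int iTextureUnit, bool bRegenMipMaps)
{
	// Bind the FBO's texture to the requested texture unit
	tFramebufferTex.BindTexture(iTextureUnit); // assumed CTexture method

	if (bRegenMipMaps)
		glGenerateMipmap(GL_TEXTURE_2D); // texture contents changed this frame, rebuild mipmaps
}
```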

Result

What we just programmed looks like this:

and it's not bad. I hope you enjoyed this tutorial, and if you have never rendered to a texture before, I hope this tutorial makes it clear to you. If you don't understand something, feel free to ask in the comments or mail me. Have a nice day.

Download 3.32 MB (5530 downloads)