01.) Creating OpenGL 3.3 Window

Welcome to the OpenGL 3.3+ tutorial series. In this series, you will learn how to use OpenGL the new way. This way is a little more difficult than the old one, because now OpenGL relies on you to do many things yourself. But don't get scared - these tutorials will explain things slowly, and step by step you will build a good basis for thinking the new way.

In old OpenGL (before version 2.0), most functionality was FIXED into OpenGL, making it easy for programmers to do simple tasks (like working with matrices, transforming vertices and so on), but it didn't offer much room for very specific effects. OpenGL 2.0 brought shaders, allowing the programmer to replace parts of the fixed functionality and rewrite them the way he wanted. It was a very good thing. But until OpenGL 3.0, you could still rely on the fixed functionality even in shaders. For example, until GLSL 1.40 (the OpenGL Shading Language, the most important part of new OpenGL, which will be covered in these tutorials), you could use the function ftransform(), which performs the fixed-function transformation, so you could transform vertices using OpenGL's built-in modelview and projection matrices and everything was OK. But in OpenGL 3.0 this fixed functionality was deprecated, and in OpenGL 3.2 and later it was removed from the core functionality (so when using an OpenGL 3.2 core rendering context or later, these functions are simply not available).

So how does it work in new OpenGL? Well, you can no longer use the ol' good functions like glTranslatef(), glRotatef(), glScalef(), or glMatrixMode(GL_PROJECTION) followed by setting the perspective with gluPerspective() and similar functions. Now you have to calculate the matrices yourself, upload them to the vertex shader, and transform vertices with them there. But don't worry - there are libraries on the internet that handle matrix math, and we will work with one of them later. So it won't be that difficult in the end.

The next thing that has changed significantly is the actual rendering of things. There are no glBegin() and glEnd() functions anymore; everything is done through vertex buffer objects (VBOs) and vertex array objects (VAOs). While in old OpenGL, rendering a triangle was as intuitive as possible,

glBegin(GL_TRIANGLES);
	glVertex2d(-5, 0); // Pass first vertex
	glVertex2d( 5, 0); // Pass second vertex
	glVertex2d( 0, 5); // Pass third vertex
glEnd();

the code to render a triangle in OpenGL 3.3 can look like this:

// Some init scene function

UINT uiVAOid, uiVBOid;

void initScene()
{
	float fVert[9];
	fVert[0] = -5; fVert[1] = 0; fVert[2] = 0;
	fVert[3] = 5;  fVert[4] = 0; fVert[5] = 0;
	fVert[6] = 0;  fVert[7] = 5; fVert[8] = 0;
 
	// Generate VAO
	glGenVertexArrays(1, &uiVAOid);
	// Setup of VAO
	glBindVertexArray(uiVAOid);
 
	glGenBuffers(1, &uiVBOid);
 
	glBindBuffer(GL_ARRAY_BUFFER, uiVBOid);
	glBufferData(GL_ARRAY_BUFFER, 9*sizeof(GLfloat), fVert, GL_STATIC_DRAW);
	glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, 0); 
	glEnableVertexAttribArray(0);

	//...
}

// Some render scene function

void RenderScene()
{
	//...

	glBindVertexArray(uiVAOid);
	glDrawArrays(GL_TRIANGLES, 0, 3);

	//...
}

As you can see, it is longer and not as intuitive. But it brings BLAZING FAST rendering. If you know something about assembly (you don't even have to), you will notice that each call of glVertex3f takes 3 floats as parameters. These floats must be passed as function parameters (through registers or the stack) before they are sent to the GPU. For one triangle, that's 3 function calls (for a single triangle it really isn't a problem, but a scene with one triangle probably isn't what we want :) ). For an object with 10,000 triangles, it's 30,000 calls - every frame. This is the so-called CPU bottleneck, where rendering is slowed down by the processor passing all this data to the GPU. In new OpenGL, we first set up our objects (store their data in GPU memory), then each frame we call just a few functions to say which data to use, and finally call (for example) glDrawArrays to render. Now the CPU sends only a little data to the GPU, and rendering is way, way faster :) Of course, in older OpenGL versions you could use, for example, vertex arrays to speed up rendering, but those still copy data from RAM (client memory) to the GPU (server memory) each and every frame, which is, no matter how we look at it, not good. Since OpenGL 1.5 you could already use VBOs (storing data on the GPU), and that was not bad at all. OK, so let's begin.

Setting up glew library

The first thing we will need is glew (the OpenGL Extension Wrangler Library). You can download it from here: http://glew.sourceforge.net. After downloading and extracting it, we need to be able to include it in our project. Since I'm using Visual Studio, the best option is to extract glew into some libraries directory and then add the include and library paths to Visual Studio. In Visual Studio 2008, you can do this under Tools -> Options -> Projects and Solutions -> VC++ Directories, as you can see in the following picture:

In Show directories for, you must choose Include files and add glew_installation_folder/include (of course, put the real path there, for example C:\Libraries\glew-1.7.0\include). Then you must also add the library path, so select Library files and add glew_installation_folder/lib there. Then we can have in our code:

#include <GL/glew.h>

and it will be OK. The worse option is to copy glew.h into your project directory, so don't do it. The good thing about include paths is that when a new version of glew (or any other library you use) comes out, you just download it, change the include path to point to the new version, and you have the new features and functions. This header supersedes the gl.h header on Windows, which hasn't been updated since version 1.1 (along with opengl32.lib). I know Microsoft wants Windows developers to use DirectX, but they really could offer an alternative and add OpenGL support right into Visual Studio. They probably never will. It's sad, but there's nothing I can do about it. Luckily, glew does all the work for us - a single call retrieves function pointers to all procedures, and we can use OpenGL 3.3 features without problems.

The OpenGL control class

My aim is to create a class that controls OpenGL context creation, releasing, and practically everything that deals with OpenGL. So let's start with the class declaration:

class COpenGLControl
{
public:
	bool InitOpenGL(HINSTANCE hInstance, HWND* a_hWnd, int iMajorVersion, int iMinorVersion, void (*a_InitScene)(LPVOID), void (*a_RenderScene)(LPVOID), void(*a_ReleaseScene)(LPVOID), LPVOID lpParam);
	
	void ResizeOpenGLViewportFull();

	void Render(LPVOID lpParam);
	void ReleaseOpenGLControl(LPVOID lpParam);

	static void RegisterSimpleOpenGLClass(HINSTANCE hInstance);
	static void UnregisterSimpleOpenGLClass(HINSTANCE hInstance);

	void MakeCurrent();
	void SwapBuffersM();

private:
	bool InitGLEW(HINSTANCE hInstance);

	HDC hDC;
	HWND* hWnd;
	HGLRC hRC;
	static bool bClassRegistered;
	static bool bGlewInitialized;
	int iMajorVersion, iMinorVersion;

	void (*InitScene)(LPVOID lpParam), (*RenderScene)(LPVOID lpParam), (*ReleaseScene)(LPVOID lpParam);
};

Even though it may seem a little complicated at first glance, it's not that bad. Let's look at the functions:

InitOpenGL - this is the most important function. It creates an OpenGL rendering context within a given window. The parameters are: the application instance (if you don't know what that is, it doesn't matter, it's not that important), a pointer to the window, the major and minor OpenGL version, pointers to functions - an init function, a rendering function and an optional release function - and a general parameter passed to them. The idea is to create one instance of the COpenGLControl class somewhere, tell it which functions in your project are the init, rendering and release functions, and then you are ready to go. A single call of this function gives us an OpenGL context of the version we want.

ResizeOpenGLViewportFull() - sets the OpenGL viewport to cover the whole window

Render() - renders the scene by calling the rendering callback. The lpParam parameter is of type LPVOID - a general pointer that can point to anything you want; here it will typically point to our OpenGL control instance. The thing about function callbacks is that the code isn't that intuitive at first glance, but they are a very good tool, even though they may be hard to understand for those who haven't seen them before. You may have a look at Wikipedia's article about callbacks: http://en.wikipedia.org/wiki/Callback_(computer_programming)

ReleaseOpenGLControl() - cleanup function - releases scene data (if the release callback was set) and deletes the rendering context. lpParam follows the same concept as written previously.

RegisterSimpleOpenGLClass - registers a window class that supports OpenGL; this class is used for the fake window (you'll see later)

UnregisterSimpleOpenGLClass - unregisters the previously registered window class

MakeCurrent() - sets the current rendering context to the one we created (it calls the traditional wglMakeCurrent function)

SwapBuffersM() - swaps the front and back buffers - it simply calls the traditional SwapBuffers function, and that's why it has the extra M in the name: without it the compiler complains about the name clash, even though I think it really shouldn't, since we're calling a class member function (try it, I have)

InitGLEW - initializes GLEW library

That does it - now we should have an idea of what each function does. We will take a closer look only at the InitGLEW and InitOpenGL functions; there isn't much to explain in the others, they are pretty straightforward.

InitGLEW function
bool COpenGLControl::InitGLEW(HINSTANCE hInstance)
{
	if(bGlewInitialized)return true;

	RegisterSimpleOpenGLClass(hInstance);

	HWND hWndFake = CreateWindow(SIMPLE_OPENGL_CLASS_NAME, "FAKE", WS_OVERLAPPEDWINDOW | WS_MAXIMIZE | WS_CLIPCHILDREN,
		0, 0, CW_USEDEFAULT, CW_USEDEFAULT, NULL,
		NULL, hInstance, NULL);

	hDC = GetDC(hWndFake);

	// First, choose false pixel format
	
	PIXELFORMATDESCRIPTOR pfd;
	memset(&pfd, 0, sizeof(PIXELFORMATDESCRIPTOR));
	pfd.nSize		= sizeof(PIXELFORMATDESCRIPTOR);
	pfd.nVersion   = 1;
	pfd.dwFlags    = PFD_DOUBLEBUFFER | PFD_SUPPORT_OPENGL | PFD_DRAW_TO_WINDOW;
	pfd.iPixelType = PFD_TYPE_RGBA;
	pfd.cColorBits = 32;
	pfd.cDepthBits = 32;
	pfd.iLayerType = PFD_MAIN_PLANE;
 
	int iPixelFormat = ChoosePixelFormat(hDC, &pfd);
	if (iPixelFormat == 0)return false;

	if(!SetPixelFormat(hDC, iPixelFormat, &pfd))return false;

	// Create the false, old style context (OpenGL 2.1 and before)

	HGLRC hRCFake = wglCreateContext(hDC);
	wglMakeCurrent(hDC, hRCFake);

	bool bResult = true;

	if(!bGlewInitialized)
	{
		if(glewInit() != GLEW_OK)
		{
			MessageBox(*hWnd, "Couldn't initialize GLEW!", "Fatal Error", MB_ICONERROR);
			bResult = false;
		}
		bGlewInitialized = true;
	}

	wglMakeCurrent(NULL, NULL);
	wglDeleteContext(hRCFake);
	DestroyWindow(hWndFake);

	return bResult;
}

So what are we doing here? As you may have guessed from the variable names, we create a fake window. Then we set up a rendering context the old way, using wglCreateContext. This gives us access to OpenGL functions. And here comes the reason for all this - now we can initialize the GLEW library using glewInit. What GLEW does is retrieve function pointers to all OpenGL functions and extensions (if they are supported by the graphics card) - it calls wglGetProcAddress for every OpenGL function. But without an OpenGL context, we couldn't get those function pointers, and that's why we create a fake window, fetch the OpenGL function pointers, and then destroy the fake window. I know it isn't very nice, but after searching the OpenGL Wiki and some forums on the internet, I didn't find a better way of doing this on Windows.

InitOpenGL function
bool COpenGLControl::InitOpenGL(HINSTANCE hInstance, HWND* a_hWnd, int iMajorVersion, int iMinorVersion,
										  void (*a_InitScene)(LPVOID), void (*a_RenderScene)(LPVOID), void(*a_ReleaseScene)(LPVOID),
										  LPVOID lpParam)
{
	if(!InitGLEW(hInstance))return false;

	hWnd = a_hWnd;
	hDC = GetDC(*hWnd);

	bool bError = false;
	PIXELFORMATDESCRIPTOR pfd;

	if(iMajorVersion <= 2)
	{
		memset(&pfd, 0, sizeof(PIXELFORMATDESCRIPTOR));
		pfd.nSize		= sizeof(PIXELFORMATDESCRIPTOR);
		pfd.nVersion   = 1;
		pfd.dwFlags    = PFD_DOUBLEBUFFER | PFD_SUPPORT_OPENGL | PFD_DRAW_TO_WINDOW;
		pfd.iPixelType = PFD_TYPE_RGBA;
		pfd.cColorBits = 32;
		pfd.cDepthBits = 32;
		pfd.iLayerType = PFD_MAIN_PLANE;
 
		int iPixelFormat = ChoosePixelFormat(hDC, &pfd);
		if (iPixelFormat == 0)return false;

		if(!SetPixelFormat(hDC, iPixelFormat, &pfd))return false;

		// Create the old style context (OpenGL 2.1 and before)
		hRC = wglCreateContext(hDC);
		if(hRC)wglMakeCurrent(hDC, hRC);
		else bError = true;
	}
	else if(WGLEW_ARB_create_context && WGLEW_ARB_pixel_format)
	{
		const int iPixelFormatAttribList[] =
		{
			WGL_DRAW_TO_WINDOW_ARB, GL_TRUE,
			WGL_SUPPORT_OPENGL_ARB, GL_TRUE,
			WGL_DOUBLE_BUFFER_ARB, GL_TRUE,
			WGL_PIXEL_TYPE_ARB, WGL_TYPE_RGBA_ARB,
			WGL_COLOR_BITS_ARB, 32,
			WGL_DEPTH_BITS_ARB, 24,
			WGL_STENCIL_BITS_ARB, 8,
			0 // End of attributes list
		};
		int iContextAttribs[] =
		{
			WGL_CONTEXT_MAJOR_VERSION_ARB, iMajorVersion,
			WGL_CONTEXT_MINOR_VERSION_ARB, iMinorVersion,
			WGL_CONTEXT_FLAGS_ARB, WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB,
			0 // End of attributes list
		};

		int iPixelFormat, iNumFormats;
		wglChoosePixelFormatARB(hDC, iPixelFormatAttribList, NULL, 1, &iPixelFormat, (UINT*)&iNumFormats);

		// PFD seems to be only redundant parameter now
		if(!SetPixelFormat(hDC, iPixelFormat, &pfd))return false;

		hRC = wglCreateContextAttribsARB(hDC, 0, iContextAttribs);
		// If everything went OK
		if(hRC) wglMakeCurrent(hDC, hRC);
		else bError = true;

	}
	else bError = true;
	
	if(bError)
	{
		// Generate error messages
		char sErrorMessage[255], sErrorTitle[255];
		sprintf(sErrorMessage, "OpenGL %d.%d is not supported! Please download latest GPU drivers!", iMajorVersion, iMinorVersion);
		sprintf(sErrorTitle, "OpenGL %d.%d Not Supported", iMajorVersion, iMinorVersion);
		MessageBox(*hWnd, sErrorMessage, sErrorTitle, MB_ICONINFORMATION);
		return false;
	}

	RenderScene = a_RenderScene;
	InitScene = a_InitScene;
	ReleaseScene = a_ReleaseScene;

	if(InitScene != NULL)InitScene(lpParam);

	return true;
}

At the beginning of the function, we initialize GLEW. Once we have info about the OpenGL capabilities of our graphics card, we can proceed with creating the context. If the desired context is version 2.1 or lower, we just create OpenGL the old way. But for later versions (3.0 and beyond), we use a new pair of functions - wglChoosePixelFormatARB and wglCreateContextAttribsARB. The line:

if(WGLEW_ARB_create_context && WGLEW_ARB_pixel_format)

is used to check whether we have access to these functions (whether they are supported by our graphics card and driver). If this check succeeds, we can use the new wgl functions - wglChoosePixelFormatARB and wglCreateContextAttribsARB. These functions allow us to specify the attributes of the pixel format and the context. You just pass a pointer, in this case to an array of integers with the format ATTRIBUTE, VALUE, ATTRIBUTE, VALUE..., terminated with a zero. You can specify as many attributes as you want, and you always end the list with zero. It is more flexible than the old PIXELFORMATDESCRIPTOR structure, which is fixed. However, if you look at the SetPixelFormat call, you can see that I pass an uninitialized PIXELFORMATDESCRIPTOR structure, even though I didn't use it to find a suitable pixel format. That's because SetPixelFormat requires a PIXELFORMATDESCRIPTOR as its last parameter. I couldn't find definitive information anywhere on the right way of setting up this OpenGL 3.3 context on Windows, but since we must pass something, we pass a dummy PIXELFORMATDESCRIPTOR, and everything works :) If some info appears on the internet (on MSDN or anywhere), I will edit this article. But for now, I'm happy it works. So in conclusion - to find the right pixel format, we use wglChoosePixelFormatARB, and to set it, we call SetPixelFormat with anything as the third parameter (but not NULL). At the end of the function, we set the function pointers for the init, render and release callbacks, and finally call the init function to initialize our scene. And that does it - we are done with the initialization of OpenGL.

Because this is only the first tutorial, it ends here. We won't render our first triangle yet - that comes in the second tutorial. The first tutorial would simply get too long, since rendering primitives has become a little more difficult and requires an explanation of the terms and functions used. But to test that our OpenGL context works, we will clear the background to a nice light blue :) and it will look just like this:

So that's it for today! You can have a look at the whole code, but really don't worry if you don't understand all of it. I'm a native Win32 programmer - that means I don't use MFC or any wrapper, so I handle window messages and everything else myself. This approach has drawbacks, of course - you must write more code (code that's already written in libraries like MFC) - but also many advantages: you have COMPLETE control over your application flow, there is no need for additional libraries, and the final EXE isn't as big as it would be with, for example, MFC. And these tutorials aim to teach OpenGL, not Win32 programming - just know that everything is set up fine :) Thank you for reading this far! Add comments and/or questions below.

Download the source code (119 KB)