Issue rendering freetype font in OpenGL

Snoob 105 Jan 26, 2012 at 21:39

Hi all,

I’m a beginner with OpenGL. After reading some tutorials I tried to get FreeType running for
text rendering. To improve my understanding I’ve implemented a simple test that should
only render a single char on a quad, following NeHe tutorial 43.

After FreeType initialization I create a data buffer with two bytes per pixel from the FreeType
bitmap using this:

//Load the Glyph for our character.
if(FT_Load_Glyph(m_face, FT_Get_Char_Index( m_face, 'K' ), FT_LOAD_DEFAULT ))
  throw std::runtime_error("FT_Load_Glyph failed");

//Move the face's glyph into a Glyph object.
FT_Glyph glyph;
if(FT_Get_Glyph( m_face->glyph, &glyph ))
  throw std::runtime_error("FT_Get_Glyph failed");

//Convert the glyph to a bitmap.
FT_Glyph_To_Bitmap( &glyph, ft_render_mode_normal, 0, 1 );
bitmap_glyph = (FT_BitmapGlyph)glyph;

//This reference will make accessing the bitmap easier
bitmap=bitmap_glyph->bitmap;

//Create power of 2 texture
width = nextP2( bitmap.width );
height = nextP2( bitmap.rows );

//Allocate memory for the texture data.
expanded_data = new GLubyte[ 2 * width * height];

//Here we fill in the data for the expanded bitmap.
//Notice that we are using a two channel bitmap (one for
//luminance and one for alpha), but we assign
//both luminance and alpha to the value that we
//find in the FreeType bitmap.
//We use the ?: operator so that the value we use
//will be 0 if we are in the padding zone, and whatever
//is in the FreeType bitmap otherwise.
for(int j=0; j < height; j++) {
    for(int i=0; i < width; i++){
        expanded_data[2*(i+j*width)] = expanded_data[2*(i+j*width)+1] =
            (i >= bitmap_glyph->bitmap.width || j >= bitmap_glyph->bitmap.rows) ?
                0 :
                bitmap_glyph->bitmap.buffer[i + bitmap_glyph->bitmap.width*j];
    }
}
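For reference, NeHe’s `nextP2` helper is used above but never shown; here is a minimal sketch of how it is typically written (the name comes from the tutorial, the body here is an assumption):

```cpp
// Smallest power of two that is >= a. Glyph bitmaps are padded up to
// these dimensions because older OpenGL versions require
// power-of-two texture sizes.
static int nextP2(int a)
{
    int rval = 1;       // powers of two start at 1
    while (rval < a)
        rval <<= 1;     // double until we reach or pass a
    return rval;
}
```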

And then I create a texture based on this buffer:

// Generate an alpha texture with it.
GLuint texture;
glGenTextures(1, &texture);
glBindTexture(GL_TEXTURE_2D, texture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
// Use 2 channel values.
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE_ALPHA, width, height, 0, GL_LUMINANCE_ALPHA, GL_UNSIGNED_BYTE, expanded_data);

Finally I try to render the texture on a single quad:

glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, texture);
glBegin(GL_QUADS);
    glTexCoord2f(0.0f, 0.0f); glVertex3f(-1.0f, -1.0f,  1.0f);
    glTexCoord2f(1.0f, 0.0f); glVertex3f( 1.0f, -1.0f,  1.0f);
    glTexCoord2f(1.0f, 1.0f); glVertex3f( 1.0f,  1.0f,  1.0f);
    glTexCoord2f(0.0f, 1.0f); glVertex3f(-1.0f,  1.0f,  1.0f);
glEnd();

But I only get a black texture with some pixel distortion:


I searched the net and found many examples using this NeHe snippet. So I think it should be working …
Can anyone help?

Thanks,
Snoop

15 Replies


Stainless 151 Jan 27, 2012 at 09:39

I haven’t used GL_LUMINANCE_ALPHA for years, but as far as I recall it expects a luminance/alpha pair per pixel.

In expanded_data you copy the glyph value into the first byte of each pair, leaving the second byte empty.

Which means if you had a glyph like this…

00X00
00X00
00X00

In expanded data you would have

{0,0},{X,0},{0,0},{0,0},{0,0},{0,0},{0,0},{0,0}

{0,0},{X,0},{0,0},{0,0},{0,0},{0,0},{0,0},{0,0}

{0,0},{X,0},{0,0},{0,0},{0,0},{0,0},{0,0},{0,0}

So you would have one pixel with a luminance value, though since this has a 0 alpha value, you wouldn’t see it anyway.

Snoob 105 Jan 27, 2012 at 11:58

Thanks Stainless, you are right, I only fill half the buffer, although the explanation of the NeHe demo
says luminance and alpha are filled with the same value from the FreeType bitmap. But what I do not
understand is why it works for them?

There are many examples using this snippet which say it’s working correctly …
i.e.

http://www.g0dsoft.n…e/font_d3d9.cpp

http://stackoverflow…2-iphone-device

http://stackoverflow…s-not-rendering

Have I overlooked something or not understood it correctly?

Stainless 151 Jan 27, 2012 at 12:30

No, it was me not reading the code correctly, it is filling both bytes. I missed this bit of the code:

expanded_data[2*(i+j*width)] = expanded_data[2*(i+j*width)+1] =

I would check that your matrices are correct, could be you are drawing the right data but in a screwed-up view.

Snoob 105 Jan 27, 2012 at 12:48

Ahh ok, I see. My matrices and other code are correct. If I map any other texture loaded from a file, that texture
is mapped and rendered correctly. Only the generated textures from FreeType are not mapped as expected.

Stainless 151 Jan 28, 2012 at 09:46

Then print the texture into a file so you can look at it.

If the texture supplied by FreeType is correct, print the expanded texture to a file and look at it.

If both are correct, then you have to single-step the code and look at every line as it is executed.

Or you can send me the code and I’ll have a go for you.

Snoob 105 Jan 28, 2012 at 19:52

Hmm, good idea, but I actually do not know how to write this information to a file.
I’ve uploaded my actual Visual Studio Express 2010 project here: http://www.mediafire…jblz277a802j8tc

Meanwhile I will google how to easily write my image data to a file …
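One simple option, since dumping the buffer came up: the binary PGM (P5) format is trivial to write by hand and opens in most image viewers. A sketch assuming an 8-bit single-channel buffer (for the two-channel LUMINANCE_ALPHA data you would copy every second byte into a temporary buffer first; the function name is made up here):

```cpp
#include <fstream>

// Dump an 8bpp grayscale buffer to a binary PGM (P5) file.
// Header is "P5", then width height, then max value 255, then raw bytes.
static bool writePGM(const char* path, const unsigned char* data,
                     int width, int height)
{
    std::ofstream out(path, std::ios::binary);
    if (!out)
        return false;
    out << "P5\n" << width << " " << height << "\n255\n";
    out.write(reinterpret_cast<const char*>(data),
              static_cast<std::streamsize>(width) * height);
    return static_cast<bool>(out);
}
```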

Snoob 105 Jan 29, 2012 at 10:48

I’ve added a SOIL save-image sequence to create a debug texture. Hope I’ve done everything correctly.
The debug image also looks corrupted. But I do not understand why; I checked the source again against the
NeHe tutorial and could not find a mistake … This drives me nuts …

Here is the updated project source: http://www.mediafire…wr5j1jmjz4ilomh

Next I’ll try to find out how I can save the original FreeType data to another debug image …

Kenneth_Gorking 101 Jan 29, 2012 at 14:28

//Create power of 2 texture
width = nextP2( bitmap.width );
height = nextP2( bitmap.rows );

This might be the problem. You are using these to index the bitmap, but they might be larger than the original sizes, which would cause you to read the wrong data.

Snoob 105 Jan 29, 2012 at 15:29

I don’t think the creation of pow2 dimensions is the problem, since the NeHe tutorial does the same. I’ve seen other examples (using a texture atlas)
where they calculate the target texture dimensions from 32x32 up to 1024x1024 … Those dimensions are only for the target data buffer;
reading is performed using the original dimensions (bitmap.width and bitmap.rows), and padding zones are filled with zero, if I understand it correctly.

TheNut 179 Jan 29, 2012 at 16:18

I’m not sure what your problem is as I use a different method to load the fonts into a bitmap. To summarize,

FT_Library library;
const char file[] = "My_Font.ttf";
unsigned int fontHeight = 14; // max 14 pixel high font
unsigned int start = 32, end = 126; // printable ASCII range (not defined in the original post)


if ( FT_Init_FreeType(&library) == 0 )
{
     FT_Face face;
     if ( FT_New_Face(library, file, 0, &face) == 0 )
     {
          if ( FT_Set_Pixel_Sizes(face, 0, fontHeight) == 0 )
          {
               for (unsigned int i = start; i <= end; ++i)
               {
                    if ( FT_Load_Char(face, i, FT_LOAD_RENDER) == 0 )
                    {
                         // face->glyph contains all the info (bitmap + metrics) you need for the glyph at char index i.
                         // face->glyph->bitmap.buffer contains your bitmap data (grayscale? 8bpp?). Load it into a texture or whatever.
                    }
               }
          }
     }
}

If NEHE’s code is proving problematic, I suggest reading over the FreeType documentation. It covers the library quite well and it’s where I learned to extract bitmap data.

Stainless 151 Jan 29, 2012 at 17:51

Can’t load your project, it complains that “Lesson06.vcxproj” could not be found.

Thinking about it though, you may need a …

glActiveTexture(GL_TEXTURE0);
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);

… before you bind the texture. I don’t know what state OpenGL is in when you try to generate your texture.

Snoob 105 Jan 30, 2012 at 10:38

I’ve added the missing files to the second project (you can just copy them over to get it runnable).

I will check your OpenGL calls, but I’m actually having trouble getting those extension calls working in Visual Studio without using GLEW … I’ve updated the OpenGL headers to 1.4 but
now I get a link error (unresolved external symbol __imp__glActiveTexture). Hmm, I think I also have to update my opengl32.lib …

Snoob 105 Jan 30, 2012 at 11:55

I’ve added the extensions, but this has no effect, same failure. I think there must be an error in the glyph bitmap value
conversion, since the debug image that is created before texture creation is also not correct (if I’ve done everything
correctly with the debug file creation). But I can’t find any difference from the original tutorial …

I’ve read the FreeType library docs as suggested by TheNut, but it’s really hard stuff …
I’ve added the updated project files here: http://www.mediafire…x37sh8619i94m94

Snoob 105 Jan 30, 2012 at 21:02

Ok, fixed this issue. I had a typo in the FreeType font character size definition:
I used width / 64 instead of width * 64. After fixing this the character is rendered.
The character is rendered upside down, but it is rendered :)
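For anyone hitting the same bug: FT_Set_Char_Size takes its sizes in FreeType’s 26.6 fixed-point format, where one unit is 1/64th of a point, so a whole point/pixel size must be multiplied by 64. A tiny sketch of the conversion (plain arithmetic, the helper name is made up here):

```cpp
// Convert a whole point/pixel size to FreeType 26.6 fixed point
// (1 unit = 1/64), as expected by FT_Set_Char_Size, e.g.
//   FT_Set_Char_Size(face, 0, toF26Dot6(16), 96, 96);
static long toF26Dot6(long size)
{
    return size * 64; // multiply, NOT divide - dividing was the bug here
}
```

The upside-down rendering is expected, by the way: FreeType stores bitmap rows top-down while OpenGL texture coordinates start at the bottom, so either flip the t texture coordinates on the quad or copy the rows in reverse order.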

Stainless 151 Jan 31, 2012 at 13:02

At least the mystery is solved