0
101 Jan 03, 2012 at 08:19

I’m no C++ expert I’ll give you that.

I know enough to get by for now, but even after reading up on them I’ve never really understood what typedefs actually do that’s useful, or why they’re necessary in the first place.

Maybe it’s just me failing to see how a typedef makes anything better, but I seriously see no use for it whatsoever.

#### 22 Replies

0
117 Jan 03, 2012 at 09:44

If you want to use function pointers, typedefs are the only way to go. =)

Apart from that, they’re pretty much syntactic sugar in C++.

0
101 Jan 03, 2012 at 14:31

Actually, even that is sugar, but sweeter than the other, more generic typedef sugar… ;)

0
126 Jan 03, 2012 at 18:03

They’re a way to obfuscate your code and make it more unreadable. You don’t see it in Java or C# because there was a movement toward understandable, readable code. People who use it tend to also use three-letter, meaningless variables.

0
101 Jan 04, 2012 at 07:32

So based on that logic I have no absolute need to even look into or waste time with typedefs?

0
101 Jan 04, 2012 at 13:24

You never need typedefs. They just give you a shorthand, which is nice when your type signatures get long and complex. Like Sol mentioned, it happens with function pointers, but also with C++ templates sometimes.

std::map<std::vector<std::string>, std::list<std::pair<unsigned long, float> > >


Type that 10 times fast…

0
101 Jan 04, 2012 at 13:57

@fireside

They’re a way to obfuscate your code and make it more unreadable.

I do agree it can make code abusers more abusive. It’s like giving a gun to a thief who already carries a switchblade…

0
101 Jan 04, 2012 at 23:37

There are other practical uses for typedefs in addition to shorthand notations. Here are a couple:

1)

typedef unsigned uint32;

2)

define policy classes:

struct my_policy
{
    typedef list<int> container;
};

template<class T> void foo(const T &v_)
{
    typename T::container c;
}

And you can use function pointers without typedefs:

void foo(int) {}
void (*bar)(int) = &foo;

Cheers, Jarkko

0
101 Jan 05, 2012 at 04:39

Okay, thanks then.

I now know that a typedef is 100% officially useless (to me). :)

0
125 Jan 05, 2012 at 14:47

They are very useful if you work across different platforms; if you only code for a single platform they have limited use.

int may be 16, 32, 64, or N bits in length (the standard only guarantees at least 16), so you cannot assume it’s 32 bits.

so having a little header file with

#if defined(__64__)
typedef int bigint;
#elif defined(__32__)
typedef long bigint;
#else
#error no idea how big the cpu is
#endif

and then using bigint in all your code is a real lifesaver.

0
101 Jan 05, 2012 at 22:05

@fireside

They’re a way to obfuscate your code and make it more unreadable. You don’t see it in Java or C# because there was a movement toward understandable, readable code. People who use it tend to also use three-letter, meaningless variables.

Please take your FUD elsewhere. C++ is not C# or Java, and the C++ type system is exponentially more complex than those other languages. Also, you’re just plain wrong. A feature similar to typedefs exists in C#:

using FooList = List<foo>;


Typedefs can actually help make your code more readable, especially when dealing with lots of nested templates. They’re not meant to make your code shorter, and if you do use them for that purpose you’re just misusing them.

As for whether you actually *need* them: if you work with elaborate template frameworks, you actually do. They are used to define type traits (see STL iterators and allocators, for example - you can define an allocator whose pointer-to-T is something other than simply a T*). And if you write cross-platform code, a type might differ on each platform, so typedefs are useful there as well.

But I guess you’re that wiseass that defines your function returning a pointer to another function as:

int (*MyFuncThatReturnsFuncPtr(int, float)) (const char *);

// rather than

typedef int MyFuncSignature(const char*);
MyFuncSignature* MyFuncThatReturnsFuncPtr(int, float);


Yeah, the first one is way more readable! </sarcasm>

0
126 Jan 06, 2012 at 00:34

Yeah, the first one is way more readable! </sarcasm>

To each his own, I don’t see how one is more readable than the other. C# uses delegates which supposedly have some advantages over function pointers.

• Delegates are similar to C++ function pointers, but are type safe.
• Delegates allow methods to be passed as parameters.
• Delegates can be used to define callback methods.
• Delegates can be chained together; for example, multiple methods can be called on a single event.
• Methods don’t need to match the delegate signature exactly. For more information, see Covariance and Contravariance
• C# version 2.0 introduces the concept of Anonymous Methods, which permit code blocks to be passed as parameters in place of a separately defined method.

I’ve only seen typedefs misused, so I wouldn’t actually know about legitimate uses. I see them used for sloppy shorthand like 3 letter variable names. Wasn’t trying to spread FUD. Forgive me oh great and learned one. And no sarcasm there because you could code circles around me in your sleep.

0
101 Jan 06, 2012 at 00:50

I don’t see how delegates versus function pointers are even in the slightest bit on-topic here. We were talking about typedefs, not comparing language features.

0
126 Jan 06, 2012 at 01:20

Hmm, yes. I think I lost track of the topic, there. Part of it was on topic, though. I think I was originally implying that c++ is sort of a messy language with my original post and went off on a tangent.

0
155 Jan 06, 2012 at 05:11

@JarkkoL

typedef unsigned uint32;

That’s actually one thing that annoys me. There are a lot of libraries that implement their own typedefs for each data type. You run across someone’s method and read “gl_int”, “my_library_name_int”, “word”, “dword”, etc. I wish people wouldn’t do that. I do love the shorthand though. uint, ushort, etc. should have been the standard since day one.

0
101 Jan 06, 2012 at 06:00

I’m pretty sure I’d rather torture someone to death than use a typedef.

0
101 Jan 06, 2012 at 06:57

@TheNut

That’s actually one thing that annoys me. There are a lot of libraries that implement their own typedefs for each data type. You run across someone’s method and read “gl_int”, “my_library_name_int”, “word”, “dword”, etc. I wish people wouldn’t do that. I do love the shorthand though. uint, ushort, etc. should have been the standard since day one.

Those libraries are only doing that to avoid clashing with client code. I kind of prefer these to a compilation error.
Everybody should just use <cstdint>, then we can all live in peace :)

One thing I personally use typedefs for is my resources. Instead of having to write Resource<Texture> or Resource<Model> or Resource<etc.>, I typedef them to RTexture and RModel respectively. Less typing = faster coding.

They can also be used to easily attach information to values:

// Unknown unit
float getDistance(Position p);

// Same type, but now we know the unit
typedef float meters;
meters getDistance(Position p);

0
125 Jan 06, 2012 at 11:07

Believe me when you have to port as much code as I do, you LOVE it when people use typedefs.

In my system int is not a valid data type.

I have KDint64, KDint32, KDint16, KDint8 etc. but no int.

So my alternatives are…

1) Go through every line of code changing the variables
2) Use a global find and replace
3) use typedefs.

No brainer really

0
101 Jan 06, 2012 at 14:42

@TheNut

That’s actually one thing that annoys me. There are a lot of libraries that implement their own typedefs for each data type. You run across someone’s method and read “gl_int”, “my_library_name_int”, “word”, “dword”, etc. I wish people wouldn’t do that. I do love the shorthand though. uint, ushort, etc. should have been the standard since day one.

How would you ensure the size of a fundamental type without typedef in C++ then? IIRC, C++0x has uint32_t but prior to that?

0
155 Jan 06, 2012 at 17:52

Given the way things are, you can’t. Thus it’s a necessary evil, but an evil nonetheless. The C standard should have added specific primitive type sizes since day one, like what .NET did. As Stainless wrote above, there should have been an int16, int32, int64, etc. I don’t believe a developer should be responsible for that.

0
101 Jan 06, 2012 at 23:21

@TheNut

The C standard should have added specific primitive type sizes since day one, like what .NET did.

But that’s comparing apples and oranges. .NET code only runs on a specific, well-defined machine - the .NET virtual machine. C, on the other hand, is designed to be compiled for any conceivable platform. It’s perfectly viable for a C implementation to have an int type that is, say, 19 bits.

And in that respect, it makes no sense to define your loop counter as a size-specific type, as long as it’s large enough to hold all the possible iteration values. You usually just want whichever type is most efficient, and an int32 on a 64-bit platform might not necessarily be it. Hard-defined sized ints only make sense for storage (e.g., in structs and classes), not for local function variables.

0
101 Jan 07, 2012 at 03:11

I think sometimes typedefs are necessary. I’ve been in a situation like this before

template<class edge_type, class node_type>
class Graph
{
public:
typedef edge_type EdgeType;
typedef node_type NodeType;
[…]
};

template<class graph_type, class heuristic>
class GraphSearchAlgorithm
{
};

I don’t think they are avoidable in a situation like this. If they are, correct me.

Also, I hate overusing typedefs myself, but sometimes it can be handy to use them when you have to deal with annoying dependent names like A::B::C::D::E::F::G::H::I<J::K::L<M::N::O, P::Q::R<S::T, U::V> >::W::X>::Y::Z ;D

You might think it’s good to see where the types come from, for example, without having to scroll over the identifier. But sometimes that’s really not worth giving up the speed and code clarity that typedefs provide, especially when dealing with ultra-long names. This doesn’t mean your typedefs have to be global; it can be good to declare them within a class or function too, and they will be hidden outside of those functions/classes.

They’re also way better than defines for debugging.

0
125 Jan 07, 2012 at 09:57

#defines are evil

I lost three hours on Thursday trying to figure out why the compiler was throwing an error about a line of code that does nothing.

It was in a bit of code that specified inline functions on a namespace rather than in a class. So at first I thought it was an issue with our compiler.

Eventually I forced the compiler to save the results of the preprocessing and found that I had left a semicolon dangling on a #define

After hitting my head against the wall a few times I fixed the problem in 2 seconds flat.

Knowing the real size of variables was very important to me. With the new compiler it’s not so much of an issue, but with the last compiler I spent most of my time sorting out bugs like this…

typedef struct Buggy
{
    char a;
    int b;
    char c;
    union
    {
        char *pa;
        int *pb;
    } u;
} buggy_t;

If char is a KDuint8 then I will get an alignment error every time I access pb.

Typedefs are really useful when you move away from mainstream coding and start working on devices other than PCs.