Reorganizing code in C++

TheNut 179 Mar 07, 2010 at 15:05

Having developed in C# for a while, I’ve grown accustomed to the way things are organized there. I was thinking about revamping my C++ code into a similar structure. What I’m thinking of in particular is this:

Classical Approach

#include "System/String.h"
#include "System/Thread.h"
#include "Math/Vector2D.h"
#include "Math/Noise/Perlin.h"
...
String str;
str.Insert(0, "Hello");

Desired Approach

// System.h includes all classes in the System namespace.
#include "System.h"
// Math.h includes all classes in the Math namespace.
#include "Math.h"
// Math/Noise.h includes all classes in the Math::Noise namespace.
#include "Math/Noise.h"
...
System::String str;
Math::Noise::Perlin perlin;
...

What I like most about this is that when I call up the namespace, I will get a list of all the classes that belong to it, so I can quickly scan around and see what I could use. It also cuts down on the amount of including being done and now I can focus more on what classes I want to use and less on knowing all the include paths and files I need.

The only disadvantage I can see is that a change to one header requires a lengthy recompile of everything attached. Internally in the library I can always use classical per-class includes to avoid that when debugging, but once built, this becomes an option for application developers to take advantage of.

What are your thoughts about this?

13 Replies


fireside 141 Mar 07, 2010 at 15:40

I guess it’s a good idea. I’m not the best with C++, though. I don’t really like the lengthy lists of includes that normally happen. I think another solution I’ve seen for that is to have a header that is just a bunch of includes.

Wernaeh 101 Mar 07, 2010 at 21:01

Currently, I’m using a similar per-module setup in the C++ coding standard at work.

Originally, I had one header / source file per class. However, this had several disadvantages.

* For instance, this one-by-one setup means many translation units, which is costly, in particular if you use lots of Boost or other template magic internally.

* Many (small) translation units also mean a worse “system headers” to “own code” ratio - in effect, the same system headers get parsed many times over.

* This also means many include directives at the top of source files, most of which reference things that are contained inside a single “module” anyway.

* Additionally, older compilers (don’t laugh - sometimes we are still down to VC++ 6) do measurably better optimization with the latter setup.

* Finally, I think it helps with documentation of intent: The smallest element of a system typically is not a class (when do you actually use a class standalone…) but a module, which contains a series of interlinked classes. For example, consider a file system: it has classes FileSystem, File, and FileNotFoundException. None of these is typically used on its own; they depend on each other, so why place them in different translation units or different headers?

I think the roots of the “one-class-one-file” paradigm are to be found somewhere in Java programming, where there simply may not be more than one openly available class per file in a package. This is in a way the opposite of C++’s roots: in C, a translation unit typically was a functionality module (i.e. an entire Java package, i.e. a series of classes…)

These days, I pack together everything that forms an entire module (i.e. a file system, a rendering context, a physics context, …). Hardware-specific elements (… OglContext, W32Window, …) end up in modules of their own. If the associated source files become too crowded, they can be split into different translation units, but this needn’t necessarily affect the associated headers. If the headers become too crowded, that is a typical sign of an overloaded module => the module should be split into subfunctionality.

The module-based header architecture solves all of the problems named above.

Finally, it also gives a clear indication where to place documentation for an entire submodule: Often, documenting single classes within single header files ends up with documentation for the entire submodule in one of the major classes. In a shared module setup, the module documentation can simply be placed at the top of the module header.

The namespace-per-header issue is something I can’t comment on in as much detail - we generally use but a few namespaces per project, since this saves typing, and a single project needn’t protect its “main” namespace from itself ;)

Hope this helps :)

Cheers,
- Wernaeh

JarkkoL 102 Mar 07, 2010 at 22:16

You can provide it as an option, but I wouldn’t recommend forcing it. I.e. just have system_all.h, math_all.h, etc. which include everything in system/math/etc. but programmers should still be able to include math/vector2d.h to cut down compilation times.
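As a sketch (the _all.h naming follows the post; the contents of math/ are illustrative), such an optional umbrella would just be:

```cpp
// math_all.h -- optional convenience umbrella; nothing forces its use
#pragma once
#include "math/vector2d.h"    // still perfectly fine to include on its own
#include "math/matrix.h"      // hypothetical
#include "math/quaternion.h"  // hypothetical
```

Fine-grained includes keep compile times down inside the library itself; the umbrella is purely a convenience for client code.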

poita 101 Mar 08, 2010 at 17:11

I thought everyone did this? O_O

kvakvs 101 Mar 09, 2010 at 10:32

Won’t you end up with tens of classes and hundreds of functions per file?
That’s just troublesome when you have to navigate through 2-5k lines of code (be thankful it’s not 20k), or scroll through a huge file like that. When working on 2-3 different classes at a time, I use Ctrl+Tab to switch between files; that won’t really be possible with a single module. How would you solve this - bookmarks? Splitting the screen into 2-3 views? How comfortable would you be working like that?

I’m definitely in favor of nicely split one-class-per-file code, and to group files up there are folders in the solution explorer. Build time while using Boost? Modern PCs are cheaper than the cost of your time. Ask your boss to buy you a better PC, and organize your code to be easier to maintain, not faster to build.

When you compare C# to C++, it’s impossible to get the same level of IDE support in C++ as you can in C#. C++ does not have clearly defined language semantics; to know the meaning of a symbol, you have to parse a big chunk of the project’s files from start to end, resolving all the preprocessor macros, templates, inlines and whatever else inventive programmers added to the project code. That is a very time-expensive operation.

TheNut 179 Mar 09, 2010 at 11:41

@poita

I thought everyone did this? O_O

Most people still develop in C actually. C++ is a novelty language :)

@JarkkoL

but programmers should still be able to include math/vector2d.h to cut down compilation times.

Which is why I decided to stick with the classical approach for building the library and have a post-build process create the higher-level header files. At that point compile times won’t matter, since the core framework has already been built.

kvakvs, I’m not hoarding code into a single file. Everything is still split the way it should be; there’s just one more level of abstraction to assist with application development. Getting into the habit of not including classes individually will save time down the road, since I won’t have to remember the class path and file name or look up what classes are available in a particular module (VS IntelliSense isn’t that primitive). Currently I have to do all this hunting in documentation or manually look in Windows Explorer.

JarkkoL 102 Mar 09, 2010 at 11:54

I think one class per file just bloats your code base with files and makes things harder to navigate. IMO a good solution is somewhere between one class per file and everything in one file (:

Reedbeta 167 Mar 09, 2010 at 17:43

@JarkkoL

I think one class per file just bloats your code base with files and makes things harder to navigate. IMO a good solution is somewhere between one class per file and everything in one file (:

Yes, Wernaeh brought up the idea of modules, which are often a constellation of related classes/structs/functions/whatever, and that seems to me the appropriate granularity for header/source files.

kvakvs 101 Apr 16, 2010 at 00:09

Regarding faster compilation times, and combining the two ways of organizing your code (one class per module vs. many classes per module):

Discovered this for myself today: http://en.wikipedia.org/wiki/Single_Compilation_Unit

I tried it at work on our current game in development, and also on the side project I’m doing at home. For each group of sources I created a CPP file that includes each of the CPP modules, and then clicked “Exclude from build” on them. So the sources are still one module per class, but they compile together as one big module. Precompiled headers also work fine with this.

To say the least, I’m pleasantly surprised by how it worked out.
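Under the hood the trick is simple; a sketch with hypothetical file names:

```cpp
// physics_scu.cpp -- the only physics source the compiler sees; the
// included .cpp files are marked "Exclude from build" in the IDE and
// compile only through this single translation unit.
#include "rigid_body.cpp"
#include "collision.cpp"
#include "broadphase.cpp"
```

The per-class .cpp files stay on disk for editing and navigation; only the compiler’s view of them changes.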

JarkkoL 102 Apr 16, 2010 at 05:57

Yes, that’s an old trick, usually referred to as a “unity build”.

_oisyn 101 Apr 16, 2010 at 09:51

@kvakvs

That’s just troublesome when you have to navigate through 2-5k lines of code (be thankful it’s not 20k), or scroll through a huge file like that. When working on 2-3 different classes at a time, I use Ctrl+Tab to switch between files; that won’t really be possible with a single module. How would you solve this - bookmarks? Splitting the screen into 2-3 views? How comfortable would you be working like that?

Bookmarking helps, or something like Alt+M in Visual Assist, which brings up a list of the functions in the current file. Then simply type a substring of the function you want to navigate to.

Nick 102 Apr 16, 2010 at 12:08

@kvakvs

To say the least, I’m pleasantly surprised by how it worked out.

Did the executable get any faster and/or smaller?

Personally compile time is less relevant to my medium-size projects, but performance is everything. Visual Studio supports link-time optimization, but I’ve always wondered how well that really works… :ninja:

kvakvs 101 Apr 16, 2010 at 12:38

@Nick

Did the executable get any faster and/or smaller? Personally compile time is less relevant to my medium-size projects, but performance is everything. Visual Studio supports link-time optimization, but I’ve always wondered how well that really works… :ninja:

I compared them before and after, and no, they do not differ in size, because Visual Studio ships an excellent link-time code optimizer anyway. I did not compare runtime performance with a profiler, but as far as I could tell it did not change.
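For reference, the link-time code generation mentioned above is switched on with compiler/linker flags; a sketch (file names illustrative):

```shell
# MSVC: whole-program optimization at compile time, LTCG at link time
cl /O2 /GL /c main.cpp
link /LTCG main.obj

# GCC equivalent: link-time optimization
g++ -O2 -flto main.cpp -o app
```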

But the build speed improved, and that alone was worth the change, as our team hits the Build/Run button quite often. I also plan to build both of my projects on Linux with GCC, which behaves somewhat differently from Visual Studio, so there may be some benefit there too.

When you do need to improve performance, you won’t get it by simply merging all the files; you should run a profiler and do the measurements yourself.