0
178 Nov 19, 2012 at 14:25

A while back Stainless made a post about creating galaxies. It sort of re-energized my interest in the subject. It didn’t help that Google also recently released a new WebGL demo showing off 100,000 stars from our local neighbourhood. Cool demo, but they got the galaxy formation wrong.. *oops*. I also discovered a cool new space project intuitively called Space Engine. A competitive alternative to Infinity, which has been on the back burner for quite some time. Nevertheless, there’s an interesting trend towards vast, space-based engines. All cool stuff both visually and technologically.

For myself, I dusted off my old OpenCL demo and started playing with galactic formations again. Trying to construct various galaxies like spirals and ellipticals with just 32K particles.

It’s crude and fairly slow, so I’m looking into implementing the Barnes-Hut algorithm to help speed things up, improve the resolution to 10 to 100 million particles and, time permitting, add real star formation using the properties of gases. My main goal is to produce a procedural galaxy from which you can export high-quality renders to a texture or a skybox and then use them in space-based projects. I haven’t seen any tools out there that do this sort of thing. I’m also rather surprised that Space Engine and Infinity don’t already provide tools (either free or commercial) to do this sort of rendering. Terragen modelled their business around terrains; I could easily see a market for space renders.

#### 20 Replies

0
146 Nov 19, 2012 at 14:45

I haven’t given up on it. I’ve just done some research and discovered I can get away with OpenGL ES 2.0.

So I can have shaders :wub:

At the moment I’m playing about with generating the base bitmap which defines the shape of the galaxy.

One of the things I’m playing with is splitting the RGBA components into meaningful values.

R - Hue at long range
G - Star density
B - Hydrogen density
A - Dust density

Let’s throw some ideas (and code) about and see what we can come up with
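As a first idea, a texel with those channel meanings could be packed and unpacked like this (just a sketch, the names are made up):

```c
#include <stdint.h>

/* One texel of the galaxy-definition bitmap, using the channel
 * meanings proposed above (all values 0..255). */
typedef struct {
    uint8_t hue;      /* R: hue at long range */
    uint8_t stars;    /* G: star density      */
    uint8_t hydrogen; /* B: hydrogen density  */
    uint8_t dust;     /* A: dust density      */
} GalaxyTexel;

/* Pack the four components into one RGBA8 word (R in the low byte). */
static uint32_t pack_texel(GalaxyTexel t)
{
    return (uint32_t)t.hue
         | (uint32_t)t.stars    << 8
         | (uint32_t)t.hydrogen << 16
         | (uint32_t)t.dust     << 24;
}

/* Unpack again, e.g. when reading the bitmap back on the CPU side. */
static GalaxyTexel unpack_texel(uint32_t rgba)
{
    GalaxyTexel t;
    t.hue      = (uint8_t)(rgba        & 0xff);
    t.stars    = (uint8_t)(rgba >> 8   & 0xff);
    t.hydrogen = (uint8_t)(rgba >> 16  & 0xff);
    t.dust     = (uint8_t)(rgba >> 24  & 0xff);
    return t;
}
```

On the shader side the same four values just arrive as the .rgba components of a texture fetch.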

0
178 Nov 20, 2012 at 00:40

That’s a nice plus. I cringe when I have to work with 1.X. It’s like I’m devolving my skills.

While my goal is to produce something intentionally slow and accurate, I have thought about real-time solutions for free flying. Dust is perhaps one of the more important issues I’ve been thinking about lately. I think once you’ve nailed the dust problem, you’re already on your way to great-looking galaxies. I’ve been analysing high-res Hubble photos, and there are some experiments I’d like to do. Note that solving the galactic dust problem should solve the nebula cloud problem too.

$F_D$ = dust density

1. Create dust clouds using pattern-shaped, high-frequency 3D noise and marching cubes to construct the geometry. Alternatively, use meta shapes. I believe this will create good-looking 3D clouds when combined with backlighting and volumetric lighting. For the first stage, I’d like to just construct the geometry and play around with it in Blender to see what kind of neat things I can do. The dust density would simply be the volume of the geometry multiplied by some scalar amount to control the translucency. This could be approximated by comparing the depth values of the back-facing polygons with the front-facing polygons and the light source.

$F_D = (Depth_{Back} - Depth_{Front}) * Scalar$ if light is behind cloud
$F_D = (Depth_{Light} - Depth_{Front}) * Scalar$ if light is inside cloud
$F_D = 0$ if light is in front of cloud
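In code, the three cases above collapse to something like this (a sketch; depths are assumed to be linear distances from the camera, so front ≤ back):

```c
/* Approximate dust thickness along a view ray from the depths of the
 * cloud's back and front faces and the light's depth, following the
 * three cases above.  'scalar' controls the translucency. */
static double dust_density(double depth_back, double depth_front,
                           double depth_light, double scalar)
{
    if (depth_light > depth_back)      /* light behind the cloud  */
        return (depth_back - depth_front) * scalar;
    if (depth_light > depth_front)     /* light inside the cloud  */
        return (depth_light - depth_front) * scalar;
    return 0.0;                        /* light in front of cloud */
}
```

In practice the back/front depths would come from two depth passes (front faces and back faces) of the cloud geometry.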

2. Similar concept, but instead of rendering 3D geometry I would render a single 2D noise map for the entire galaxy that would act as the middle slice along the plane of the galaxy. This texture, combined with a patterned filter such as a spiral or whirl pattern, would be used as my dust density function. On its own, it could probably be used to render the galaxy as a 2D surface floating in space. The texture would look something like this.

There’s a tutorial on it here. Moving in closer however, I’m not so sure how long this smoke and mirrors show will last. I’ve thought about rendering 2D slices from a 3D noise function, but I despise such techniques.

0
104 Nov 20, 2012 at 06:29

I love starfield backdrops, and these look like quality stars. When you’re rendering your armoured-up space marines, this would look real slick behind them.

0
140 Nov 20, 2012 at 08:44

No one’s done a decent space shooter in ages. An Xbox controller would be great for one, and particle effects have improved a lot.

0
146 Nov 20, 2012 at 09:30

I have been working on FOFT 2 for a while; it’s a background task, so progress is slow.

My objective is to do the traditional space epic, but using a combat system that works on devices like TVs and mobile phones.

The bit that has stopped me so far is the galaxy display. Once I get a problem in my head I go a bit Deep Thought; my circuits are committed.

I am wondering if we can come up with a way of generating an encoded 2D bitmap that defines the galaxy.

My plan at the moment is to use a CLOD system, at long range draw a billboard for the block, as you get closer multiple billboards, at close range a set of billboards and a set of sprites for the stars.

A similar system is often used for grass.

But I really don’t like the idea; the problem is lighting the dust. I cannot see a good way of doing it in real time.

So I am thinking about using gpu raytracing for the dust, and then adding the stars with point sprites. However then we have the issue of mapping the dust.

It’s a complex problem.

0
146 Nov 20, 2012 at 11:37

Rendering the dust density as a particle system of point sprites

Runs fast on a laptop, but too slow for other devices.

0
178 Nov 21, 2012 at 02:24

Hmmm, that galaxy looks familiar :D

How are you setting up your point sprites? I see some interlaced patterns, which makes me wonder how much overlap there is between them. Fillrate is a killer on mobiles. One thing I would probably do is leave that galaxy as a single billboard, and when the camera gets close enough to a sector, start fading out the billboard and rendering in the details for that sector. The Google universe demo I posted above does something like that, but admittedly their transition effect is a bit harsh looking.

On the topic of star colours, I spent some time reading up on colour spaces and black bodies. Thus far, I’ve been colouring my stars by randomly selecting from a palette of known star colours, with a small variance to add noise. My biggest problem with this technique is that if you want to have fun with a star, for example convert a medium yellow star into a red giant, you have to fake the process. Using the CIE colour space and Planck’s law, you can conveniently specify the temperature for the star and have its RGB value computed. It’s not all that expensive a process either, and you could cache the results in a 1D texture if you wanted to. Some results below show how energetic gas heats up and changes colour.

On the left, you have an idle universe that generates little to no energy, and thus the gas emits black or infrared-to-dark-red light. In the middle, things start to pick up, with the outer gas packing more energy and heat as it races to the centre of gravity. Finally, the stars in the centre have tons of heat and energy, giving off brighter light at shorter wavelengths. It’s an interesting effect and quite nice compared to randomly painting the universe. Although in the grand scheme of things it’s still quite incomplete, since there are more things to factor in, such as nuclear forces.

@Fireside

No one’s done a decent space shooter in ages.

Interesting you said that because I just found out today that Chris Roberts (former Wing Comm and Privateer creator) has crowd funded his new space combat game. I don’t know how well it will turn out for him, but Privateer and Wing Comm were fun games at the time, so I trust he’ll do well and add to the genre.

0
140 Nov 21, 2012 at 04:49

I don’t know how well it will turn out for him, but Privateer and Wing Comm were fun games at the time, so I trust he’ll do well and add to the genre.

Yeah, I heard about that a while back. The video looks really exciting and Privateer was one of the best in the genre as far as I’m concerned. It looks a little ambitious, but I hope he pulls it off.

0
104 Nov 21, 2012 at 08:02

Those red ones are nice, just talking aesthetics… when I’m dotting stars and blurring nebulas in behind my characters, pretty is what counts the most. :)

0
118 Nov 21, 2012 at 08:16

Heh, I’ll just throw in my approach..

Formation is just some simple sin/cos trickery with noise, rendered through point sprites. (And yeah, galaxql is an SQL tutorial.)

0
146 Nov 21, 2012 at 09:19

The artifacts are really weird; they only appear at certain view angles. But the technique is too slow, so I’m not chasing it.

On colour, there is a table somewhere that maps star temperature against colour. Things like this http://docs.kde.org/…lorandtemp.html; just substitute temperature for energy.

The problem is that at the moment you seem to be ignoring dust, and dust is the real visual in a galaxy.

Has anyone seen a good gpu raymarching demo? All the ones I have seen use render to texture, which I don’t have.

I think that is the approach I need to use, along with a CLOD.

Braben is trying to crowd-source a new Elite as well. I may try to invest; he doesn’t like me very much. :)

0
146 Nov 21, 2012 at 12:51

Assuming the input to the display routines is a bitmap, ignoring what each byte of the bitmap actually represents.

1) scan the bitmap and identify regions with the same dust density. Group these into a 3D box (CELL)
2) have an atlas of sprites with different lighting, use another byte in the image to define a lighting state, parse these to choose a sprite for the cell
3) draw each cell as an axis aligned quad

This should reduce the number of draw calls at least, and the structure will be a good fit for a CLOD solution
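Step 1 might look something like this (a sketch that only merges runs along a single scanline; a real version would also merge runs vertically, and across slices, into 3D boxes):

```c
#include <stdint.h>

/* A cell covering a horizontal run of texels that all share one dust
 * density value (step 1 above, simplified to single scanlines). */
typedef struct { int x, y, width; uint8_t density; } Cell;

/* Scan one row of the bitmap's dust channel and emit one cell per
 * run of equal density.  Returns the number of cells written. */
static int scan_row(const uint8_t *dust, int width, int y,
                    Cell *out, int max_cells)
{
    int count = 0, start = 0;
    for (int x = 1; x <= width; x++) {
        if (x == width || dust[x] != dust[start]) {
            if (count == max_cells) return count;
            out[count].x = start;
            out[count].y = y;
            out[count].width = x - start;
            out[count].density = dust[start];
            count++;
            start = x;
        }
    }
    return count;
}
```

Each emitted cell then picks its sprite from the atlas by lighting state (step 2) and is drawn as one axis-aligned quad (step 3).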

0
140 Nov 21, 2012 at 13:01

I don’t think the exact rendering would be that important, since you would obviously have to do jumps to get where you need to go. I think you would bore someone to death flying between galaxies or even stars. That’s the trouble with realism. Kind of like RPGs where all you do is walk over vast terrains.

Off topic, but I found a space game that uses an Xbox controller: Star Wars: Starfighters, made in 2002. One thumbstick is used to pan and one is used to rotate the screen. It takes practice, but I like it. The other defaults don’t work for me, though. They didn’t even use the triggers for firing.

0
178 Nov 21, 2012 at 18:10

Hey Sol, did you get your database from Hipparcos or some such? I’ve often thought about digging through some star databases to get some accurate data to work with. Not sure if I would use it directly, but nonetheless interesting stuff.

Stain, I’m not ignoring dust intentionally. It’s just my solution can’t render millions (billions?) of particles (gas) in an acceptable real-time way just yet. My goal is to get something like this. With enough particles (gas), the dust will form on its own. Once I’m at this stage, I will either export the data as a volume mesh, or a single billboard for distant rendering, or as a skybox when I don’t plan to leave a solar system, but want a nice backdrop. It’s purely a pre-stage process. Real time stuff obviously requires a completely different process.

The problem I see with an atlas approach is that you’re limited to a finite resolution and it relies on fixed data. Well, it could be procedural if your input textures were created mathematically, but then that is where the bulk of the work would be done and where the interest lies. To achieve infinite detail, I believe the process must be mathematically rendered, and it should look the same both on the outside and the inside. Lighting dust is an interesting problem that needs to be solved, but I think the first step is to get the dust looking proper from all angles. Once that part is nailed, the lighting issue could be solved using light-scattering techniques, which are used today for real-time clouds.

fireside, for the most part, yes. But just a black skybox with white dots does not look very good :) X3 had some pretty backdrops, but that likely required hours, days, or maybe even weeks of effort to make and tweak. I wouldn’t be surprised if some of them were even hand painted (digitally of course). Having most or even some of that content procedurally generated would be a nice bonus. And that’s only from a backdrop perspective. Imagine if jumping from one sector to another wasn’t just a load screen. Imagine if you started to go FTL (with all the fancy effects) and you dynamically passed through nebulas, star systems, and such. It would raise the bar, guaranteed. Every sci-fi nut would be all over it.

0
146 Nov 22, 2012 at 09:22

I think we will have to come up with two solutions.

One, the all-singing-and-dancing version for consoles and laptops.

The other, a cut-down version for other devices.

Nebulas are a whole new can of worms; to get them we will need to add star death. A few years ago I contacted a professor in Sweden who was working on simulating nebula creation. I have his source code somewhere…. let me look

0
118 Nov 23, 2012 at 05:56

@TheNut

Hey Sol, did you get your database from Hipparcos or some such? I’ve often thought about digging through some star databases to get some accurate data to work with. Not sure if I would use it directly, but nonetheless interesting stuff.

Nah, it’s just noise and sin and cos trickery. I did look into some real star databases after that but they were too much of a hassle (for the time being, at least).

0
146 Nov 23, 2012 at 09:36

I’ve found the nebula code I was on about.

**********************************************************
*  Rectangular FCT hydro routine in two dimensions,      *
*  according to 1991 Astron. Astrophys. 251, 369         *
**********************************************************
*
**********************************************************
*  This code was written for educational purposes        *
*  the GNU Public License.                               *


It’s pure C and very poorly written; he’s an astrophysicist, not a coder.

 *
* The program takes about 180 bytes per hydro cell.
* A simulation on a 500x500 grid then requires 45 Mb heap memory.
* The limits have been set to 1024x1024, which means
* 210 Mb of heap memory. This is still practical on an
* 1.25 GHz PowerBook.
*
* -------------------- Running Time Benchmark ---------------
*
* On the 1 Ghz PowerBook G4, the code takes 9.7 microseconds
* per gridpoint and per timestep in full Strang time
* splitting mode.
* Thus, 1000 timesteps on a 500x500 grid take 40 minutes.
* In ordinary flux splitting mode, it takes
* 7.7 microseconds, for 32 minutes total.
*


I can make it available if anyone wants it.

0
118 Jan 30, 2013 at 08:34

@TheNut

Hey Sol, did you get your database from Hipparcos or some such? I’ve often thought about digging through some star databases to get some accurate data to work with. Not sure if I would use it directly, but nonetheless interesting stuff.

Nah, like I mention above, it’s just random points with some sin/cos trickery. Feel free to look at the sources if you’re interested =)

I did consider using the “real” data, but never found a suitable source, and in retrospect it “might” have made the galaxql package “tad bit” bigger. (And I’d still have to generate random planets and moons, so..)

EDIT: oops, I replied to this already. Too bad there doesn’t seem to be a ‘delete post’ option that I can see..

0
146 Apr 23, 2013 at 00:01

I’ve come back to this at last and I’m trying a new approach.

I’ve generated a texture atlas of 64 images, each 128 by 128, which I want to volume render.

It’s not very good at this stage as the only database of stars I could find was limited to 88,000 stars.

I have parsed the stars into a 128 by 128 by 32 array and counted the number of stars in each bin. I have used this value as alpha and grabbed the colour from the standard galaxy rendering to end up with an 8 by 8 array of textures.
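The binning step is straightforward; here is a sketch of it (hypothetical layout, positions assumed pre-scaled to [0, 1) per axis, per-bin counts clamped to 255 for the alpha channel):

```c
#include <stdint.h>

/* Count stars into a 128 x 128 x 32 grid and write the per-bin count
 * (clamped to 255) as the alpha channel, as described above. */
#define NX 128
#define NY 128
#define NZ 32

static void bin_stars(const float *pos, int n_stars, uint8_t *alpha)
{
    static uint32_t count[NX * NY * NZ]; /* one counter per bin */
    for (int i = 0; i < NX * NY * NZ; i++) count[i] = 0;

    for (int i = 0; i < n_stars; i++) {
        int x = (int)(pos[3 * i]     * NX);
        int y = (int)(pos[3 * i + 1] * NY);
        int z = (int)(pos[3 * i + 2] * NZ);
        if (x < 0 || x >= NX || y < 0 || y >= NY || z < 0 || z >= NZ)
            continue;                    /* drop out-of-range stars */
        count[(z * NY + y) * NX + x]++;
    }
    for (int i = 0; i < NX * NY * NZ; i++)
        alpha[i] = count[i] > 255 ? 255 : (uint8_t)count[i];
}
```

The 32 z-slices then map one-to-one onto the 64-image atlas (two 128×128 tiles per slice pair, or however the atlas is laid out) before handing it to the volume renderer.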

Does anybody have a volume renderer that can handle this type of texture atlas, before I spend hours writing one and find out the data is rubbish?