Just how did you manage to do that with NN?
Thank you so much, TheNut! Since I couldn’t find any tutorial, I did assume it was something based on logic. Only thing, I didn’t know how to approach it. You saved me a lot of head scratching, and hopefully I still have most of my hair. Your clear and detailed explanation helped me understand it, thank you. One more question, though… How would I go about animating sprites of different sizes? And I am not talking about empty pixels of wasted space. No, this is about a series of sprites that are already packed. They are of different dimensions.
It’s like this… The first frame sprite is a character standing idly, so his whole body takes center frame in the sprite. Next two frames show him outstretching his arm towards the left, and the final frame shows his arm completely outstretched. The final frame has his body to the right of the sprite image and his arm to the left end of the image. Basically, the final frame is wider than the first.
So, if I animate it as it is, his body offsets to the right a bit. Now, throughout the sequence, his head does not move, so what I was thinking was maybe putting an anchor point on any fixed point on his head and use this anchor point while animating the sprites. That way, his body stays where it is and his arm moves forward. Is this the right way of approaching it?
But then, some sprites show him bending, so the head might not be a great anchor point. I was thinking his feet would be better, as they stay fixed on the ground. It depends on the sprite… But the point is, am I approaching this correctly? Please clarify. Thank you.
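If it helps to make it concrete, what I’m imagining is something like this (the struct and names are just made up to illustrate the idea):

```cpp
// Each frame stores its own dimensions plus a per-frame anchor point
// (e.g. a fixed spot on the head or the feet), in frame pixels.
struct AnchoredFrame
{
    int width, height;     // frame dimensions in pixels
    int anchorX, anchorY;  // fixed point within this frame
};

// Screen position of the frame's top-left corner so that the anchor
// always lands on the same world position (worldX, worldY).
void FramePosition(const AnchoredFrame &f, int worldX, int worldY,
                   int &outX, int &outY)
{
    outX = worldX - f.anchorX;
    outY = worldY - f.anchorY;
}
```

That way a wider final frame just gets a larger anchorX, and the body stays put while the arm extends.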
Man, I buy top of the market and just raytrace 'em, quick as reflections, quicker if you're doing refractions too.
Reedbeta posted an article discussing the technicalities of texture compression if you’re interested. The article is available here.
As Vilem pointed out, drivers can do both decompression and compression, although for quality purposes you should use an offline tool for that. For high quality, compression can actually take quite a bit of time. Keep in mind the output is just raw image data. You need to wrap this in your own header format to store important details about the texture (resolution, mipmaps, compression type, etc.) and deal with that when you’re uploading the data to the video card.
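A minimal container might look something like this (the struct layout and magic value are just an example, not any standard format):

```cpp
#include <cstdint>
#include <fstream>

// Example header wrapped around the raw compressed pixel data.
struct TextureHeader
{
    uint32_t magic;     // file identifier so you can sanity check on load
    uint32_t width;     // resolution of the top mip level
    uint32_t height;
    uint32_t mipCount;  // number of mipmap levels stored after the header
    uint32_t format;    // your own enum: DXT1, DXT5, ETC1, ...
    uint32_t dataSize;  // size of the payload in bytes
};

// Write the header followed by the raw texture payload.
bool WriteTexture(const char *path, const TextureHeader &h, const void *data)
{
    std::ofstream out(path, std::ios::binary);
    if (!out) return false;
    out.write(reinterpret_cast<const char *>(&h), sizeof(h));
    out.write(static_cast<const char *>(data), h.dataSize);
    return out.good();
}
```

On load you read the header first, then hand the payload straight to glCompressedTexImage2D.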
When working with texture compression, you should also poll for the supported formats on various platforms. S3 compression is well supported on desktop hardware, but Android uses Ericsson texture compression (ETC). In the old days, one even had to decompress manually if the hardware didn’t support it (ghastly!). Just something to keep in mind.
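The polling itself can be as simple as scanning the extension string. A sketch (the extension names are real, but the selection policy is just an example; in practice you'd pass in the result of glGetString(GL_EXTENSIONS)):

```cpp
#include <string>

// Pick a compression scheme based on the driver's extension string.
std::string PickCompression(const std::string &extensions)
{
    if (extensions.find("GL_EXT_texture_compression_s3tc") != std::string::npos)
        return "S3TC";
    if (extensions.find("GL_OES_compressed_ETC1_RGB8_texture") != std::string::npos)
        return "ETC1";
    return "uncompressed";  // no support: fall back to raw (or decompress offline)
}
```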
I sent them an email and got a response less than a day later with the download link. From my understanding, Okam is going to do a public release of the Godot engine next month. The scripting language is GDScript, which is similar to Python.
I’m not aware of any tutorials on this subject. I developed my solution simply on intuition. You seem to have an idea what to do. You know for instance that you need a proper timer to cycle between frames and you’re looking for how to work with sprite sheets (aka: sprite atlases), so I’ll try and give you a few pointers.
1. OpenGL textures
I assume you are already fluent with OpenGL textures? How to load them into video memory, bind them, and place them on polygons? That’s about 80% of the problem. The other 20% is how to display only portions of the texture on the screen (see points 3 and 4).
2. Sprite Sheets
You will need some sort of file format that describes your (x,y) locations and width x height dimensions of all the sprites in your sprite sheet. I use my own Texture Packer tool, which outputs a single image file + XML file describing the coordinates. There are other tools out there more dedicated to that. Just doing a Google search for “Sprite Packer” will net you a few links.
It’s important to pack relevant textures into a sprite sheet. Sprite sheets serve two purposes. The first was to reduce wasted texture space back when textures had to be a power of two. Uploading a single 100x200 image would expand to 256x256 in video memory, creating waste. Although that’s not a big concern anymore (video hardware has matured), speed is the other key reason. Binding a single texture and then rendering a dozen sprites is more efficient than binding a texture for each sprite. This is most important when you get into rendering bitmap fonts, which follow the same principles.
3. Texture space coordinates
Once you load in your sprite sheet, you want to translate the coordinates into texture space. When you deal with images, you often work in pixels. A 320x400 image, for example. In video graphics, texture coordinates are normalized. That is, they are represented between 0.0 and 1.0. So let’s say you pack all your sprites in a 1024x1024 texture, and your 320x400 image is located at the offset (x,y) = (200,100). You need to prepare what I call a “Sprite Frame” that describes these coordinates in texture space. Simply put:
// note the floating point division; integer division would truncate to zero
Frame.X = 200.0f / 1024.0f;
Frame.Y = 100.0f / 1024.0f;
Frame.Width = 320.0f / 1024.0f;
Frame.Height = 400.0f / 1024.0f;
You now have the texture space coordinates for your sprite.
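The same calculation as a little helper, if you prefer (note the float casts; plain integer division would truncate everything to zero):

```cpp
struct SpriteFrame
{
    float x, y, w, h;  // normalized texture space coordinates
};

// Convert a sprite's pixel rectangle into normalized texture space.
SpriteFrame MakeFrame(int px, int py, int pw, int ph, int sheetW, int sheetH)
{
    SpriteFrame f;
    f.x = static_cast<float>(px) / sheetW;
    f.y = static_cast<float>(py) / sheetH;
    f.w = static_cast<float>(pw) / sheetW;
    f.h = static_cast<float>(ph) / sheetH;
    return f;
}
```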
4. Rendering the frame
I assume you know how to render polygons in an orthographic view (although you could also render sprites in perspective if you wanted). Regardless of your geometry and its transformations, if you wanted to render the above sprite on the geometry, you would pass the sprite frame coordinates into your vertex shader and do something like this.
spriteUV = SpriteFrame.xy + (UV * SpriteFrame.zw);
You pack your sprite frame into a 4d vector, where (x,y) represents the normalized top-left position in the texture and (z,w) is the width and height you calculated in step 3. UV is the original texture coordinates of the geometry, which generally should be planar. If you visualize this in your head, you’ll see that the sprite will be drawn to fill the entire area of the geometry (typically a quad). In your fragment shader, it’s a direct texture assignment:
gl_FragColor = texture2D(Sample0, spriteUV);
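You can sanity check that one-line remap on the CPU; the quad's UV corners (0,0) and (1,1) should land exactly on the sprite's corners in the atlas:

```cpp
// CPU version of: spriteUV = SpriteFrame.xy + (UV * SpriteFrame.zw);
void RemapUV(float u, float v,
             float fx, float fy, float fw, float fh,
             float &outU, float &outV)
{
    outU = fx + u * fw;
    outV = fy + v * fh;
}
```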
5. Timing
Timing is a bit more involved. Don’t think of timing just for your sprite animations; think of it as a global feature that you will want to use all over your engine. I wrote my own timer class, which is based on an event and delegate design. In my render/update loop, I update the core timer (a singleton event that all instantiated timer objects listen to). Each timer object has a set interval, and when that interval has passed, it will dispatch an event and notify the delegates. In my sprite engine, each timer triggered event advances the sprite frame of the animation, which is generally defined in the sprite sheet. It looks a little something like this.
void MyRenderLoop ()
{ gCoreTimer.Update(); }  // the singleton core timer notifies all listening timer objects

void SpriteClass::InitTimer ()
{ mMyTimer.SetInterval(1.0 / 30.0); }  // set interval to 30 FPS

void SpriteClass::OnTimer ()
{ mFrame = (mFrame + 1) % mFrameCount; }  // advance to the next sprite frame
Hopefully this gets the idea across, but there are a couple of edge cases you have to take into account, such as when the frame rate of your game drops. You may want to skip frames, in which case you would need to check the interval that has passed and advance the number of frames based on that value. In my case, I would do this by setting the timer’s interval to 0 and getting updates every frame, then checking the elapsed time. Generally you don’t want to skip too many frames, otherwise your animations get chaotic, so you have to design it with an upper bound in mind.
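The frame skipping logic can be sketched like this (the cap value of 5 is arbitrary):

```cpp
// How many animation frames to advance given the elapsed time since the
// last update, clamped so a long stall doesn't make the animation chaotic.
int AdvanceFrames(double elapsed, double frameInterval, int maxSkip)
{
    int steps = static_cast<int>(elapsed / frameInterval);
    return (steps > maxSkip) ? maxSkip : steps;
}
```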
Hopefully this will get you started. Most of this stuff should feel intuitive. As long as you know the OpenGL API and how to render surfaces, textures, etc, then this should just be an application of logic.
Ok… So, if I wanted good quality textures, I should use an external tool to compress the textures and then load the compressed texture file into the program using the methods provided, right? I think I got it. Thanks, Vilem. I’ll ask if I have any further questions.
Huh, you’re mixing two things together…
Texture filtering - the main purpose of this is to reduce aliasing of a texture applied to the image. For this we use texture magnification filtering (aka when 1 pixel in the texture covers several pixels on screen) and texture minification filtering (aka when several pixels in the texture cover 1 pixel on screen).
The minification filtering is often done using MIP maps (multum in parvo) - I guess you know the principle, just put it into wiki and read how it works.
Texture compression is something else; this relates to texture storage. You can store textures as RGB8 (8 bits per channel), RGB32F (32-bit float per channel), R5G6B5 (5 bits red and blue, 6 bits green), S3TC DXT1 aka BC1 (this one is compressed using the S3 texture compression algorithm - this is a compressed texture format).
OpenGL supports hardware decompression of several formats (e.g. S3TC), and it can also compress these textures. The quality of textures compressed by the hardware is low, so it is recommended to compress using other tools (I use my own), but for example The Compressonator from AMD works perfectly.
Two days after this has been posted… I don’t feel like meditating more on this; I’ve already moved on.
Don’t compare Unreal to Unity; its reputation is light years ahead. Check released titles and compare again.
How long ago did you contact them?
You can see above that the original poster has been very prompt in responding to a couple of initial enquiries – maybe you should exercise a little more patience if it hasn’t been long?
I’ve already finished about 3 small adventures with different engines, so I know basically what I’m getting into, except this time I have to do a lot more framework because I did those games with adventure game engines. It’s a long term project. It’s the deeper story that interests me this time, so there really isn’t any way around it. I don’t really care if it’s popular or whatever. If it is, that’s great, but if it isn’t, I’ll still feel like I’ve achieved a higher personal goal. I was going to use an adventure engine, but the one I was using felt like it was getting abandoned and it was too hard to use with Blender. Now it’s definitely a long term project, as I’m still working on the framework and the introduction. This will be a large game for me, but probably not all that large compared to most games.
Is a larger story actually doable for a single person? Why not do a single level, put the aspects you want into that level, and see how enjoyable the game is before doing anything larger than a single level?
Getting started on some music for an upcoming Kickstarter release! Really excited for this one.
I hate when somebody makes a post like this and doesn’t respond to the comments or emails…
I would have to agree that Unity is for minor players, which is probably why I am using it. I think anyone who makes anything at all with it is porting to phones. A lot of great games have been made with the Unreal engine, so I would disagree there. Probably the most from any engine.
I got 220 fps on my 680GTX. Very impressed with this example level. Would love to see how Esenthel could handle a more taxing benchmark like Unigine’s Valley demo.
Almost forgot to reply…
Well, I didn’t mean to sound like a **. I honestly never actually did that (I just mute the sound and let them try to catch me), but I know a man who actually took them to court. He was receiving 3 or 4 calls like this daily for almost a month.
In his place, I would probably do the same; it would just become too annoying.
None of the games I work on use an engine. Or rather they all use libraries of code they have written themselves, rather than an engine.
As far as I am concerned, Unity and the ilk are for minor players. Unreal has a better reputation, but not much.
Porting games from one platform to another is big business, I should know. It’s my main job :>
I’m sticking with Unity mainly because it exports to so many platforms. If I change engines too much, I end up not doing anything, and I’m used to Unity’s problems. I think the Unreal free version, whatever that is, would be my next choice and maybe should have been my first. I didn’t have a computer that could run it decently, but now even my laptop runs it just fine. A game is mostly a personal challenge for me, so it doesn’t matter very much. I’m interested in doing a larger story than I’ve done before, and that doesn’t really have much to do with an engine. Speaking of larger stories, I’m currently playing Brothers: A Tale of Two Sons, and am really blown away by the mixture of story and physical puzzles. What’s amazing to me is I hadn’t even heard of it until I saw a half price sale on Steam and just bought it to take a look. It just shows the kind of quality that having these engines is causing. It uses Unreal. I would hate to be competing in the commercial market right now.
I’ve used Unity; coding in that engine is no cakewalk. Dealing with physics and graphics is not difficult with Unity, agreed, but those two aspects can be worked around. If the coding structure is difficult, you cannot work around that. I think you’ve settled on Unity as your main engine. Have you considered an engine with less popularity, but which may allow for more productivity because of a better code structure?
For sure the guy who made the call would lose his job over it, though, and that would just be mean. Stainless had his fun already, no need to get lawyers involved. :)
Haha, this made my day. I’d try contacting the phone service for information about the call - you clearly made a bet in it, which is an agreement (and in our country, an agreement over the phone is an agreement), and most phone calls are stored for, I think, 2 or 4 weeks, so you could clearly start a lawsuit against them in my country.
I actually do use geothermal. That’s the real funny bit.
The complex I live in has a geothermal plant built into it.