
I was messing around a bit with SDL2 and either I was doing something wrong or it was just plain slow. My machine is plenty fast, but even just blitting a few dozen PNGs around a screen 60 times a second was pushing its limits. I freely admit I may have been doing something wrong, but I was surprised at just how inefficient it was at a task that we used to do without too much trouble on 1 MHz CPUs.

Maybe SDL_RenderCopy is the wrong API to use to blit things from a sprite sheet onto a display? The docs didn't give any warning if this is the case.



How recent a version were you using? Plenty of games and graphical apps use SDL2 under the hood, and rendering rects from a spritesheet is trivial. Recent versions use the geometry API for rendering rects, so it should be able to handle tons of sprites without much effort.


I'm using SDL2 2.30.0. The main loop is pretty simple: it does a few SDL_RenderFillRects to create areas, then several SDL_RenderCopy calls whose source is an SDL_Texture created at startup (via SDL_CreateTextureFromSurface from SDL_Surfaces loaded from files). A final call to SDL_RenderPresent finishes it off. The textures do include an alpha channel, however.

I was expecting the sprite blitting to be trivial, but it is surprisingly slow. The sprites are quite small, only a few hundred pixels total. I have a theory that it is copying the pixels over the X11 channel each time instead of loading the sprite sheets onto the server once and copying regions using XCopyArea to tell the server to do its own blitting.


This should be plenty fast. SDL_RenderCopy generally should be doing things the 'right' way for any video card made in roughly the last 15 years (basically binding a texture in GPU RAM to a quad).

You probably need to do some debugging/profiling to find where your problem is. Make sure you aren't creating SDL_Textures (or loading SDL_Surfaces) inside your main game loop. You also may want to check what backend the SDL_Renderer is utilizing (e.g. OpenGL, Direct3D, Vulkan, Metal, software). If you are on software, that is likely your problem. Try forcing it to something hardware accelerated.
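Something like this will tell you which backend you actually got (untested sketch, but SDL_CreateRenderer, SDL_RENDERER_ACCELERATED, and SDL_GetRendererInfo are all real SDL2 API):

```c
#include <SDL2/SDL.h>
#include <stdio.h>

int main(void) {
    SDL_Init(SDL_INIT_VIDEO);
    SDL_Window *win = SDL_CreateWindow("probe", SDL_WINDOWPOS_UNDEFINED,
                                       SDL_WINDOWPOS_UNDEFINED, 640, 480, 0);
    /* Ask for hardware acceleration explicitly; -1 = first matching driver. */
    SDL_Renderer *ren = SDL_CreateRenderer(win, -1, SDL_RENDERER_ACCELERATED);
    SDL_RendererInfo info;
    if (ren && SDL_GetRendererInfo(ren, &info) == 0)
        printf("backend: %s\n", info.name);  /* e.g. "opengl" or "software" */
    SDL_DestroyRenderer(ren);
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}
```

If it prints "software", you've found your problem.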

Also, I vaguely recall there was a legacy flag on SDL_Surfaces called "hardware" or "SDL_HWSURFACE" or "SDL_HWACCEL" or something. Don't set that. It targeted very old hardware from like 25 years ago and is slow on everything now.


Whatever the problem is, it probably isn't SDL. Here's a test project I worked on[0], and I'm using a garbage laptop. The sprites aren't that big but if you're just using a single texture it shouldn't matter, since SDL does sprite batching anyway.

Your theory might be right - the first thing I would look for is something allocating every frame.

You might ask the SDL Discourse forum and see what they think: https://discourse.libsdl.org/

[0]https://cdn.masto.host/krappmastohost/media_attachments/file...


> Plenty of games and graphical apps use SDL2 under the hood

How many of them use the render API, though, rather than just using SDL to create a window and handle input (and perhaps manage an OpenGL context)?


Is 60 FPS your screen refresh rate? Perhaps you have VSync enabled.
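Vsync is opt-in when you create the renderer, so it's easy to rule out. A sketch (flag names are the real SDL2 ones; window setup is boilerplate):

```c
#include <SDL2/SDL.h>

int main(void) {
    SDL_Init(SDL_INIT_VIDEO);
    SDL_Window *win = SDL_CreateWindow("test", SDL_WINDOWPOS_UNDEFINED,
                                       SDL_WINDOWPOS_UNDEFINED, 320, 240, 0);
    /* PRESENTVSYNC makes SDL_RenderPresent block on the display refresh,
     * capping the loop at the monitor rate (often 60 Hz). Drop the flag
     * to see how fast the loop really runs. */
    SDL_Renderer *ren = SDL_CreateRenderer(win, -1,
            SDL_RENDERER_ACCELERATED | SDL_RENDERER_PRESENTVSYNC);
    SDL_DestroyRenderer(ren);
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}
```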



