
Sorry, but the proposal for the included shading language looks pretty braindead to me.

See for yourself: https://github.com/libsdl-org/SDL_shader_tools/blob/main/doc...

Deviations from the C family of languages, such as "Flow control statements don't need parentheses," are completely unnecessary, I think. Same goes for "Flow control statements must use braces."



The current SDL GPU API does not intend to use this shader language. Instead, users are expected to provide shaders in the relevant format for each underlying graphics API [1], using whatever custom content pipeline they desire.

One of the developers made an interesting blog post motivating this decision [2] (although some of the finer details have changed since that was written).

There is also a "third party" solution [3] by another one of the developers that enables cross-platform use of SPIR-V or HLSL shaders using SPIRV-Cross and FXC/DXC, respectively (NB: It seems this currently wouldn't compile against SDL3 master).

[1] https://github.com/libsdl-org/SDL/blob/d1a2c57fb99f29c38f509...

[2] https://moonside.games/posts/layers-all-the-way-down

[3] https://github.com/flibitijibibo/SDL_gpu_shadercross


Thanks for the clarification. From the sparse documentation of SDL_GPU it was somewhat difficult to understand which parts are part of the SDL 3 merge, and which parts are something else.

I did find an example of using the GPU API, but I didn't see any mention of selecting a backend (Vk, etc.) in the example - is this possible or is the backend selected e.g. based on the OS?


> is this possible or is the backend selected e.g. based on the OS?

Selected in a reasonable order by default, but can be overridden.

There are three ways to do so:

- Set the SDL_HINT_GPU_DRIVER hint with SDL_SetHint() [1].

- Pass a non-NULL name to SDL_CreateGPUDevice() [2].

- Set the SDL_PROP_GPU_DEVICE_CREATE_NAME_STRING property when calling SDL_CreateGPUDeviceWithProperties() [3].

The name can be one of "D3D11", "D3D12", "Metal" or "Vulkan" (case-insensitive). Setting the driver name for NDA platforms would presumably work as well, but I don't see why you would do that.

The second method is just a convenient, albeit limited, wrapper for the third, so that the user does not have to create and destroy their own properties object.

The global hint takes precedence over the individual properties.
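Putting the three mechanisms together, a sketch might look like this (based on the SDL3 wiki pages linked below; the exact property and flag names are assumed from those docs, and error handling is minimal):

```c
#include <SDL3/SDL.h>

/* Sketch: try to force the Vulkan backend using each of the three
 * override mechanisms described above. */
SDL_GPUDevice *create_vulkan_device(void)
{
    /* 1. Global hint (can also arrive via the environment);
     *    this wins over the per-call name/property below. */
    SDL_SetHint(SDL_HINT_GPU_DRIVER, "vulkan");

    /* 2. Convenience entry point: shader-format flags, debug mode,
     *    and an optional driver name (NULL = pick a default). */
    SDL_GPUDevice *dev =
        SDL_CreateGPUDevice(SDL_GPU_SHADERFORMAT_SPIRV, true, "vulkan");
    if (dev)
        return dev;

    /* 3. Property-based entry point: same request, but the caller
     *    manages a properties object explicitly. */
    SDL_PropertiesID props = SDL_CreateProperties();
    SDL_SetStringProperty(props, SDL_PROP_GPU_DEVICE_CREATE_NAME_STRING,
                          "vulkan");
    SDL_SetBooleanProperty(props,
                           SDL_PROP_GPU_DEVICE_CREATE_SHADERS_SPIRV_BOOLEAN,
                           true);
    dev = SDL_CreateGPUDeviceWithProperties(props);
    SDL_DestroyProperties(props);
    return dev;
}
```

In practice you would use only one of these, not all three; the point is just that methods 2 and 3 take the name at the call site, while method 1 is process-global state.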

[1] https://wiki.libsdl.org/SDL3/SDL_HINT_GPU_DRIVER

[2] https://wiki.libsdl.org/SDL3/SDL_CreateGPUDevice

[3] https://wiki.libsdl.org/SDL3/SDL_CreateGPUDeviceWithProperti...


> The global hint takes precedence over the individual properties.

This seems like a bad design - when I explicitly pass something to a function I expect it to be honored and not overwritten by some global state, especially one that can come from an environment variable.

I'm not even sure how a hint or a null parameter makes sense at all here since the program will be responsible for passing the shaders in the correct format (which isn't even checked outside of debug mode lol). There also doesn't seem to even be a way for the application to check what shader format is supported by the mystery device it was handed against its wishes, outside of getting the name and then mapping that back to supported shaders which may or may not change in the future.

Having two entry points for device creation with wildly different argument types (one using flags, one using string-based properties with comically long names you might find in the Java world) is also not something I would have expected in a newly designed API - that kind of ugliness is usually the result of changing requirements that the initial entry point did not foresee.


Deviating from conventions to avoid footguns is so misguided. I've been writing C-family languages for like 15 years and never once accidentally written an if (foo); whatever;

The convention itself IS the thing that stops you from fucking that up. It's the kind of thing you do once 2 days into a 30 year career and never again.

I still think it's dumb in Javascript, where you could be using the language on day 2 of learning programming. But in a GPU shader language that would be almost impossible to approach with no programming experience? It's actually insane.

Having said that, everything else about this project looks pretty good, so I guess they can get a pass lol.


If control flow statements don't require parentheses to be parseable, doesn't that mean that it is the parentheses that are completely unnecessary?


I, on the other hand, find the C way brain dead and would be very happy with these changes.



