Why do libraries define their own true and false?
(pawb.social)
I’m not sure I understand the readability argument? I guess it disambiguates numeric variables if you used 1 and 0. But with true and false available, those would seemingly do the same thing. You still have to know what the arguments you're passing are for regardless.
A function call of "MyFunction(parameter: GLFW_TRUE)" is more readable than "MyFunction(parameter: 1)". Not by much, mind you, but if given the choice between these two, one is clearly better. It requires no assumptions about what the reader may or may not already know about the system. It communicates intent without any ambiguity.
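For instance, a minimal sketch assuming GLFW 3, where glfwWindowHint takes a plain int for boolean-style hints and GLFW_TRUE/GLFW_FALSE are defined as 1 and 0 (the specific hints chosen here are just illustrative):

```c
#include <GLFW/glfw3.h>

void configure_window_hints(void) {
    glfwWindowHint(GLFW_RESIZABLE, GLFW_TRUE);  /* intent is obvious at the call site */
    glfwWindowHint(GLFW_DECORATED, 1);          /* works, but the reader must know 1 means "yes" */
}
```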
Does C have a logical type these days? It never used to.
From what I'm reading, not until C99 (see other comment); before that, integers were used in place of Booleans, in which case your readability statement makes more sense given the historical context.
stdbool.h's true and false are macros that expand to integers 1 and 0
C23 adds a proper bool type
C99 has a proper boolean type (_Bool); C23 makes true and false booleans (and properly gives _Bool the name bool, without the macro).
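A minimal sketch of the C99 situation (assuming a C99-or-later compiler; under C99 bool, true, and false come from stdbool.h as macros, while in C23 they are keywords and the include is no longer needed):

```c
#include <stdbool.h>  /* C99: defines bool as _Bool, true as 1, false as 0 */
#include <stdio.h>

int main(void) {
    bool done = false;     /* _Bool under C99; a real keyword type in C23 */
    done = true;
    printf("%d\n", done);  /* prints 1: a _Bool only ever holds 0 or 1 */
    return 0;
}
```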
Only 50 years after its creation.
For what it is worth: I learned C in 1990 and switched largely to Python in 1998.
In C, true is any nonzero value, not one specific value. You can always compare a value to 0 if you need false, but comparing to any single value for true is wrong. Functions will often return some calculated value: zero means false, and whatever nonzero result they happened to compute stands in for true. Thus all defines of true are suspect.
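A minimal sketch of the pitfall (strcmp is guaranteed only to return zero for equal strings and some unspecified nonzero value otherwise; the TRUE macro here is a hypothetical illustration of the suspect defines mentioned above):

```c
#include <stdio.h>
#include <string.h>

#define TRUE 1  /* a suspect definition, per the comment above */

int main(void) {
    /* strcmp returns 0 for equal strings and an unspecified nonzero
       value (often negative here, since 'c' < 'd') otherwise. */
    if (strcmp("abc", "abd") == TRUE)
        printf("probably never printed\n");  /* bug: the result may be any nonzero value */

    if (strcmp("abc", "abd") != 0)
        printf("correct: test truth against zero\n");
    return 0;
}
```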