
opengl - Why is GLsizei not defined as unsigned?

Reposted · Author: 行者123 · Updated: 2023-12-04 19:16:26

I was looking up the typedef for GLsizei in the OpenGL ES 1.1 implementation on iOS, and was surprised to find it defined as int. Some quick googling suggests this is normal (including in regular OpenGL).

I would have expected it to be defined as unsigned int or size_t. Why is it defined as plain vanilla int?

Best Answer

Unless you have any 4 GB data structures, this seems unlikely to be a problem.

Here is one person's answer: http://oss.sgi.com/archives/ogl-sample/2005-07/msg00003.html

Quote:

(1) Arithmetic on unsigned values in C doesn't always yield intuitively
correct results (e.g. width1-width2 is positive when width1<width2).
Compilers offer varying degrees of diagnosis when unsigned ints appear
to be misused. Making sizei a signed type eliminates many sources of
semantic error and some irrelevant diagnostics from the compilers. (At
the cost of reducing the range of sizei, of course, but for the places
sizei is used that's rarely a problem.)

(2) Some languages that support OpenGL bindings lack (lacked? not sure
about present versions of Fortran) unsigned types, so by sticking to
signed types as much as possible there would be fewer problems using
OpenGL in those languages.

Both explanations seem plausible. I have run into (1) myself several times, foolishly using NSUInteger as a loop counter (hint: don't do that, especially when counting down to zero).
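The countdown trap mentioned above looks like this in C. The buggy form never terminates because an unsigned counter satisfies i >= 0 forever; the safe idiom below tests before decrementing (sum_indices is a hypothetical helper, named here only for illustration):

```c
#include <stddef.h>

/* Buggy version (would loop forever, since an unsigned i
 * wraps from 0 to its maximum value instead of going negative):
 *
 *   for (unsigned int i = n - 1; i >= 0; --i) { ... }
 */

/* Safe countdown over indices n-1 .. 0 with an unsigned type:
 * "i-- > 0" checks the old value before decrementing, so the
 * loop body sees n-1, n-2, ..., 0 and then exits cleanly. */
unsigned long sum_indices(size_t n) {
    unsigned long sum = 0;
    for (size_t i = n; i-- > 0; ) {
        sum += i;
    }
    return sum;
}
```

For example, sum_indices(5) visits 4, 3, 2, 1, 0 and returns 10; sum_indices(0) skips the body entirely and returns 0.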

Regarding "opengl - Why is GLsizei not defined as unsigned?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/8996743/
