
c++ - SDL2 - Strange performance impact when colors change rapidly

Reposted. Author: 行者123. Updated: 2023-12-01 21:57:05

I was surprised when I discovered this. At first I thought something was wrong with my algorithm, but on closer inspection I found that the more often the color changes, the bigger the performance hit. Why is that?

Here is the (complete) code:

#include <iostream>
#include <cstdlib>   // rand
#include <SDL2/SDL.h>

const int WIDTH = 1024;
const int HEIGHT = 768;

int main(int argc, char *argv[])
{
    SDL_Window *window;
    SDL_Renderer *renderer;
    SDL_Texture *texture;
    SDL_Event event;

    if (SDL_Init(SDL_INIT_VIDEO) < 0)
    {
        SDL_LogError(SDL_LOG_CATEGORY_APPLICATION, "Couldn't initialize SDL: %s", SDL_GetError());
        return 3;
    }

    window = SDL_CreateWindow("SDL_CreateTexture",
                              SDL_WINDOWPOS_UNDEFINED,
                              SDL_WINDOWPOS_UNDEFINED,
                              WIDTH, HEIGHT,
                              SDL_WINDOW_RESIZABLE);

    renderer = SDL_CreateRenderer(window, -1, SDL_RENDERER_ACCELERATED);

    texture = SDL_CreateTexture(renderer, SDL_PIXELFORMAT_RGBA8888, SDL_TEXTUREACCESS_TARGET, WIDTH, HEIGHT);

    bool alive = true;
    while (alive)
    {
        while (SDL_PollEvent(&event) > 0)
        {
            switch (event.type)
            {
            case SDL_QUIT:
                alive = false;
                break;
            }
        }

        const Uint64 start = SDL_GetPerformanceCounter();

        // Draw into the texture, not directly into the window
        SDL_SetRenderTarget(renderer, texture);
        SDL_SetRenderDrawColor(renderer, 0x00, 0x00, 0x00, 0x00);
        SDL_RenderClear(renderer);

        // 10000 random points, each with its own random color
        for (int i = 0; i < 10000; ++i)
        {
            SDL_SetRenderDrawColor(renderer, rand() % 255, rand() % 255, rand() % 255, 255);
            SDL_RenderDrawPoint(renderer, rand() % WIDTH, rand() % HEIGHT);
        }

        // Copy the texture to the window and measure the frame time
        SDL_SetRenderTarget(renderer, NULL);
        SDL_RenderCopy(renderer, texture, NULL, NULL);
        SDL_RenderPresent(renderer);

        const Uint64 end = SDL_GetPerformanceCounter();
        const static Uint64 freq = SDL_GetPerformanceFrequency();
        const double seconds = (end - start) / static_cast<double>(freq);
        std::cout << "Frame time: " << seconds * 1000.0 << "ms" << std::endl;
    }

    SDL_DestroyRenderer(renderer);
    SDL_DestroyWindow(window);
    SDL_Quit();

    return 0;
}

The problem is this piece of code:

for (int i = 0; i < 10000; ++i)
{
    SDL_SetRenderDrawColor(renderer, rand() % 255, rand() % 255, rand() % 255, 255);
    SDL_RenderDrawPoint(renderer, rand() % WIDTH, rand() % HEIGHT);
}

Here is the performance of that code:

[screenshot: frame-time output with per-point color changes]

And here is the performance of this code:

SDL_SetRenderDrawColor(renderer, 255, 255, 255, 255);
for (int i = 0; i < 10000; ++i)
{
    SDL_RenderDrawPoint(renderer, rand() % WIDTH, rand() % HEIGHT);
}

[screenshot: frame-time output with a single color change]

As you can see, changing the color a lot has a considerable performance impact; in fact, it is more than 100 times slower. What am I doing wrong? Or is this simply how it is supposed to work?

Best answer

Apparently the SDL_SetRenderDrawColor function takes some time to execute and change the color. The color data may even have to be sent to the GPU, which is very slow compared to a regular memory access. The rand function also eats some of the performance.

With your data, there is a difference of about 550 ms between the two versions over 10000 calls to SDL_SetRenderDrawColor, so each call to this function costs roughly 55 µs. That is very little: calling it a few dozen times would not really affect performance, but 10000 calls per frame is obviously a lot more.

If you accept that one call transfers 4 bytes to the GPU, then for the colors alone you have already transferred 40 kB per frame.

Regarding "c++ - SDL2 - Strange performance impact when colors change rapidly", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/58719469/
