
c - Unexpected value output when not using Visual Studio

Reposted · Author: 太空宇宙 · Updated: 2023-11-04 02:56:51

I have been working on a program for my Algorithm Analysis class in which I have to solve the Knapsack problem using brute force, greedy, dynamic programming, and branch-and-bound strategies. Everything works fine when I run it in Visual Studio 2012, but if I compile it with gcc and run it on the command line, I get different results:

Visual Studio:

+-------------------------------------------------------------------------------+
| Number of | Processing time in seconds / Maximum benefit value |
| +---------------+---------------+---------------+---------------+
| items | Brute force | Greedy | D.P. | B. & B. |
+---------------+---------------+---------------+---------------+---------------+
| 10 + 0 / 1290 + 0 / 1328 + 0 / 1290 + 0 / 1290 |
+---------------+---------------+---------------+---------------+---------------+
| 20 + 0 / 3286 + 0 / 3295 + 0 / 3200 + 0 / 3286 |
+---------------+---------------+---------------+---------------+---------------+

Command line (gcc):

+-------------------------------------------------------------------------------+
| Number of | Processing time in seconds / Maximum benefit value |
| +---------------+---------------+---------------+---------------+
| items | Brute force | Greedy | D.P. | B. & B. |
+---------------+---------------+---------------+---------------+---------------+
| 10 + 0 / 1290 + 0 / 1328 + 0 / 1599229779+ 0 / 1290 |
+---------------+---------------+---------------+---------------+---------------+
| 20 + 0 / 3286 + 0 / 3295 + 0 / 3200 + 0 / 3286 |
+---------------+---------------+---------------+---------------+---------------+

The same number, "1599229779", is displayed every time. Note that the output is only garbled on the first run of the dynamic programming algorithm.

Here is my code:

typedef struct {
    short value;  //This is the value of the item
    short weight; //This is the weight of the item
    float ratio;  //This is the ratio of value/weight
} itemType;

typedef struct {
    time_t startingTime;
    time_t endingTime;
    int maxValue;
} result;

result solveWithDynamic(itemType items[], int itemsLength, int maxCapacity){
    result answer;
    int rowSize = 2;
    int colSize = maxCapacity + 1;
    int i, j; //used in loops
    int otherColumn, thisColumn;

    answer.startingTime = time(NULL);

    int **table = (int**)malloc((sizeof *table) * rowSize);//[2][(MAX_ITEMS*WEIGHT_MULTIPLIER)];
    for(i = 0; i < rowSize; i++)
        table[i] = (int*)malloc((sizeof *table[i]) * colSize);

    table[0][0] = 0;
    table[1][0] = 0;

    for(i = 1; i < maxCapacity; i++) table[1][i] = 0;

    for(i = 0; i < itemsLength; i++){
        thisColumn = i%2;
        otherColumn = (i+1)%2; //this is always the other column

        for(j = 1; j < maxCapacity + 1; j++){
            if(items[i].weight <= j){
                if(items[i].value + table[otherColumn][j-items[i].weight] > table[otherColumn][j])
                    table[thisColumn][j] = items[i].value + table[otherColumn][j-items[i].weight];
                else
                    table[thisColumn][j] = table[otherColumn][j];
            } else {
                table[thisColumn][j] = table[thisColumn][j-1];
            }//end if/else
        }//end for
    }//end for

    answer.maxValue = table[thisColumn][maxCapacity];

    answer.endingTime = time(NULL);

    for(i = 0; i < rowSize; i++)
        free(table[i]);
    free(table);

    return answer;
}//end solveWithDynamic

A bit of explanation. I ran into memory-consumption problems with this algorithm because I have to run it for a set of 10,000 items. I realized I don't need to store the entire table, since I only ever look at the previous column. I actually figured out that you only need to store the current row plus x+1 additional values, where x is the weight of the current itemType. That brings the required memory down from (itemsLength+1) * (maxCapacity+1) elements to 2*(maxCapacity+1), and potentially to (maxCapacity+1) + (x+1) (though I didn't need to optimize that far).

Also, I used printf("%d", answer.maxValue); inside this function, and it still shows "1599229779". Can anyone help me figure out what's going on? Thanks.

Best answer

I can't be sure this is the cause, but with

for(i = 1; i < maxCapacity; i++) table[1][i] = 0;

you leave table[1][maxCapacity] uninitialized, and yet it may be read here:

for(j = 1; j < maxCapacity + 1; j++){
    if(items[i].weight <= j){
        if(items[i].value + table[otherColumn][j-items[i].weight] > table[otherColumn][j])
            table[thisColumn][j] = items[i].value + table[otherColumn][j-items[i].weight];
        else
            table[thisColumn][j] = table[otherColumn][j];
    } else {
        table[thisColumn][j] = table[thisColumn][j-1];
    }//end if/else
}//end for

If that uninitialized cell happens to always be zero under Visual Studio but nonzero under gcc, that would explain the difference.

Regarding c - Unexpected value output when not using Visual Studio, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/16301898/
