
c++ - Handling conversion between hex and ASCII strings


I want to build a function that easily converts a string containing hex codes (e.g. "0ae34e") into a string containing the equivalent ASCII values, and vice versa. Do I have to split the hex string into pairs of two characters and piece them back together, or is there a convenient way to do this?

Thanks

Best Answer

Based on Python's binascii_unhexlify() function:

#include <cctype> // isxdigit, isdigit, isupper, tolower

int to_int(int c) {
    if (not isxdigit(c)) return -1; // error: non-hexadecimal digit found
    if (isdigit(c)) return c - '0';
    if (isupper(c)) c = tolower(c);
    return c - 'a' + 10;
}

template<class InputIterator, class OutputIterator>
int unhexlify(InputIterator first, InputIterator last, OutputIterator ascii) {
    while (first != last) {
        int top = to_int(*first++);
        if (first == last) return -1; // error: odd-length input
        int bot = to_int(*first++);
        if (top == -1 or bot == -1)
            return -1; // error
        *ascii++ = (top << 4) + bot;
    }
    return 0;
}

Example

#include <iostream>
#include <string>

int main() {
    const char hex[] = "7B5a7D";
    const size_t len = sizeof(hex) - 1; // strlen
    std::string ascii(len / 2, '\0');   // std::string instead of a non-standard VLA

    if (unhexlify(hex, hex + len, ascii.begin()) < 0) return 1; // error
    std::cout << hex << " -> " << ascii << std::endl;
}

Output

7B5a7D -> {Z}

An amusing note quoted from the source code:

While I was reading dozens of programs that encode or decode the formats here (documentation? hihi:-) I have formulated Jansen's Observation:

Programs that encode binary data in ASCII are written in such a style that they are as unreadable as possible. Devices used include unnecessary global variables, burying important tables in unrelated sourcefiles, putting functions in include files, using seemingly-descriptive variable names for different purposes, calls to empty subroutines and a host of others.

I have attempted to break with this tradition, but I guess that that does make the performance sub-optimal. Oh well, too bad...

Jack Jansen, CWI, July 1995.

This answer comes from a similar question on Stack Overflow: https://stackoverflow.com/questions/9620891/
