C++: convert hexadecimal to ASCII before sending over the serial port

Tags: c++ ascii

I want to communicate with an electronic load in C++, using the Win32 API. To put the electronic load into remote-control mode, I need to send:

“AA 00 20 01 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 CB”

But before sending it, I need to convert it to ASCII codes.

Here is my code:

 HANDLE hCom;
 DWORD dwError;
 BOOL fSuccess;
 DWORD dwEvtMask;
 int i;
 int NbOctet;
 char *Message;
 unsigned long nBytesWrite;
LPCWSTR Port = L"COM14";
 Message = new char[200];
std::string Test;
/*-----------------------------------------------*/
/* Open the communication port                   */
/*-----------------------------------------------*/

hCom = CreateFile(Port,
   GENERIC_READ | GENERIC_WRITE,
   0,
   NULL,
   OPEN_EXISTING,
   0,
   NULL
   );

Message = "AA 00 20 01 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 CB";

NbOctet = strlen(Message);

WriteFile(hCom,Message,NbOctet,&nBytesWrite,NULL);


CloseHandle(hCom);


delete[] Message;

My question is: how can I convert the message to ASCII characters before sending it?

Here is a Python example of what I want to do:

# Construct a set to remote command
cmd = chr(0xaa) + chr(0x00) + chr(0x20) # First three bytes
cmd += chr(0x01) + chr(0x00)*(length_packet - 1 - 4)
cmd += chr(CalculateChecksum(cmd))

sp.write(cmd)

Here is my new code:

void main(int argc, TCHAR *argv[])
{
 HANDLE hCom;
 DWORD dwError;
 BOOL fSuccess;
 DWORD dwEvtMask;
 int i;
 int NbOctet;
 unsigned long nBytesWrite;
 LPCWSTR Port = L"\\\\.\\COM14";

 /*-----------------------------------------------*/
 /* Open the communication port                   */
 /*-----------------------------------------------*/

 hCom = CreateFile(Port,
   GENERIC_READ | GENERIC_WRITE,
   0,
   NULL,
   OPEN_EXISTING,
   0,
   NULL
   );

char Message[] = {0xAA,0x00,0x20,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0xCB};

NbOctet = strlen(Message);
qDebug() << Message;
WriteFile(hCom,Message,NbOctet,&nBytesWrite,NULL);


CloseHandle(hCom);

}

But it doesn't work.

Best Answer

This can be done with std::istringstream to extract the individual values from the string, and std::stoi to parse each value as an integer:

#include <cstdint>
#include <sstream>
#include <string>
#include <vector>

// Split the space-separated hex string and convert each token to a byte.
std::istringstream is("AA 00 20 01 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 CB");

std::vector<uint8_t> values;

std::string value_string;
while (is >> value_string)
    values.push_back(static_cast<uint8_t>(std::stoi(value_string, nullptr, 16)));

WriteFile(hCom, values.data(), sizeof(uint8_t) * values.size(), &nBytesWrite, NULL);
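
As an aside, strlen(Message) in the second attempt stops at the first 0x00 byte, so only the leading 0xAA would ever be written; using the container's size() as above (or sizeof on the fixed-size array) avoids that.

For completeness, here is a minimal self-contained sketch of the approach shown in the Python example: it assembles the 26-byte frame directly from numeric values and appends the checksum. The checksum rule used here (low byte of the sum of the first 25 bytes) is an assumption inferred from the example frame, where 0xAA + 0x20 + 0x01 = 0xCB; the port name is the one from the question.

#include <windows.h>

#include <cstdint>
#include <numeric>
#include <vector>

int main()
{
    // Build the 26-byte "set to remote" frame from raw byte values.
    std::vector<uint8_t> packet(26, 0x00);
    packet[0] = 0xAA;   // start byte
    packet[2] = 0x20;   // command: remote control
    packet[3] = 0x01;   // 1 = remote, 0 = local
    // Assumed checksum: low byte of the sum of the first 25 bytes
    // (0xAA + 0x20 + 0x01 = 0xCB, matching the frame in the question).
    packet[25] = static_cast<uint8_t>(
        std::accumulate(packet.begin(), packet.end() - 1, 0u) & 0xFFu);

    HANDLE hCom = CreateFileW(L"\\\\.\\COM14",
                              GENERIC_READ | GENERIC_WRITE,
                              0, NULL, OPEN_EXISTING, 0, NULL);
    if (hCom == INVALID_HANDLE_VALUE)
        return 1;

    DWORD nBytesWritten = 0;
    WriteFile(hCom, packet.data(), static_cast<DWORD>(packet.size()),
              &nBytesWritten, NULL);
    CloseHandle(hCom);
    return 0;
}

If the device expects specific serial settings, configure the baud rate and framing with SetCommState and a DCB structure before writing.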

Regarding converting hexadecimal to ASCII in C++ before sending over the serial port, a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/18890301/
