How can I implement this Python code in C#?
Python code:
print(str(int(str("e60f553e42aa44aebf1d6723b0be7541"), 16)))
Result:
305802052421002911840647389720929531201
But in C# I am running into problems with such large numbers.
Can you help me?
I get different results in Python and C#. Where could it be going wrong?
Accepted answer
Primitive types such as Int32 and Int64 have a fixed size, which is far too small for a number this large. For comparison:

Data type     Maximum positive value
Int32         2,147,483,647
UInt32        4,294,967,295
Int64         9,223,372,036,854,775,807
UInt64        18,446,744,073,709,551,615
Your number   305,802,052,421,002,911,840,647,389,720,929,531,201
In this case, representing that number requires 128 bits. Since .NET Framework 4.0 there is a data type for arbitrarily sized integers: System.Numerics.BigInteger. You do not need to specify a size; it grows as needed (which means you may even get an OutOfMemoryException when, for example, you multiply two very big numbers).
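To see why the 64-bit types cannot hold this value, a quick cross-check in Python (the language the question already uses) confirms the number needs exactly 128 bits:

```python
n = int("e60f553e42aa44aebf1d6723b0be7541", 16)
print(n)               # 305802052421002911840647389720929531201
print(n.bit_length())  # 128 -- too wide for Int64/UInt64
print(n > 2**64 - 1)   # True: exceeds even UInt64
```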
To come back to your question, first parse your hexadecimal number. Note that NumberStyles.AllowHexSpecifier treats the most significant bit of the first hex digit as a sign bit, so a "0" is prepended here to force a positive result:

// Requires: using System.Globalization; using System.Numerics;
string bigNumberAsText = "e60f553e42aa44aebf1d6723b0be7541";
BigInteger bigNumber = BigInteger.Parse("0" + bigNumberAsText,
    NumberStyles.AllowHexSpecifier);
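This hex-parsing difference is the likely source of the mismatched results: Python's int(s, 16) is always non-negative, while .NET's AllowHexSpecifier interprets the top bit of the first hex digit as a sign bit, and the first digit here is 'e' (binary 1110). A small Python sketch of the two's-complement reading .NET applies:

```python
hex_str = "e60f553e42aa44aebf1d6723b0be7541"
unsigned = int(hex_str, 16)  # Python: always non-negative

# Mimic .NET's AllowHexSpecifier: the top bit of the first digit is a sign bit,
# so values at or above 2**(bits-1) wrap around to negative.
bits = 4 * len(hex_str)
signed = unsigned - 2**bits if unsigned >= 2**(bits - 1) else unsigned

print(signed < 0)  # True: parsed without a leading "0", the number comes out negative
```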
Then simply print it to the console:
Console.WriteLine(bigNumber.ToString());
If you are interested in computing how many bits are needed to represent an arbitrary number, use this function (if I remember correctly, the original implementation comes from Numerical Recipes in C):
public static uint GetNeededBitsToRepresentInteger(BigInteger value)
{
    // Counts bits by shifting right until the value is exhausted.
    // Assumes a non-negative value: a negative BigInteger never reaches
    // zero under an arithmetic right shift, so the loop would not end.
    uint neededBits = 0;
    while (value != 0)
    {
        value >>= 1;
        ++neededBits;
    }
    return neededBits;
}
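As a sanity check (a sketch, not part of the original answer), the same shift-and-count loop written in Python agrees with Python's built-in int.bit_length() on the number from the question:

```python
def needed_bits(value: int) -> int:
    # Mirror of the C# helper: shift right until nothing is left.
    assert value >= 0, "loop assumes a non-negative value"
    bits = 0
    while value != 0:
        value >>= 1
        bits += 1
    return bits

n = int("e60f553e42aa44aebf1d6723b0be7541", 16)
print(needed_bits(n))  # 128, matching n.bit_length()
```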
Then, to compute the size needed for a number written as a string:

public static uint GetNeededBitsToRepresentInteger(string value,
    NumberStyles numberStyle = NumberStyles.None)
{
    return GetNeededBitsToRepresentInteger(
        BigInteger.Parse(value, numberStyle));
}
A similar question about arbitrarily large integers in C# can be found on Stack Overflow: https://stackoverflow.com/questions/10293603/