I'm trying to understand the first test case of this challenge on Codeforces.
The description is:
Sergey is testing a next-generation processor. Instead of bytes the processor works with memory cells consisting of n bits. These bits are numbered from 1 to n. An integer is stored in the cell in the following way: the least significant bit is stored in the first bit of the cell, the next significant bit is stored in the second bit, and so on; the most significant bit is stored in the n-th bit. Now Sergey wants to test the following instruction: "add 1 to the value of the cell". As a result of the instruction, the integer that is written in the cell must be increased by one; if some of the most significant bits of the resulting number do not fit into the cell, they must be discarded. Sergey wrote certain values of the bits in the cell and is going to add one to its value. How many bits of the cell will change after the operation?
Summary
Given a binary number, add 1 to its value; how many bits of the cell change after the operation?
Test cases
4
1100
= 3

4
1111
= 4
Note: in the first example the final value of the cell is 0010, and in the second example it is 0000.
In the second test case, 1111 is 15, so 15 + 1 = 16 (10000 in binary), so all the 1s change, hence the answer is 4.
But in the first test case, 1100 is 12, and 12 + 1 = 13 (01101), so only one bit changes — yet the expected answer is 3. Why?
Best answer
You missed the key part: the least significant bit is the first one (i.e., the leftmost one here), not the last one as we usually write binary numbers.
Therefore, 1100 is not 12 but 3. So 1100 + 1 = 3 + 1 = 4 = 0010, which means 3 bits changed.
"Least significant bit" literally means the bit with the least significance, so you can read it as "the bit representing the smallest value". In binary, the bit representing 2^0 is the least significant bit. So the binary code in your task is written as follows:
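To make that arithmetic concrete, here is a small Python check (the variable names are my own, not from the problem statement):

```python
bits = "1100"               # LSB-first: the first character holds 2^0
value = int(bits[::-1], 2)  # reverse so Python's parser sees the MSB first
assert value == 3

n = len(bits)
new_value = (value + 1) % (1 << n)            # add 1, discard overflow bits
new_bits = format(new_value, f"0{n}b")[::-1]  # convert back to LSB-first
assert new_bits == "0010"

changed = sum(a != b for a, b in zip(bits, new_bits))
print(changed)  # 3
```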
bit no.    0     1     2     3     4    (...)
value     2^0   2^1   2^2   2^3   2^4   (...)
           |                             |
           least                         most
           significant                   significant
           bit                           bit
This is why 1100 is:
1100 = 1*2^0 + 1*2^1 + 0*2^2 + 0*2^3 = 1 + 2 + 0 + 0 = 3
and not the other way around (as we usually write it).
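Putting it all together, the whole task can be sketched in Python (a hypothetical helper, not the judge's reference solution):

```python
def bits_changed(cell: str) -> int:
    """cell is the LSB-first bit string; return how many bits flip after adding 1."""
    n = len(cell)
    value = int(cell[::-1], 2)          # decode LSB-first string to an integer
    new = (value + 1) % (1 << n)        # add 1, discarding bits beyond n
    return bin(value ^ new).count("1")  # XOR marks exactly the flipped positions

print(bits_changed("1100"))  # 3
print(bits_changed("1111"))  # 4
```

Using XOR avoids converting the result back to a string: each 1 bit in `value ^ new` is a position that changed.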
Original question on Stack Overflow: https://stackoverflow.com/questions/31525568/