How do I write a Spark postcondition for a function that totals the elements of an array? (Spark 2014, but if someone can tell me how to do this for an earlier Spark, I should be able to adapt it.)
If I have:
type Positive_Array is array (Positive range <>) of Positive;

function Array_Total (The_Array : Positive_Array) return Positive
  with
    Post => Array_Total'Return = -- What goes here?
is
   -- and so on
In my particular case I don't need to worry about overflow (I know the total at initialization, and it can only decrease monotonically).
Presumably I'll need a loop variant in the implementation to help the prover, but that should be a straightforward variant of the postcondition, so I'm not worrying about it yet.
Best Answer
This is an old but interesting question. Here is a late answer, just for completeness and future reference.
The main "trick" for solving this kind of problem is given in the blog post Taking on a Challenge in SPARK, published on AdaCore's website.
Contrary to what some answers have already suggested, you cannot use a recursive function to prove the summation. Instead, you need a ghost function, as shown in the example below. The approach can be extended to prove similar "list fold" operations such as (conditional) counting.
The example below can be proven with GNAT CE 2019 at proof level 1.
Update (July 2020): the postcondition of Sum_Acc has been slightly improved. See also this related answer for another example.
sum.ads
package Sum with SPARK_Mode is

   -- The ranges of the list's index and element discrete types must be
   -- limited in order to prevent overflow during summation, i.e.:
   --
   --    Nat'Last * Int'First >= Integer'First and
   --    Nat'Last * Int'Last  <= Integer'Last
   --
   -- In this case +/-1000 * +/-1000 = +/-1_000_000, which is well within
   -- the range of the Ada Integer type on typical platforms.

   subtype Int is Integer range -1000 .. 1000;
   subtype Nat is Integer range 0 .. 1000;

   type List is array (Nat range <>) of Int;

   -- The function "Sum_Acc" below is ghost code that helps the prover
   -- prove the postcondition (result) of the "Sum" function. It computes
   -- a list of partial sums. For example:
   --
   --    Input  : [ 1 2 3  4  5  6 ]
   --    Output : [ 1 3 6 10 15 21 ]
   --
   -- Note that the lengths of the lists are the same, the first elements
   -- are identical, and the last element of the output (here: "21") is
   -- the result of the actual function under consideration, "Sum".
   --
   -- REMARK: In this case, the input of "Sum_Acc" and "Sum" is limited
   -- to non-empty lists for convenience.

   type Partial_Sums is array (Nat range <>) of Integer;

   function Sum_Acc (L : List) return Partial_Sums with
     Ghost,
     Pre  => (L'Length > 0),
     Post => (Sum_Acc'Result'Length = L'Length)
       and then (Sum_Acc'Result'First = L'First)
       and then (for all I in L'First .. L'Last =>
                   Sum_Acc'Result (I) in
                     (I - L'First + 1) * Int'First ..
                     (I - L'First + 1) * Int'Last)
       and then (Sum_Acc'Result (L'First) = L (L'First))
       and then (for all I in L'First + 1 .. L'Last =>
                   Sum_Acc'Result (I) = Sum_Acc'Result (I - 1) + L (I));

   function Sum (L : List) return Integer with
     Pre  => L'Length > 0,
     Post => Sum'Result = Sum_Acc (L) (L'Last);

end Sum;
sum.adb
package body Sum with SPARK_Mode is

   -------------
   -- Sum_Acc --
   -------------

   function Sum_Acc (L : List) return Partial_Sums is
      PS : Partial_Sums (L'Range) := (others => 0);
   begin
      PS (L'First) := L (L'First);

      for Index in L'First + 1 .. L'Last loop

         -- Head equal.
         pragma Loop_Invariant
           (PS (L'First) = L (L'First));

         -- Tail equal.
         pragma Loop_Invariant
           (for all I in L'First + 1 .. Index - 1 =>
              PS (I) = PS (I - 1) + L (I));

         -- NOTE: The loop invariant below holds only when the range of
         -- "Int" is symmetric, i.e. -Int'First = Int'Last. If not, then
         -- this loop invariant may have to be adjusted.

         -- Result within bounds.
         pragma Loop_Invariant
           (for all I in L'First .. Index - 1 =>
              PS (I) in (I - L'First + 1) * Int'First ..
                        (I - L'First + 1) * Int'Last);

         PS (Index) := PS (Index - 1) + L (Index);

      end loop;

      return PS;
   end Sum_Acc;

   ---------
   -- Sum --
   ---------

   function Sum (L : List) return Integer is
      Result : Integer := L (L'First);
   begin
      for I in L'First + 1 .. L'Last loop
         pragma Loop_Invariant
           (Result = Sum_Acc (L) (I - 1));
         Result := Result + L (I);
      end loop;

      return Result;
   end Sum;

end Sum;
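Applying the same ghost-function pattern to the question's original Array_Total is mostly a matter of renaming. A hedged sketch of the spec (untested; the element subtype Small_Positive and the length bound are assumptions introduced here to keep the partial sums within Integer, since unbounded Positive elements could overflow — adjust the bounds to your own invariants, and expect to need the same range-bound conjunct and loop invariants as in the Sum example above for the proof to go through):

```ada
--  Sketch only: adapts the ghost-function pattern to the question's
--  Array_Total. "Small_Positive", "Max_Length" and "Total_Acc" are
--  names invented for this sketch, not part of the answer above.
package Totals with SPARK_Mode is

   Max_Length : constant := 1000;

   subtype Small_Positive is Positive range 1 .. 1000;  -- assumption
   type Positive_Array is array (Positive range <>) of Small_Positive;

   type Partial_Totals is array (Positive range <>) of Natural;

   --  Ghost function computing the running totals of A.
   function Total_Acc (A : Positive_Array) return Partial_Totals with
     Ghost,
     Pre  => A'Length in 1 .. Max_Length and A'Last < Positive'Last,
     Post => Total_Acc'Result'Length = A'Length
       and then Total_Acc'Result'First = A'First
       and then (for all I in A'First .. A'Last =>
                   Total_Acc'Result (I) in
                     1 .. (I - A'First + 1) * Small_Positive'Last)
       and then Total_Acc'Result (A'First) = A (A'First)
       and then (for all I in A'First + 1 .. A'Last =>
                   Total_Acc'Result (I) =
                     Total_Acc'Result (I - 1) + A (I));

   --  The answer to "What goes here?": the last partial total.
   function Array_Total (The_Array : Positive_Array) return Positive with
     Pre  => The_Array'Length in 1 .. Max_Length
               and The_Array'Last < Positive'Last,
     Post => Array_Total'Result = Total_Acc (The_Array) (The_Array'Last);

end Totals;
```

The bodies would mirror Sum_Acc and Sum above, with the same three loop invariants (head equal, tail equal, result within bounds) restated in terms of Total_Acc.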
Regarding "ada - Spark-Ada postcondition for the total of an array", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/44808451/