C99 6.5 says:
Between the previous and next sequence point an object shall have its stored value modified at most once by the evaluation of an expression. Furthermore, the prior value shall be read only to determine the value to be stored.
What does "Furthermore, the prior value shall be read only to determine the value to be stored" mean? In C99, why is a[i++] = 1 undefined behavior?
a[i++] = 1 is defined (unless it is undefined for reasons other than the sequencing of side effects: an out-of-bounds access, or an uninitialized i). You mean a[i++] = i, which is undefined behavior because it reads i between the same sequence points as i++, which changes it.
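For concreteness, here is a minimal sketch of the two statements (the array size and the initial value of i are my own assumptions, not part of the question):

#include <stdio.h>

int main(void)
{
    int a[4] = {0};
    int i = 0;

    /* Defined: i is modified only once between sequence points, and the
       value 1 being stored into a[0] does not involve reading i at all. */
    a[i++] = 1;

    /* Undefined: the i on the right-hand side is read, but not to determine
       the value that i++ stores into i, and there is no sequence point
       between that read and the modification of i. */
    /* a[i++] = i; */

    printf("i = %d, a[0] = %d\n", i, a[0]);   /* prints: i = 1, a[0] = 1 */
    return 0;
}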
The “Furthermore, the prior value shall be read only to determine the value to be stored” part means that i = i + 1; is allowed, although it reads from i and modifies i. On the other hand, a[i] = (i=1); isn't allowed, because despite writing to i only once, the read from i is not for computing the value being stored.
The "prior value shall be read only to determine the value to be stored" wording is admittedly counterintuitive; why should the purpose for which a value is read matter?
The point of that sentence is to impose a requirement about which results depend on which operations.
I'll steal examples from Pascal's answer.
This:
i = i + 1;
is perfectly fine. i is read and written in the same expression, with no intervening sequence point, but it's ok because the write cannot occur until after the read has completed. The value to be stored cannot be computed until the expression i + 1, and its subexpression i, have been completely evaluated. (And i + 1 has no side effects that might be delayed until after the write.) That dependency imposes a strict ordering: the read must be completed before the write can begin.
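A minimal sketch of that dependency, using a temporary of my own invention to make the ordering explicit (the standard does not require an actual temporary, of course):

#include <stdio.h>

int main(void)
{
    int i = 41;              /* initial value chosen only for the example */

    /* i = i + 1; behaves as if the read feeds the write: */
    int tmp = i + 1;         /* the read of i is complete here */
    i = tmp;                 /* the write of i can only happen afterwards */

    printf("i = %d\n", i);   /* prints: i = 42 */
    return 0;
}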
On the other hand, this:
a[i] = (i=1);
has undefined behavior. The subexpression a[i] reads the value of i, and the subexpression i=1 writes the value of i. But the value to be stored in i by the write does not depend on the evaluation that reads i on the left hand side, so the ordering of the read and the write is not defined. The "value to be stored" is 1; the read of i in a[i] does not determine that value.
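To see why that matters, here is a sketch that spells out, with explicit sequence points, the two orderings a compiler might otherwise pick; each version is well defined on its own, but they write to different elements (array sizes and initial values are my own assumptions):

#include <stdio.h>

int main(void)
{
    int a[4] = {0};
    int b[4] = {0};
    int i;

    /* Ordering 1: a[i] is evaluated with the old i, then i = 1 happens. */
    i = 0;
    a[i] = 1;      /* writes a[0] */
    i = 1;

    /* Ordering 2: i = 1 happens first, then the array element is chosen. */
    i = 0;
    i = 1;
    b[i] = 1;      /* writes b[1] */

    printf("a[0] = %d, a[1] = %d\n", a[0], a[1]);   /* a[0] = 1, a[1] = 0 */
    printf("b[0] = %d, b[1] = %d\n", b[0], b[1]);   /* b[0] = 0, b[1] = 1 */
    return 0;
}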
I suspect this confusion is why the 2011 revision of the ISO C standard (available in draft form as N1570) re-worded that section. The standard still has the concept of sequence points, but 6.5p2 now says:
If a side effect on a scalar object is unsequenced relative to either a different side effect on the same scalar object or a value computation using the value of the same scalar object, the behavior is undefined. If there are multiple allowable orderings of the subexpressions of an expression, the behavior is undefined if such an unsequenced side effect occurs in any of the orderings.
And paragraph 1 states explicitly what was only implicitly assumed in C99:
The value computations of the operands of an operator are sequenced before the value computation of the result of the operator.
Section 5.1.2.3 paragraph 2 explains the sequenced before and sequenced after relationships.
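Applying the N1570 wording back to the corrected example from the question: in a[i++] = i, the side effect of i++ on i is unsequenced relative to the value computation that reads i on the right-hand side, so it is undefined under C11 just as under C99. A sketch of one well-defined rewrite, assuming the intent was to store the old value of i:

#include <stdio.h>

int main(void)
{
    int a[4] = {0};
    int i = 0;

    /* a[i++] = i;   undefined in both C99 and C11: the modification of i
                     is unsequenced relative to the read of i on the right. */

    /* One well-defined rewrite (assuming "store the old i" was the intent): */
    a[i] = i;       /* writes a[0] = 0 */
    i++;            /* a sequence point separates this from the line above */

    printf("i = %d, a[0] = %d\n", i, a[0]);   /* prints: i = 1, a[0] = 0 */
    return 0;
}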