portability: `<' and `>' are not always defined on addresses.

Specifically, don't sort objects by their memory addresses when
they're not allocated in the same array or other object.  Though
I haven't found a test case where that fails on my platform, C
says the behavior is undefined.
* src/AnnotationList.c (AnnotationList__insertInto): Remove
FIXME.  Use new id field of InadequacyList nodes rather than
their memory addresses when sorting.
(AnnotationList__compute_from_inadequacies): Add
inadequacy_list_node_count argument to pass to
InadequacyList__new_conflict.
* src/AnnotationList.h
(AnnotationList__compute_from_inadequacies): Update prototype
and documentation for new argument.
* src/InadequacyList.c (InadequacyList__new_conflict): Add
node_count argument and use it to assign a unique ID.
* src/InadequacyList.h (InadequacyListNodeCount): New typedef.
(InadequacyList): Add id field.
(InadequacyList__new_conflict): Update prototype and
documentation for new argument.
* src/ielr.c (ielr_compute_annotation_lists): Update
AnnotationList__compute_from_inadequacies invocation.
Joel E. Denny
2009-12-29 12:43:26 -05:00
parent abcc7c03cc
commit 2728ac7ecd
6 changed files with 96 additions and 46 deletions


@@ -77,12 +77,11 @@ AnnotationList__isContributionAlways (AnnotationList const *self,
  *   - Otherwise, \c list now contains the node \c self, \c result is true, and
  *     \c list assumes responsibility for the memory of \c self.
  *   - The sort in \c list is:
- *     - Sort in reverse order on memory address of the associated inadequacy
- *       node.  Because memory is usually allocated in ascending order (FIXME:
- *       Is this true enough?  Should we keep some sort of global index to
- *       guarantee it?), this should mean that the insertion position within an
- *       annotation list is usually near the beginning with other annotations
- *       associated with the same inadequacy.
+ *     - Sort in reverse order on the unique ID of the associated
+ *       inadequacy node.  Because these IDs are assigned in ascending
+ *       order, this should mean that the insertion position within an
+ *       annotation list is usually near the beginning with other
+ *       annotations associated with the same inadequacy.
  *     - Next, sort on the first contribution that is different as follows:
  *       - Sort an always-contribution before a never-contribution before a
  *         potential-contribution.
@@ -104,9 +103,9 @@ AnnotationList__insertInto (AnnotationList *self, AnnotationList **list,
     {
       int cmp = 0;
       ContributionIndex ci;
-      if (self->inadequacyNode < (*node)->inadequacyNode)
+      if (self->inadequacyNode->id < (*node)->inadequacyNode->id)
         cmp = 1;
-      else if ((*node)->inadequacyNode < self->inadequacyNode)
+      else if ((*node)->inadequacyNode->id < self->inadequacyNode->id)
         cmp = -1;
       else
         for (ci = 0;
@@ -408,18 +407,14 @@ AnnotationList__computePredecessorAnnotations (AnnotationList *self, state *s,
 }
 
 void
-AnnotationList__compute_from_inadequacies (state *s,
-                                           bitsetv follow_kernel_items,
-                                           bitsetv always_follows,
-                                           state ***predecessors,
-                                           bitset **item_lookahead_sets,
-                                           InadequacyList **inadequacy_lists,
-                                           AnnotationList **annotation_lists,
-                                           AnnotationIndex *annotation_counts,
-                                           ContributionIndex
-                                             *max_contributionsp,
-                                           struct obstack
-                                             *annotations_obstackp)
+AnnotationList__compute_from_inadequacies (
+  state *s, bitsetv follow_kernel_items, bitsetv always_follows,
+  state ***predecessors, bitset **item_lookahead_sets,
+  InadequacyList **inadequacy_lists, AnnotationList **annotation_lists,
+  AnnotationIndex *annotation_counts,
+  ContributionIndex *max_contributionsp,
+  struct obstack *annotations_obstackp,
+  InadequacyListNodeCount *inadequacy_list_node_count)
 {
   bitsetv all_lookaheads;
   bitset shift_tokens;
@@ -530,8 +525,9 @@ AnnotationList__compute_from_inadequacies (state *s,
         }
       {
         InadequacyList *conflict_node =
-          InadequacyList__new_conflict (s, symbols[conflicted_token],
-                                        actions);
+          InadequacyList__new_conflict (
+            s, symbols[conflicted_token], actions,
+            inadequacy_list_node_count);
         actions = NULL;
         annotation_node->inadequacyNode = conflict_node;
         if (ContributionIndex__none