Consider the typical search algorithm:
    for (i = 0; i < MAX_ENTRIES; i++) {
        if (A[i].key == searchkey) goto FoundKey;
    }

It is well known that this sort of thing can be sped up with a hash table if MAX_ENTRIES is large. But what if MAX_ENTRIES is small? Is this loop optimal? The problem is that there are two branch conditions in an otherwise very tight loop: the bounds check and the key comparison. On modern processors, which are often deeply pipelined, unpredictable branching can be costly. If you are the owner of the array A, then I would suggest augmenting its size by one entry and using the extra slot as a sentinel, in the following way:

    DataBaseType A[MAX_ENTRIES + 1];    /* 1 extra for sentinel */
    ...
    A[MAX_ENTRIES].key = searchkey;     /* Set sentinel */
    i = -1;                             /* Prime for do-while */
    do {
        i++;
    } while (A[i].key != searchkey);
    if (i < MAX_ENTRIES) goto FoundKey;

The loop now has only one comparison per iteration, and using do-while avoids some compilers' tendency to insert extra jumps into for loops.