An equivalence class is a conceptual category that contains terms with no common attributes, such as a picture of an object (A), its name written in English (B) and in Arabic (C), and its name spoken in English (D). To create such a class, some pairs of terms are linked by training (DC, CB, and BA, for example), while others, such as AD and DA, become related without training because they are linked by prior training to common intermediary stimuli, in this case B and C. AD and DA are called derived relations, of which there are two types: transitive and equivalence relations. B and C are called nodes, which are stimuli linked by training to at least two other stimuli. Control of behavior by derived relations demonstrates the existence of an equivalence class. Class size governs the maximum number of derived relations in a class; there are (N - 1)(N - 1) logically defined derived relations in a class of N stimuli. Recently, we identified three structural parameters of equivalence classes: the number of nodal stimuli in a class, the distribution of single stimuli among the nodes (single stimuli are linked by training to only one other stimulus), and the directionality of stimulus linkages. In addition, we developed algorithms that produce all specific variations of each of these parameters. The effects of class size and of each structural parameter on the derived relations that actually acquire control of behavior are unknown. Using arbitrary matching-to-sample procedures in a mixed experimental design, we will study the effects of class size and of nodality on the number of derived relations that actually gain control of responding. Theoretically, the strength of derived relations should be inversely related to the number of nodes that mediate them. Further, equivalence relations should exert less control than transitive relations. Stimulus control by these derived relations will be measured using conditional probabilities of responding and reaction times.
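The (N - 1)(N - 1) count can be checked with a minimal sketch (illustrative only; the function name and data layout are not from the proposal). Given N - 1 trained pairs that link all N stimuli into one class, every remaining ordered pair of distinct stimuli is a derived relation, giving N(N - 1) - (N - 1) = (N - 1)(N - 1); note that this count includes the symmetry relations of the trained pairs as well as the transitive and equivalence relations discussed in the text.

```python
from itertools import permutations

def derived_relations(trained):
    """Return the derived relations of an equivalence class.

    `trained` is a list of directed stimulus pairs established by
    training, e.g. [("D", "C"), ("C", "B"), ("B", "A")].  Assuming the
    trained pairs join all stimuli into a single class, every ordered
    pair of distinct class members other than the trained pairs is a
    derived relation (symmetric, transitive, or equivalence).
    """
    stimuli = {s for pair in trained for s in pair}
    all_pairs = set(permutations(stimuli, 2))  # ordered, non-reflexive
    return all_pairs - set(trained)

# The four-member class from the text: D->C, C->B, B->A trained.
trained = [("D", "C"), ("C", "B"), ("B", "A")]
derived = derived_relations(trained)
N = 4
assert len(derived) == (N - 1) * (N - 1)  # 9 derived relations
assert ("A", "D") in derived and ("D", "A") in derived
```

For the four-member example, the nine derived relations comprise three symmetry pairs (CD, BC, AB), three transitive relations (DB, DA, CA), and three equivalence relations (BD, AD, AC).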
The results will be related to intelligence, the production of insights, transitive inference, cross-classification, and semantic memory networks.