Division of Research
School of Business Administration
January 1989

ASSESSING MAIN EFFECTS IN INTERACTIVE REGRESSION MODELS

Working Paper #595

Youjae Yi
The University of Michigan

FOR DISCUSSION PURPOSES ONLY

None of this material is to be quoted or reproduced without the expressed permission of the Division of Research.

Copyright 1989
University of Michigan
School of Business Administration
Ann Arbor, Michigan 48109

Author's Note

Youjae Yi is Assistant Professor of Marketing, School of Business Administration, The University of Michigan, Ann Arbor, MI 48109-1234. The author wishes to thank Richard P. Bagozzi and Claes Fornell for their helpful comments on an earlier version of this paper.

ASSESSING MAIN EFFECTS IN INTERACTIVE REGRESSION MODELS

Abstract

This paper investigates the problems in assessing main effects of interval-scaled variables in interactive regression models. Inclusion of an interaction term induces the following problems in assessing main effects: (1) conclusions can change dramatically from "strong" to "no" main effects under allowable scale transformations, (2) severe multicollinearity is introduced, and (3) the interpretation of main effects is ambiguous. It is shown how a mean-centering procedure reduces these problems, and these issues are illustrated with an empirical example.

INTRODUCTION

Marketing theories often imply that independent variables have interactive as well as main effects on a certain dependent variable (e.g., Laroche and Howard 1980; Punj and Stewart 1983). If the independent variables (say, X1 and X2) and the dependent variable (Y) are measured on numeric scales, it is common to investigate such effects by including the product term (X1X2) in a multiple regression (Lilien and Kotler 1983; Hornik 1984; Louviere and Woodworth 1983). That is, the following equation, which will be called a "raw-score" model, is used:1

Y = B0 + B1X1 + B2X2 + B3X1X2 + e     (1)

where B0 is an intercept, the Bi's are regression coefficients, and e is the residual. The use of the product term represents a generalization of the concept of interaction in analysis of variance. Since the effect of a marginal change in X1 depends on the level of X2 (∂Y/∂X1 = B1 + B3X2), X2 can be seen as a moderator variable (cf. Sharma, Durand, and Gur-Arie 1981). The regression coefficients (Bi's) are tested for significance to examine the effects of the independent variables. If B1 (B2) differs significantly from zero, it is concluded that X1 (X2) has main effects on Y. If B3 is significant, the interaction effect is said to exist.

This seemingly straightforward practice, however, has certain problems when the independent variables are measured on interval scales rather than ratio scales. Researchers can draw different conclusions about the effects of the Xi's from the same data with changes in the origins of the scales, which are allowable for interval scales; for example,

1 The product term X1X2 is only one of many possible forms of interaction; for example, Y = B0·X1^B1·X2^B2 is another form of interaction (cf. Southwood 1978). The concern of this paper is, however, the regression model with a product term such as equation (1). See Cooper and Nakanishi (1983) for discussion of the issues in another popular interactive model, the attraction model.

conclusions can differ widely from "no main effects" to "strong main effects" depending on what origins are used for the scales. The variance of the regression coefficient estimates becomes large, and one may fail to find significant effects of important predictors. Further, the main effects are difficult to interpret. These problems are highly relevant to marketing research, where many measurements possess, at best, interval properties.

The purpose of this paper is to highlight the issues, often ignored in practice, in assessing main effects of interval-scaled variables in multiplicative regression models. Several marketing researchers have investigated the problems in using the multiplicative regression model for evaluating interaction effects (Bagozzi 1984; Holbrook 1977; Schmidt and Wilson 1975). However, little research has been done in marketing on the issues in assessing the main effects. The present paper integrates and puts into perspective such issues, which have appeared in disparate literatures (e.g., Althauser 1971; Lane 1981; Finney et al. 1984; Overall et al. 1981). After the problems in testing main effects are discussed, it is demonstrated how one well-known procedure, mean-centering, solves these problems. These issues are then illustrated with an empirical example with actual data.

PROBLEMS IN ASSESSING MAIN AND INTERACTION EFFECTS

This section examines the issues in assessing the effects of interval-scale variables in a multiple regression analysis with a product term. Problems are discussed with respect to: (1) scale dependence, (2) multicollinearity, and (3) substantive interpretations.

Scale Dependence

The origin of an interval scale is arbitrary; that is, linear transformation through addition or subtraction of constants is an operation that preserves the properties of interval-level measurements. For example, in marketing studies of expectancy-value models using

seven-point interval scales, the items have been scored either from 1 to 7 (unipolar scales) or from -3 to +3 (bipolar scales) (Bagozzi 1984; Holbrook 1977; Ryan and Bonfield 1975). Since such transformation is theoretically possible, it should not affect conclusions as to the effects of variables. Conclusions about the interaction effect are indeed invariant under such transformation. However, conclusions about the main effects can differ widely from "no main effects" to "strong main effects," depending on what origins are used for the scales.

These properties can be demonstrated as follows. Let us define new variables X*1 and X*2 by changing the origins of X1 and X2; that is, X*1 = X1 - c, X*2 = X2 - d, where c and d are arbitrary constants. Solving for X1 and X2 and substituting the results into equation (1), we get

Y = (B0 + cB1 + dB2 + cdB3) + (B1 + dB3)X*1 + (B2 + cB3)X*2 + B3X*1X*2 + e.     (2)

What are the effects of changing the origins of X1 and X2? We can note that the coefficients for the rescaled Xi's (the X*i's) are affected by the changes in origins (c or d). First, the regression coefficient for X*1 varies with changes in the origin of X2 (d). An implication is that it should be possible, by an appropriate change in the origin of X2, to force the coefficient for X*1 to zero. If we let d = -B1/B3, the main effect of X*1 becomes zero. That is, when the origin of X2 is changed by B1/B3 [X*2 = X2 + B1/B3], the main effect of X1 disappears. A symmetrical relationship holds for X*2; the regression coefficient for X*2 (B2 + cB3) is influenced by the change in the origin of X1 (c). Conclusions as to the main effect of X*2 can therefore vary with the coding scheme for X1. The standardized coefficients and the ratios of the coefficients to their standard errors also change (Southwood 1978). In sum, changes in the origin of one variable affect the regression coefficient of the other variable and thereby conclusions about its main effect.
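The algebra in equation (2) is easy to check numerically. The sketch below (Python with simulated data; the population coefficients and the shift constant d are arbitrary choices, not from the paper) fits the raw-score model (1) by ordinary least squares, shifts the origin of X2 by d, and confirms that the interaction coefficient is invariant while the new coefficient for X1 equals B1 + dB3:

```python
import numpy as np

def fit_raw_score(X1, X2, Y):
    """OLS estimates for Y = B0 + B1*X1 + B2*X2 + B3*X1*X2 + e."""
    X = np.column_stack([np.ones(len(Y)), X1, X2, X1 * X2])
    return np.linalg.lstsq(X, Y, rcond=None)[0]

rng = np.random.default_rng(1)
n = 500
X1 = rng.uniform(1, 7, n)                  # e.g., a 7-point scale
X2 = rng.uniform(1, 11, n)                 # e.g., an 11-point scale
Y = 2.0 + 1.0 * X1 + 0.5 * X2 + 0.4 * X1 * X2 + rng.normal(0.0, 1.0, n)

d = 2.5                                    # arbitrary change of origin
b = fit_raw_score(X1, X2, Y)               # original scales
b_shift = fit_raw_score(X1, X2 - d, Y)     # X*2 = X2 - d

print(b[3], b_shift[3])                    # interaction coefficient: invariant
print(b[1] + d * b[3], b_shift[1])         # new X1 coefficient: B1 + d*B3
```

Because the two fits are reparameterizations of the same model, these identities hold to machine precision, not merely approximately.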

The sensitivity of the main effects to scale changes, which is frequently overlooked in practice (e.g., Laroche and Howard 1980; Hornik 1984), poses serious threats to a regression analysis of additive and multiplicative effects. When an independent variable (say, X1) is measured on an interval rather than a ratio scale, no theoretical meaning can be attached to the (partial) regression coefficient for the other variable (X2). By adding or subtracting a constant from the X1 scale, as can be done without loss of generality when the scale is measured at the interval level, it is possible to make the main effect of X2 significantly positive, significantly negative, or insignificant. An analytic tool is needed that can provide conclusions invariant under such transformation.

Note that the coefficient for the product term (X1X2) is invariant under such transformation (see equations (1) and (2)). Although zero-order correlations involving the product variables are scale dependent (Schmidt and Wilson 1975), the partial regression coefficients for them and the corresponding standard errors are scale invariant; therefore, conclusions as to the interaction do not vary with changes in the origins of the scales (Allison 1977). Also, since equations (1) and (2) are mathematically equivalent, the predictive accuracy of the two equations is the same (Cohen 1978). Since the conclusions as to the interaction effects and predictive accuracy are invariant with changes in origins, nothing more will be said about them herein. This paper will focus on the estimation and interpretation of the main effects, which are affected by scale transformation.

Multicollinearity (MC)

The X1X2 term is usually highly correlated with its constituents, X1 and X2 (see Althauser 1971 for exact forms of the correlations between the Xi's and the X1X2 term).
The size of these correlations depends on the means, standard deviations, and correlations of X1 and X2, but very large correlations are often found in practice (Althauser 1971; Tate 1984).
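Althauser's point that these correlations depend on the means can be illustrated with a short simulation. For independent predictors, Cov(X1, X1X2) equals the mean of X2 times the variance of X1, so rescoring X2 to have a mean near zero drives the correlation toward zero. This is a sketch under assumed uniform scale data, not a result from the paper:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1000
X1 = rng.integers(1, 8, n).astype(float)    # 7-point item scored 1..7
X2 = rng.integers(1, 12, n).astype(float)   # 11-point item scored 1..11

# Correlation of X1 with the product term under two scorings of X2.
r_unipolar = np.corrcoef(X1, X1 * X2)[0, 1]      # X2 scored 1..11
X2b = X2 - 6.0                                    # rescored -5..+5
r_bipolar = np.corrcoef(X1, X1 * X2b)[0, 1]
print(round(r_unipolar, 2), round(r_bipolar, 2))  # large vs. near zero
```

The same allowable rescoring that changed nothing substantive about the measurement thus changes the predictor intercorrelations dramatically.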

The correlation among the independent variables, known as multicollinearity, poses well-known problems for the estimation and interpretation of regression coefficients (e.g., Belsley et al. 1980; Johnston 1984). Specifically, standard errors of regression coefficient estimates become very large, and the t-statistics for the coefficients can be quite low. As a result, the individual regression coefficients may not be statistically significant, even though a definite statistical relationship exists between the dependent variable and the set of independent variables. Another consequence is that estimates and their standard errors become quite unstable. In sum, conclusions from the interactive regression analysis can be quite imprecise and unstable.

Interpretations of Main Effects

In an analysis of equation (1), to assess the main effect of a variable (say, X1), the corresponding coefficient (B1) is examined. How can we interpret the coefficient? The answer can be found by looking at the marginal effect of the variable. For the marginal effect of X1, we can take the partial derivative of the function with respect to X1: ∂Y/∂X1 = B1 + B3X2. Note that B1 is the effect of X1 on Y when the other variables (here, X2) are zero. A conceptual assumption underlying this approach is that a main effect refers to the effect of a variable in the absence of all other variables. This situation is, however, practically quite unlikely; it is difficult to imagine situations where people are completely free from the influences of other variables. To the extent that such a questionable assumption underlies this approach, the method is of limited applicability. Ideally, the effect of a variable should be understood in relation to other variables in an interactive model, but an analysis of equation (1) lacks such properties.

ALTERNATIVE METHODS FOR ASSESSING INTERACTION EFFECTS

Some alternative procedures are available for assessing whether the relationship between two variables (e.g., X1 and Y) varies over levels of a third variable (e.g., X2). Two methods reviewed here are: (1) subgroup analysis and (2) orthogonal centering.2

Subgroup Analysis

One may examine interaction effects by probing directly for differential relationships between Y and X1 among individuals grouped on the basis of a moderator variable (X2). In this approach, the sample is split into subgroups, and each subgroup is analyzed separately. The subgroup analysis, however, has several drawbacks. First, there is a loss of statistical power and measurement information when scores on a continuous variable (e.g., X2) are artificially collapsed into two or three categories (Cohen and Cohen 1983). Second, grouping can often produce "mystifying descriptive results" in that different cutting scores used to form subgroups can sometimes radically alter observed relationships (Cronbach and Snow 1977).

Orthogonal Centering

Smith and Sasaki (1979) have proposed that one should center the variables around the values (f and g) that make the product term orthogonal (uncorrelated) to its constituents. Orthogonal centering uses the following equation, which will be called an "orthogonally centered" model:

Y = b0 + b1X1 + b2X2 + b3(X1 - f)(X2 - g) + e     (3)

2 Several estimation procedures have been suggested to reduce multicollinearity, e.g., Ridge, Principal Components, Latent Root, Stein, and Equity estimators. But they are biased, and the usual t-tests are not applicable to these estimates. Further, the main effects are difficult to interpret. Thus, they are not examined in this paper (cf. Krishnamurthi and Rangaswamy 1987).

where the centering constants (f and g) that make the predictors orthogonal are calculated by

f = (S11S23 - S12S13)/(S11S22 - S12²), and g = (S13S22 - S12S23)/(S11S22 - S12²),

where Sij refers to the sample covariance between Xi and Xj, Sii is the sample variance of Xi, and X3 indicates X1X2. This approach reduces multicollinearity by eliminating the correlations between the interaction term and the X1 and X2 terms. However, estimates of the bi's are still scale dependent in that they vary with changes in the origins of X1 and X2. The main effects are difficult to interpret as well; they are the effects of one variable when the other variable takes on the value that eliminates the correlation between the variable and the product term. This approach is also cumbersome in that it requires somewhat lengthy calculation on the part of the user.

A SUGGESTED METHOD: MEAN CENTERING

It is proposed in this section that deviation scores from the means be used when a multiple regression model includes product terms. This approach, often called "mean-centering," consists of transforming the X1 and X2 scores to deviations from their means and then forming the product term from these deviations. This mean-centering approach implies the following equation, which will be called a "mean-centered" model:

Y = a0 + a1x1 + a2x2 + a3x1x2 + e     (4)

where x1 = X1 - X̄1 and x2 = X2 - X̄2. Next, let us examine how this approach resolves the three issues in using the multiplicative model: (1) scale dependence, (2) multicollinearity, and (3) substantive interpretation.
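As a concrete sketch of equation (4) (Python, simulated data; the population coefficients are arbitrary), the mean-centered model is the same regression after a simple preprocessing step: its coefficients are unchanged when the raw scales are shifted, and its interaction coefficient matches the raw-score model's.

```python
import numpy as np

def fit(X1, X2, Y):
    """OLS for an intercept, two predictors, and their product."""
    X = np.column_stack([np.ones(len(Y)), X1, X2, X1 * X2])
    return np.linalg.lstsq(X, Y, rcond=None)[0]

def fit_mean_centered(X1, X2, Y):
    """Mean-centered model (4): center first, then form the product."""
    return fit(X1 - X1.mean(), X2 - X2.mean(), Y)

rng = np.random.default_rng(3)
n = 400
X1 = rng.uniform(1, 7, n)
X2 = rng.uniform(1, 11, n)
Y = 2.0 + 1.0 * X1 + 0.5 * X2 + 0.4 * X1 * X2 + rng.normal(0.0, 1.0, n)

a = fit_mean_centered(X1, X2, Y)
a_shift = fit_mean_centered(X1 - 3.0, X2 + 5.0, Y)  # arbitrary origin changes
b = fit(X1, X2, Y)                                  # raw-score model (1)

print(np.max(np.abs(a - a_shift)))  # ~0: coefficients are scale independent
print(b[3], a[3])                   # interaction coefficient is identical
```

Shifting an origin before centering leaves the deviation scores, and hence the entire design matrix, unchanged, which is why the invariance is exact rather than approximate.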

Scale Dependence. The mean-centered model (4) overcomes the scale-dependence problem encountered in an analysis of equation (1), the raw-score model. That is, the regression coefficients of equation (4) do not vary with changes in the origins of X1 and X2. Suppose the origins of the X1 and X2 scales are changed by subtracting constants from them: Z1 = X1 - c, Z2 = X2 - d. In such cases, the resulting deviation scores for equation (4) are as follows: z1 = Z1 - Z̄1 = (X1 - c) - (X̄1 - c) = X1 - X̄1 = x1, and similarly z2 = x2. That is, x1 and x2 of equation (4) remain unchanged after transformations of X1 and X2. As a result, the coefficients for the two independent variables are invariant with respect to changes in origins.

Multicollinearity. Centering around the means has often been proposed as a way to reduce the correlations between the individual Xi's and the cross-product term (X1X2) in interactive models, or those between the individual X's and the higher-order term (X1²) in polynomial models (for example, Y = B0 + B1X1 + B2X1² + e) (Cohen and Cohen 1983). It has been shown that expressing the independent variables as deviations from their means reduces the multicollinearity substantially and avoids the associated problems (Tate 1984).

Interpretation of Main Effects. This approach also helps researchers in interpreting the effects of X1 and X2. The main effect of a variable (say, X1) can be interpreted as the average effect of that variable across levels of the other variable (X2). If we take the partial derivative of equation (4) with respect to X1, we get ∂Y/∂X1 = a1 + a3x2 = a1 + a3(X2 - X̄2). It can be seen that a1 is the effect of X1 when the other variables (here, X2) are equal to their own means (X̄2). This interpretation of main effects is consistent with that suggested by some researchers. Several researchers have argued that it is valid to estimate main effects in the presence of an interaction effect, but that the interpretation of main effects is different from that when no interaction is present (Overall et al.
1981). In an additive model without

interaction effects (Y = B0 + B1X1 + B2X2), the main effect is a constant effect that is generalizable across all scores on the other variables. In an interactive model, however, the main effect varies with the level of the other variables. In such a case, the main effect should be interpreted as the average effect of a variable across all observed scores of the other variables, or equivalently the effect of the variable at the average score on the other variables (Cohen and Cohen 1983; Lane 1981). The proposed model yields estimates of such main effects, and the main effects are now interpretable in relation to other variables.

Additional Comments. The mean-centering approach has several additional advantages. First, since the estimates of the Bi's in equation (1) can be expressed as linear functions of the ai's in equation (4), one can easily calculate the Bi's indirectly through computations of the ai's. Second, since equations (1) and (4) are mathematically equivalent, they provide an identical fit to the data. Mean-centering yields the same R² as the current practice, while producing such desirable properties as scale independence, low multicollinearity, and a clear interpretation of main effects. Third, the proposed model is not confined to the two-variable case; it is easily generalizable to models with more than two variables.

There are, however, some situations where application of equation (1) to uncentered data may provide better insights. First, when the moderator variable (X2) is a dummy variable that defines subgroups, B3 in the raw-score model represents the difference in the within-group slopes, compared with a baseline group. Second, when X1 and X2 are indicator variables in experimental studies, it is easy to interpret the coefficient for the product term in raw scores; B3 is the same as the interaction effect in analysis of variance (Smith and Sasaki 1979).

AN EMPIRICAL EXAMPLE

In order to demonstrate the problems in analyzing interactive models, the effects of attitude toward an act (Aact) and subjective norm (SN) on behavioral intentions (BI) are examined with actual data. Ryan and Bonfield (1975) have questioned the assumption of the additivity of Aact and SN underlying the Fishbein model, and have suggested that Aact and SN might have interaction effects on BI. This claim is tested here with a regression model with a product term to illustrate the issues.

Measures of these variables were collected from 118 respondents in a study about automobile purchases. The dependent variable is subjects' intention (BI) to purchase the Hyundai Excel, an imported car. BI is measured by asking subjects to give the likelihood of purchasing the Hyundai Excel the next time they buy a car on an 11-point scale (1 to 11). The first independent variable is attitude (Aact) toward purchasing the car, measured on a good-bad 7-point scale (numbered from 1 to 7). The second independent variable is subjective norm (SN), measured on an 11-point scale (numbered from 1 to 11) using the standard "Most people important to me think I should/should not buy" item.3 A product of Aact and SN (Aact x SN) is included in the regression equation to assess the main (additive) and interactive effects.

The regression coefficients and their standard errors are reported in the first column of Table 1. Note that none of the coefficients is significant, suggesting the conclusion that Aact and SN have neither main nor interactive effects on BI, which is quite contradictory to the predictions of the Fishbein model and of Ryan and Bonfield (1975). Yet the regression as a whole is quite significant at the .001 level (R² = .44, F = 29.6). This suggests that there might be multicollinearity problems in the analysis, whereby individual estimates are insignificant due to inflated standard errors while the overall equation is quite significant.
3 Details on data collection procedures are available from the author upon request.

Table 1 here

The degree of multicollinearity is determined by examining the following measures, which are suggested for detecting multicollinearity: (1) extreme pairwise correlations between predictors, (2) a small determinant of the correlation matrix, det(R), (3) large variance inflation factors, VIF(j), which are the diagonal elements of the inverse of the correlation matrix, and (4) the condition number, k(X), which indicates the sensitivity of estimates to changes in the input data (Belsley et al. 1980; Johnston 1984; Mansfield and Helms 1982). In the first three columns and rows of Table 2 are the correlations among Aact, SN, and their product (Aact x SN). Note that one of the correlations is .92, suggesting a high degree of multicollinearity. Other measures show similar results: det(R) = .038, the largest VIF(j) = 20, and k(X) = 11.4.

Table 2 here

The issue of scale dependence is explored by changing the coding schemes for the independent variables. Three types of transformation are adopted in the analysis: (1) a change in the Aact scale, (2) a change in the SN scale, and (3) changes in both the Aact and SN scales. Columns 2-4 of Table 1 show the results from these transformations. First, the Aact scale is renumbered as a bipolar scale (-3 to +3). The regression coefficient for SN changes as a result of this transformation; it has changed from 1.30 to 1.73. Furthermore, the coefficient has become significant after the transformation of the Aact scale (p < .01). A symmetric result is observed for the change in the SN scale (i.e., -5 to +5); the coefficient for Aact changed from .74 to 1.38. The coefficient, which was insignificant in the baseline model, has become significant in the transformed model (p < .05). When both the Aact

and SN scales are changed, the coefficients for both variables change. The main effects of both Aact and SN, which were not significant in the baseline model, are now significant at the .05 level. In sum, conclusions as to main effects are drastically affected by changes in scales.4

These results imply that researchers can draw different conclusions about the main (additive) effects of attitude and subjective norm from the same data, depending on what coding schemes are used in the analysis. This fact is disturbing because there is no agreed-upon coding scheme for many interval-scale measures among marketing researchers. For example, in marketing studies using seven-point interval scales, the items have been scored either from 1 to 7 (unipolar scales) or from -3 to +3 (bipolar scales) (Bagozzi 1984; Holbrook 1977; Ryan and Bonfield 1975).

The proposed mean-centering is applied in order to demonstrate its effects on the problems; that is, the deviation scores from the means are used in assessing the effects of Aact and SN. The input correlation matrix, given in Table 2, suggests that mean centering has greatly reduced the pairwise correlations among the predictors; the correlations of the new product term with Aact and SN are much lower (.06 and .21) than those (.61 and .92) in the original model. Other indices also indicate a low degree of multicollinearity: det(R) = .858, the largest VIF(j) = 1.2, and k(X) = 9.9. The standard errors for the coefficients of Aact and SN have decreased drastically with mean centering (from .92 to .51 and from .79 to .23, respectively), which is expected with the reduction of multicollinearity (see Table 3). Overall, the multicollinearity is substantially reduced by mean centering.

4 We can note that the coefficient for the interaction effect is invariant under these transformations; it is .11 with a standard error of .17 under any transformation. We can also note that the R² of the model is identical for all models.
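The multicollinearity measures reported above can be computed directly from the predictor correlation matrix. The sketch below uses simulated data (not the paper's); the condition number here is the square root of the ratio of the extreme eigenvalues of R, one common variant of the Belsley et al. diagnostic, and the thresholds in the comments are conventional rules of thumb rather than values from the paper:

```python
import numpy as np

def collinearity_diagnostics(*predictors):
    """det(R), variance inflation factors VIF(j), and a condition
    number, all computed from the predictor correlation matrix R."""
    Z = np.column_stack(predictors)
    R = np.corrcoef(Z, rowvar=False)
    det_R = np.linalg.det(R)           # near 0 signals severe collinearity
    vif = np.diag(np.linalg.inv(R))    # diagonal of R^-1; VIF > 10 is a
                                       # common rule-of-thumb warning sign
    eig = np.linalg.eigvalsh(R)
    cond = np.sqrt(eig.max() / eig.min())
    return det_R, vif, cond

rng = np.random.default_rng(5)
X1 = rng.uniform(1, 7, 200)
X2 = rng.uniform(1, 11, 200)

d_raw, vif_raw, k_raw = collinearity_diagnostics(X1, X2, X1 * X2)
x1, x2 = X1 - X1.mean(), X2 - X2.mean()
d_ctr, vif_ctr, k_ctr = collinearity_diagnostics(x1, x2, x1 * x2)

print(round(d_raw, 3), round(d_ctr, 3))                  # det(R) rises
print(round(vif_raw.max(), 1), round(vif_ctr.max(), 1))  # largest VIF drops
```

On raw uniform scale data the diagnostics show the same qualitative pattern as in the empirical example: a small det(R) and a large VIF for the product term, both of which move toward their benign values after mean centering.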

Table 3 here

Next, the scale independence of conclusions in the mean-centered model is examined. Since the mean-deviation scores are invariant as to changes in the origins of the scales, the regression coefficients and the related standard errors remain the same under the three transformations. As a result, conclusions as to main effects as well as interaction effects are not affected by changes in the origins of the scales.

Orthogonal centering is also applied for comparison purposes. As we can see in the final row of Table 2, the correlations between the product term and its constituents are eliminated by this orthogonal centering. Measures of multicollinearity, however, are quite similar to those from mean centering: det(R) = .898, the largest VIF(j) = 1.1, and k(X) = 9.7. The regression results from orthogonal centering are summarized in the second column of Table 3. The coefficients and their standard errors are quite similar to those obtained under the mean-centered model. For example, the coefficient for Aact is 1.21 in the orthogonal model, while it is 1.20 in the mean-centered model.

In sum, mean-centering has provided satisfactory results in reducing multicollinearity. The additional reduction in multicollinearity from orthogonal centering is minimal for this example. It can be noted that the orthogonal constants f and g were quite close to the means of X1 and X2 for this example (f = 4.38, g = 4.39, X̄1 = 4.12, and X̄2 = 4.40), which is consistent with the prediction of Smith and Sasaki (1979). However, mean-centering would still be preferred because it provides the additional advantages of scale independence and ease of interpreting the main effects.
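For completeness, the Smith and Sasaki constants f and g can be computed from sample variances and covariances using the formulas given earlier. The sketch below (simulated, roughly symmetric data) verifies that the resulting product term is uncorrelated with its constituents and that f and g land near the sample means, as observed in the example:

```python
import numpy as np

def orthogonal_constants(X1, X2):
    """Centering constants (f, g) that make (X1-f)(X2-g) uncorrelated
    with X1 and X2 (Smith and Sasaki 1979)."""
    S = np.cov(np.column_stack([X1, X2, X1 * X2]), rowvar=False)
    S11, S22, S12 = S[0, 0], S[1, 1], S[0, 1]
    S13, S23 = S[0, 2], S[1, 2]
    denom = S11 * S22 - S12 ** 2
    f = (S11 * S23 - S12 * S13) / denom
    g = (S13 * S22 - S12 * S23) / denom
    return f, g

rng = np.random.default_rng(6)
X1 = rng.uniform(1, 7, 1000)
X2 = rng.uniform(1, 11, 1000)
f, g = orthogonal_constants(X1, X2)

# The orthogonally centered product is exactly uncorrelated with X1 and X2.
p = (X1 - f) * (X2 - g)
print(np.corrcoef(X1, p)[0, 1], np.corrcoef(X2, p)[0, 1])  # ~0 and ~0
print(round(f, 2), round(X1.mean(), 2), round(g, 2), round(X2.mean(), 2))
```

The zero correlations hold exactly (up to floating point) because f and g solve the two linear equations that set the sample covariances Cov(p, X1) and Cov(p, X2) to zero; for roughly symmetric predictors those solutions sit close to the sample means, which is why mean centering achieves nearly the same effect.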

CONCLUSION

We have examined the issues in assessing effects in a regression analysis with a product term. Much interest has centered on the appropriateness of this analysis with regard to the interaction effect (Bagozzi 1984; Holbrook 1977; Schmidt and Wilson 1975), but the issues in assessing main effects have not received much attention in marketing research. This study is an attempt to highlight such neglected issues regarding the main effects. The traditional analysis has problems with (1) scale dependence, (2) multicollinearity, and (3) substantive interpretation. It has been demonstrated that mean-centering, which has usually been proposed as a way to reduce multicollinearity, can reduce the other problems as well. In this approach, conclusions are invariant to changes in origin, they do not suffer from multicollinearity, and the estimates are clearly interpretable.

REFERENCES

Allison, Paul D. (1977), "Testing for Interaction in Multiple Regression," American Journal of Sociology, 83 (1), 144-153.

Althauser, Robert P. (1971), "Multicollinearity and Non-additive Models," in Causal Models in the Social Sciences, H. M. Blalock, Jr., ed., Chicago: Aldine-Atherton, 433-472.

Bagozzi, Richard P. (1984), "Expectancy-Value Attitude Models: An Analysis of Critical Measurement Issues," International Journal of Research in Marketing, 1, 295-310.

Belsley, David A., E. Kuh, and R. E. Welsch (1980), Regression Diagnostics: Identifying Influential Data and Sources of Collinearity, New York: John Wiley & Sons.

Cohen, Jacob (1978), "Partialed Products Are Interactions; Partialed Powers Are Curve Components," Psychological Bulletin, 85 (3), 858-866.

Cohen, Jacob and Patricia Cohen (1983), Applied Multiple Regression/Correlation Analysis for the Behavioral Sciences, 2nd ed., New York: John Wiley & Sons.

Cooper, Lee G. and Masao Nakanishi (1983), "Standardizing Variables in Multiplicative Choice Models," Journal of Consumer Research, 10 (June), 96-108.

Cronbach, Lee J. and Richard E. Snow (1977), Aptitudes and Instructional Methods: A Handbook for Research on Interactions, New York: Irvington.

Finney, John W., R. E. Mitchell, R. C. Cronkite, and R. H. Moos (1984), "Methodological Issues in Estimating Main and Interactive Effects: Examples from Coping/Social Support and Stress Field," Journal of Health and Social Behavior, 25 (March), 85-98.

Holbrook, Morris B. (1977), "Comparing Multiattribute Attitude Models by Optimal Scaling," Journal of Consumer Research, 4 (December), 165-171.

Hornik, Jacob (1984), "Subjective vs. Objective Time Measures: A Note on the Perception of Time in Consumer Research," Journal of Consumer Research, 11 (June), 615-618.

Johnston, John (1984), Econometric Methods, 3rd ed., New York: McGraw-Hill.

Lane, David M. (1981), "Testing Main Effects of Continuous Variables in Nonadditive Models," Multivariate Behavioral Research, 16 (October), 499-509.

Laroche, Michel and John A. Howard (1980), "Nonlinear Relations in a Complex Model of Buyer Behavior," Journal of Consumer Research, 6 (March), 377-388.

Lilien, Gary L. and Philip Kotler (1983), Marketing Decision Making: A Model-Building Approach, New York: Harper & Row.

Louviere, Jordan J. and George Woodworth (1983), "Design and Analysis of Simulated Consumer Choice or Allocation Experiments: An Approach Based on Aggregate Data," Journal of Marketing Research, 20 (November), 350-367.

Mansfield, Edward R. and Billy P. Helms (1982), "Detecting Multicollinearity," The American Statistician, 36 (3), 158-160.

Overall, John E., Dennis M. Lee, and Chris W. Hornick (1981), "Comparison of Two Strategies for Analysis of Variance in Nonorthogonal Designs," Psychological Bulletin, 90 (2), 367-375.

Punj, Girish N. and David W. Stewart (1983), "An Interaction Framework of Consumer Decision Making," Journal of Consumer Research, 10 (September), 181-196.

Ryan, Michael J. and E. H. Bonfield (1975), "The Fishbein Extended Model and Consumer Behavior," Journal of Consumer Research, 2 (September), 119-136.

Schmidt, Frank L. and Terry C. Wilson (1975), "Expectancy Value Models of Attitude Measurement: A Measurement Problem," Journal of Marketing Research, 12 (August), 366-368.

Sharma, Subhash, Richard M. Durand, and Oded Gur-Arie (1981), "Identification and Analysis of Moderator Variables," Journal of Marketing Research, 18 (August), 291-300.

Smith, Kent W. and M. S. Sasaki (1979), "Decreasing Multicollinearity: A Method for Models with Multiplicative Functions," Sociological Methods and Research, 8 (August), 35-56.

Southwood, Kenneth E. (1978), "Substantive Theory and Statistical Interaction: Five Models," American Journal of Sociology, 83 (5), 1154-1203.

Tate, Richard L. (1984), "Limitations of Centering for Interactive Models," Sociological Methods and Research, 13 (November), 251-271.

TABLE 1
THE EFFECTS OF SCALE CHANGES ON REGRESSION COEFFICIENTS

                      Baseline       Aact           SN             Both
                      Model(a)       Change(b)      Change(c)      Changes(d)
Attitude              .74 (.92)(e)   .74 (.92)      1.38* (.59)    1.38* (.59)
Subjective Norm       1.30 (.79)     1.73** (.23)   1.30 (.79)     1.73** (.23)
Interaction           .11 (.17)      .11 (.17)      .11 (.17)      .11 (.17)
(Intercept)           2.5            5.5            10.4           15.9
R2                    .44            .44            .44            .44

a Aact (1, 7) and SN (1, 11)
b Aact (-3, +3) and SN (1, 11)
c Aact (1, 7) and SN (-5, +5)
d Aact (-3, +3) and SN (-5, +5)
e Standard errors in parentheses.
* Statistically significant at the .05 level.
** Statistically significant at the .01 level.

TABLE 2
CORRELATIONS AMONG INDEPENDENT AND DEPENDENT VARIABLES

                        Y(a)    X1(b)   X2(c)
Y                       1.00
X1                      .36     1.00
X2                      .64     .32     1.00
X1X2                    .65     .61     .92
(X1-X̄1)(X2-X̄2)          .17     .06     .21
(X1-f)(X2-g)            .04     .00     .00

a Behavioral intention (BI)
b Attitude toward an act (Aact)
c Subjective norm (SN)

TABLE 3
COMPARISON OF MEAN CENTERING AND ORTHOGONAL CENTERING

                    Mean-Centered Model(a)     Orthogonal Model(b)
                    Coef.      S.E.            Coef.      S.E.
Attitude            1.20*      .51             1.21*      .51
Subjective Norm     1.74**     .23             1.77**     .22
Interaction         .11        .17             .11        .17
(Intercept)         13.2                       13.5
R2                  .44                        .44

a Centered at the means of Aact and SN (4.12 and 4.40).
b Centered at the values that make the interaction term orthogonal to Aact and SN (4.38, 4.39).
* Statistically significant at the .05 level.
** Statistically significant at the .01 level.