Chapter 15
Building Multiple Regression Models
LEARNING OBJECTIVES
This chapter presents the potential of multiple regression analysis as a tool in business decision making, along with its applications, thereby enabling you to:
1. Analyze and interpret nonlinear variables in multiple regression analysis.
2. Understand the role of qualitative variables and how to use them in multiple regression analysis.
3. Learn how to build and evaluate multiple regression models.
4. Learn how to detect influential observations in regression analysis.
CHAPTER TEACHING STRATEGY
In chapter 14, the groundwork was prepared for chapter 15 by presenting multiple regression models along with mechanisms for testing the strength of the models such as se, R2, t tests of the regression coefficients, and the residuals.
The early portion of the chapter is devoted to nonlinear regression models and search procedures. There are other exotic types of regression models that can be explored. It is hoped that by studying section 15.1, the student will be somewhat prepared to explore other nonlinear models on his/her own. Tukey's Ladder of Transformations can be useful in steering the research towards particular recoding schemes that will result in better fits for the data.
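Tukey's ladder can be demonstrated with a short computation. The sketch below is illustrative only: the data and the simple R-squared criterion are assumptions for the example, not material from the text. Each rung re-expresses y, and the rung giving the most linear relationship with x wins.

```python
# A minimal sketch of Tukey's ladder of transformations: try several
# power re-expressions of y and keep the one giving the best linear fit.
# The data below are hypothetical, chosen so that y grows roughly as x^2.
import numpy as np

def r_squared(x, y):
    """R^2 of the simple linear regression of y on x."""
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (slope * x + intercept)
    return 1 - resid.var() / y.var()

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 4.3, 9.2, 16.8, 26.1, 35.5])   # curved relationship

# A few rungs of the ladder applied to y: sqrt, identity, square, log
ladder = {"sqrt(y)": np.sqrt(y), "y": y, "y^2": y**2, "log(y)": np.log(y)}
best = max(ladder, key=lambda name: r_squared(x, ladder[name]))
print(best, round(r_squared(x, ladder[best]), 3))
```

Because y grows roughly as the square of x, moving down the ladder to sqrt(y) straightens the relationship, which is exactly the kind of recoding decision the ladder is meant to guide.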
Dummy or indicator variables can be useful in multiple regression analysis. Remember to emphasize that only one dummy variable is used to represent two categories (yes/no, male/female, union/nonunion, etc.). For c categories of a qualitative variable, only c - 1 indicator variables should be included in the multiple regression model.
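The c - 1 rule can be shown concretely. In this minimal sketch (the data and category names are hypothetical), a qualitative variable with c = 3 categories is coded with only c - 1 = 2 indicator columns; the omitted category serves as the reference level.

```python
# Hand-coding c - 1 indicator columns for a qualitative variable with
# c = 3 categories; "north" is the omitted (reference) category.
import numpy as np

region = ["north", "south", "west", "south", "north", "west"]
categories = ["north", "south", "west"]          # c = 3 categories
# Only c - 1 = 2 dummies: one column each for "south" and "west".
dummies = np.array([[1 if r == c else 0 for c in categories[1:]]
                    for r in region])
print(dummies.shape)   # six observations, c - 1 = 2 indicator columns
```

A "north" observation is coded (0, 0): including a third column for it would make the columns sum to the intercept column, which is the redundancy the c - 1 rule avoids.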
Several search procedures have been discussed in the chapter, including stepwise regression, forward selection, backward elimination, and all possible regressions. All possible regressions is presented mainly to demonstrate to the student the large number of possible models that can be examined. Most of the effort is spent on stepwise regression because of its common usage. Forward selection is presented as the same as stepwise regression except that forward selection does not go back at each new step and re-examine the variables already in the model. That is, with forward selection, once a variable is in the model, it stays in the model. Backward elimination begins with a "full" model of all predictors. Sometimes there may not be enough observations to justify such a model.
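The forward-selection logic described above can be sketched in a few lines. This is an illustrative loop on simulated data, not the output of any statistical package; the 0.01 entry threshold on R-squared is an assumption chosen for the example (real procedures typically use an F-to-enter or p-value criterion).

```python
# A minimal forward-selection sketch: at each step, add the candidate
# predictor that most increases R^2; once a variable is in, it stays in
# (unlike stepwise regression, which can later remove it).
import numpy as np

rng = np.random.default_rng(0)
n = 40
X = rng.normal(size=(n, 3))                       # three candidate predictors
y = 2.0 * X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.1, size=n)

def r2(cols):
    """R^2 of the least-squares fit of y on the chosen columns plus intercept."""
    A = np.column_stack([np.ones(n)] + [X[:, j] for j in cols])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    return 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

selected, remaining = [], [0, 1, 2]
while remaining:
    entrant = max(remaining, key=lambda j: r2(selected + [j]))
    if r2(selected + [entrant]) - r2(selected) < 0.01:   # entry threshold
        break
    selected.append(entrant)
    remaining.remove(entrant)
print(selected)
```

With this data, the strongest predictor enters first, the weaker real predictor enters second, and the irrelevant variable never enters, mirroring the "never enters the procedure" language used in the solutions below.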
CHAPTER OUTLINE
15.1 Nonlinear Models: Mathematical Transformation
Polynomial Regression
Tukey's Ladder of Transformations
Regression Models with Interaction
Model Transformation
15.2 Indicator (Dummy) Variables
15.3 Model-Building: Search Procedures
Search Procedures
All Possible Regressions
Stepwise Regression
Forward Selection
Backward Elimination
15.4 Multicollinearity
KEY TERMS
All Possible Regressions            Qualitative Variable
Backward Elimination                Search Procedures
Dummy Variable                      Stepwise Regression
Forward Selection                   Tukey's Four-Quadrant Approach
Indicator Variable                  Tukey's Ladder of Transformations
Multicollinearity                   Variance Inflation Factor
Nonlinear Regression Model
SOLUTIONS TO PROBLEMS IN CHAPTER 15
15.1 Simple Regression Model:
ŷ = -147.27 + 27.128 x
F = 229.67 with p = .000, se = 27.27, R2 = .97, adjusted R2 = .966, and
t = 15.15 with p = .000. This is a very strong simple regression model.
Quadratic Model (using both x and x²):
ŷ = -22.01 + 3.385 x + 0.9373 x²
F = 578.76 with p = .000, se = 12.3, R2 = .995, adjusted R2 = .993, for x:
t = 0.75 with p = .483, and for x²: t = 5.33 with p = .002. The quadratic model is also very strong, with an even higher R2 value. However, in this model only the x² term is a significant predictor.
15.2 The model is:
ŷ = b0·b1^x
Using logs: log y = log b0 + x log b1
The regression model is solved for in the computer using the values of x and the values of log y. The resulting regression equation is:
log y = 0.5797 + 0.82096 x
F = 68.83 with p = .000, se = 0.1261, R2 = .852, and adjusted R2 = .839. This model has relatively strong predictability.
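The log transformation used in 15.2 can be checked numerically. The sketch below uses exact, made-up exponential data (an illustrative assumption, not the problem's data): regressing log y on x recovers log b0 and log b1, which are then exponentiated back to the multiplicative form y = b0·b1^x.

```python
# Fitting the exponential model y = b0 * b1^x by regressing log10(y)
# on x, then back-transforming the intercept and slope.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = 4.0 * 1.5 ** x                       # exact exponential, for illustration
slope, intercept = np.polyfit(x, np.log10(y), 1)
b0, b1 = 10 ** intercept, 10 ** slope    # back-transform the coefficients
print(round(b0, 3), round(b1, 3))        # recovers 4.0 and 1.5
```

The same back-transformation applies to the fitted equation above: 10^0.5797 estimates b0 and 10^0.82096 estimates b1.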
15.3 Simple regression model:
ŷ = -1456.6 + 71.017 x
R2 = .928 and adjusted R2 = .910. t = 7.17 with p = .002.
Quadratic regression model:
ŷ = 1012 - 14.06 x + 0.6115 x²
R2 = .947 but adjusted R2 = .911. The t ratio for the x term is t = -0.17 with p = .876. The t ratio for the x² term is t = 1.03 with p = .377.
Neither predictor is significant in the quadratic model. Also, the adjusted R2 for this model is virtually identical to the simple regression model. The quadratic model adds virtually no predictability that the simple regression model does not already have. The scatter plot of the data follows:
[Minitab scatter plot: Ad Exp versus Eq & Sup Exp]
15.4 The model is:
ŷ = b0·b1^x
Using logs: log y = log b0 + x log b1
The regression model is solved for in the computer using the values of x and the values of log y where x is failures and y is liabilities. The resulting regression equation is:
log liabilities = 3.1256 + 0.012846 failures
F = 19.98 with p = .001, se = 0.2862, R2 = .666, and adjusted R2 = .633. This model has modest predictability.
15.5 The regression model is:
ŷ = -28.61 - 2.68 x1 + 18.25 x2 - 0.2135 x1² - 1.533 x2² + 1.226 x1*x2
F = 63.43 with p = .000, significant at α = .001
se = 4.669, R2 = .958, and adjusted R2 = .943
None of the t ratios for this model are significant: t(x1) = -0.25 with p = .805, t(x2) = 0.91 with p = .378, t(x1²) = -0.33 with p = .745, t(x2²) = -0.68 with p = .506, and t(x1*x2) = 0.52 with p = .613. This model has a high R2, yet none of the predictors are individually significant.
The same thing occurs when the interaction term is not in the model. None of the t tests are significant. The R2 remains high at .957 indicating
that the loss of the interaction term was insignificant.
15.6 The F value shows very strong overall significance with a p-value of .00000073. This is reinforced by the high R2 of .910 and adjusted R2 of .878. An examination of the t values reveals that only one of the regression coefficients is significant at α = .05, and that is the interaction term, with a p-value of .039. Thus, this model with both variables, the square of both variables, and the interaction term contains only one significant t test, and that is for interaction.
Without interaction, the R2 drops to .877 and adjusted R2 to .844. With the interaction term removed, both x2 and x2² are significant at α = .01.
15.7 The regression equation is:
ŷ = 13.619 - 0.01201 x1 + 2.998 x2
The overall F = 8.43 is significant at α = .01 (p = .009).
se = 1.245, R2 = .652, adjusted R2 = .575
The t ratio for the x1 variable is only t = -0.14 with p = .893. However, the t ratio for the dummy variable, x2, is t = 3.88 with p = .004. The indicator variable is the significant predictor in this regression model, which has some predictability (adjusted R2 = .575).
15.8 The qualitative variable has c = 4 categories, as shown by the c - 1 = 3 indicator variables among the predictors (x2, x3, x4).
The regression equation is:
ŷ = 7.909 + 0.581 x1 + 1.458 x2 - 5.881 x3 - 4.108 x4
Overall F = 13.54, p = .000, significant at α = .001
se = 1.733, R2 = .806, and adjusted R2 = .747
For the predictors, t = 0.56 with p = .585 for the x1 variable (not significant); t = 1.32 with p = .208 for the first indicator variable (x2), which is nonsignificant; t = -5.32 with p = .000 for x3, the second indicator variable, which is significant at α = .001; and t = -3.21 with p = .007 for the third indicator variable (x4), which is significant at α = .01. This model has strong predictability, and the only significant predictor variables are the two dummy variables, x3 and x4.
15.9 This regression model has relatively strong predictability as indicated by R2 = .795. Of the three predictor variables, only x1 and x2 have significant t ratios (using α = .05). x3 (a nonindicator variable) is not a significant predictor. x1, the indicator variable, plays a significant role in this model along with x2.
15.10 The regression model is:
ŷ = 41.225 + 1.081 x1 - 18.404 x2
F = 8.23 with p = .0017, which is significant at α = .01. se = 11.744,
R2 = .388 and the adjusted R2 = .341.
The t ratio for x2 (the dummy variable) is -4.05, which has an associated p-value of .0004 and is significant at α = .001. The t ratio of 0.80 for x1 is not significant (p-value = .4316). With x2 = 0, the regression model becomes ŷ = 41.225 + 1.081 x1. With x2 = 1, the regression model becomes ŷ = 22.821 + 1.081 x1. The presence of x2 causes the y-intercept to drop by 18.404. The graph of each of these models (without the dummy variable and with the dummy variable equal to one) is shown below:
[Graph of the two fitted lines]
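The intercept arithmetic in 15.10 is easy to verify: substituting x2 = 1 into the fitted equation folds the dummy coefficient into the intercept, producing the second, parallel line.

```python
# Check of the intercept shift in 15.10: with the dummy x2 = 1, the
# coefficient -18.404 is absorbed into the intercept of the fitted model.
b0, b1, b2 = 41.225, 1.081, -18.404      # coefficients from the solution
intercept_when_x2_is_1 = b0 + b2 * 1
print(round(intercept_when_x2_is_1, 3))  # 22.821, matching the solution
```

The slope on x1 is unchanged, which is why the two lines in the graph are parallel.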
15.11 The regression equation is:
Price = 7.066 - 0.0855 Hours + 9.614 ProbSeat + 10.507 FQ
The overall F = 6.80 with p = .009, which is significant at α = .01. se = 4.02, R2 = .671, and adjusted R2 = .573. The difference between R2 and adjusted R2 indicates that there are some nonsignificant predictors in the model. The t ratios of Hours (t = -0.56 with p = .587) and Probability of Being Seated (t = 1.37 with p = .202) are nonsignificant at α = .05. The only significant predictor is the dummy variable, French Quarter or not, which has a t ratio of 3.97 with p = .003, significant at α = .01. The positive coefficient on this variable indicates that being in the French Quarter adds to the price of a meal.
15.12 There will be six predictor variables in the regression analysis:
three for occupation, two for industry, and one for marital status. The dependent variable is job satisfaction, which makes a total of seven variables.
15.13 Stepwise Regression:
Step 1: x2 enters the model, t = -7.35 and R2 = .794
The model is ŷ = 36.15 - 0.146 x2
Step 2: x3 enters the model and x2 remains in the model.
t for x2 is -4.60, t for x3 is 2.93. R2 = .876.
The model is ŷ = 26.40 - 0.101 x2 + 0.116 x3
The variable, x1, never enters the procedure.
15.14 Stepwise Regression:
Step 1: x4 enters the model, t = -4.20 and R2 = .525
The model is ŷ = 133.53 - 0.78 x4
Step 2: x2 enters the model and x4 remains in the model.
t for x4 is -3.22 and t for x2 is 2.15. R2 = .637
The model is ŷ = 91.01 - 0.60 x4 + 0.51 x2
The variables, x1 and x3 never enter the procedure.
15.15 The output shows that the final model had four predictor variables, x4, x2, x5, and x7. The variables, x3 and x6 did not enter the stepwise analysis. The procedure took four steps. The final model was:
ŷ = -5.00 x4 + 3.22 x2 + 1.78 x5 + 1.56 x7
The R2 for this model was .5929, and se was 3.36. The t ratios were:
tx4 = -3.07, tx2 = 2.05, tx5 = 2.02, and tx7 = 1.98.
15.16 The output indicates that the stepwise process went only two steps. Variable x3 entered at step one. However, at step two, x3 dropped out of the analysis and x2 and x4 entered as the predictors. x1 was the dependent variable. x5 never entered the procedure and, like x3, was not included in the final model. The final regression model was:
ŷ = 22.30 + 12.38 x2 + 0.0047 x4.
R2 = .682 and se = 9.47. tx2 = 2.64 and tx4 = 2.01.
15.17 The output indicates that the procedure went through two steps. At step 1, dividends entered the process, yielding an R2 of .833 by itself. The t value was 6.69 and the model was ŷ = -11.062 + 61.1 x1. At step 2, net income entered the procedure and dividends remained in the model. The R2 for this two-predictor model was .897, which is a modest increase from the simple regression model shown in step one. The step 2 model was:
Premiums earned = -3.726 + 45.2 dividends + 3.6 net income
For step 2, tdividends = 4.36 (p-value = .002)
and tnet income = 2.24 (p-value = .056).
15.18 This stepwise regression procedure only went one step. The only significant predictor was natural gas. No other predictors entered the model. The regression model is:
Electricity = 1.748 + 0.994 Natural Gas
For this model, R2 = .9295 and se = 0.490. The t value for natural gas was 11.48.
15.19 The correlation matrix is:

              y      x1     x2     x3
      y       -     .653   .891   .821
      x1    .653     -     .650   .615
      x2    .891    .650    -     .688
      x3    .821    .615   .688    -
There appears to be some correlation between all pairs of the predictor variables, x1, x2, and x3. All pairwise correlations between independent variables are in the .600 to .700 range.
15.20 The correlation matrix is:

              y      x1     x2     x3     x4
      y       -     .241   .621   .278   .724
      x1    .241     -     .359   .161   .325
      x2    .621    .359    -     .243   .442
      x3    .278    .161   .243    -     .278
      x4    .724    .325   .442   .278    -
An examination of the intercorrelations of the predictor variables reveals that the highest pairwise correlation exists between variables x2 and
x4 (.442). Other correlations between independent variables are less than .400. Multicollinearity may not be a serious problem in this regression analysis.
15.21 The stepwise regression analysis of problem 15.17 resulted in two of the three predictor variables being included in the model. The simple regression model yielded an R2 of .833 jumping to .897 with the two predictors. The predictor intercorrelations are:
                Net Income   Dividends   Gain/Loss
Net Income          -           .682        .092
Dividends         .682           -          .522
Gain/Loss         .092          .522         -
An examination of the predictor intercorrelations reveals that Gain/Loss and Net Income have very little correlation, but Net Income and Dividends have a correlation of .682 and Dividends and Gain/Loss have a correlation of .522. These correlations might suggest multicollinearity.
15.22 The intercorrelations of the predictor variables are:
                Natural Gas   Fuel Oil   Gasoline
Natural Gas          -           .570       .701
Fuel Oil           .570           -         .934
Gasoline           .701          .934        -
None of these intercorrelations is small. Of particular concern is the correlation between fuel oil and gasoline, which is .934. These two variables seem to add about the same predictability to the model. In the stepwise regression analysis, only natural gas entered the procedure. Perhaps the overlap among natural gas, fuel oil, and gasoline was such that fuel oil and gasoline did not have enough significant unique variance to add to the prediction.
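A standard numerical diagnostic for the multicollinearity discussed here is the variance inflation factor, one of the chapter's key terms. The sketch below uses simulated data, not the fuel-price data above: VIF_j = 1/(1 - Rj²), where Rj² comes from regressing predictor j on the remaining predictors.

```python
# Computing variance inflation factors (VIFs) for a set of predictors.
# x2 is built to be nearly collinear with x1, so both should show large
# VIFs, while the independent x3 should stay near 1.
import numpy as np

rng = np.random.default_rng(1)
n = 50
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.2, size=n)    # deliberately collinear with x1
x3 = rng.normal(size=n)                    # independent predictor
X = np.column_stack([x1, x2, x3])

def vif(X, j):
    """VIF_j = 1 / (1 - R_j^2), regressing column j on the other columns."""
    others = np.column_stack([np.ones(len(X))] +
                             [X[:, k] for k in range(X.shape[1]) if k != j])
    coef, *_ = np.linalg.lstsq(others, X[:, j], rcond=None)
    resid = X[:, j] - others @ coef
    sst = np.sum((X[:, j] - X[:, j].mean()) ** 2)
    return 1 / (resid @ resid / sst)       # SST / SSE = 1 / (1 - R^2)

print([round(vif(X, j), 1) for j in range(3)])
```

A common rule of thumb flags VIFs above about 10 as serious; the near-duplicate pair here lands well above that while the independent predictor stays near 1, the same pattern the .934 fuel oil/gasoline correlation suggests.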
15.23 The regression model is:
ŷ = 564 - 27.99 x1 - 6.155 x2 - 15.90 x3
F = 11.32 with p = .003, se = 42.88, R2 = .809, adjusted R2 = .738. For x1, t = -0.92 with p = .384; for x2, t = -4.34 with p = .002; for x3, t = -0.71 with p = .497. Thus, only one of the three predictors, x2, is a significant predictor in this model. This model has very good predictability (R2 = .809). The gap between R2 and adjusted R2 underscores the fact that there are two nonsignificant predictors in this model. x1 is a nonsignificant indicator variable.
15.24 The stepwise regression process included two steps. At step 1, x1 entered the procedure producing the model:
ŷ = 1540 + 48.2 x1.
The R2 at this step is .9112 and the t ratio is 11.55. At step 2, x1² entered the procedure and x1 remained in the analysis. The stepwise regression procedure stopped at this step and did not proceed. The final model was:
ŷ = 1237 + 136.1 x1 - 5.9 x1².
The R2 at this step was .9723, the t ratio for x1 was 7.89, and the t ratio for x1² was -5.14.
15.25 In this model with x1 and the log of x1 as predictors, only log x1 was a significant predictor of y. The stepwise procedure only went to step 1. The regression model was:
ŷ = -13.20 + 11.64 log x1. R2 = .9617 and the t ratio of log x1 was 17.36. This model has very strong predictability using only the log of the x1 variable.
15.26 The regression model is:
Grain = -4.675 + 0.4732 Oilseed + 1.18 Livestock
The value of R2 was .901 and adjusted R2 = .877.
se = 1.761. F = 36.55 with p = .000.
toilseed = 3.74 with p = .006 and tlivestock = 3.78 with p = .005. Both predictors are significant at α = .01. This is a model with strong predictability.
15.27 The stepwise regression procedure only used two steps. At step 1, Silver was the lone predictor. The value of R2 was .5244. At step 2, Aluminum entered the model and Silver remained in the model. However, the R2 jumped to .8204. The final model at step 2 was:
Gold = -50.19 + 18.9 Silver + 3.59 Aluminum.
The t values were: tSilver = 5.43 and tAluminum = 3.85.
Copper did not enter into the process at all.
15.28 The regression model was:
Employment = 71.03 + 0.4620 NavalVessels + 0.02082 Commercial
F = 1.22 with p = .386 (not significant)
R2 = .379 and adjusted R2 = .068
The low value of adjusted R2 indicates that the model has very low predictability. Neither t value is significant (tNavalVessels = 0.67 with
p = .541 and tCommercial = 1.07 with p = .345). Neither predictor is a significant predictor of employment.
15.29 There were four predictor variables. The stepwise regression procedure went three steps. The predictor, apparel, never entered in the stepwise process. At step 1, food entered the procedure producing a model with an R2 of .84. At step 2, fuel oil entered and food remained. The R2 increased to .95. At step 3, shelter entered the procedure and both fuel oil and food remained in the model. The R2 at this step was .96. The final model was:
All = 1.0615 + 0.474 Food + 0.269 Fuel Oil + 0.249 Shelter
The t ratios were: tfood = 8.32, tfuel oil = 2.81, tshelter = 2.56.
15.30 The stepwise regression process with these two independent variables only went one step. At step 1, Soybeans entered in producing the model,
Corn = -2,962 + 5.4 Soybeans. The R2 for this model was .7868.
The t ratio for Soybeans was 5.43. Wheat did not enter into the analysis.
15.31 The regression model was:
Grocery = 76.23 + 0.08592 Housing + 0.16767 Utility
+ 0.0284 Transportation - 0.0659 Healthcare
F = 2.29 with p = .095, which is not significant at α = .05.
se = 4.416, R2 = .315, and adjusted R2 = .177.
Only one of the four predictors has a significant t ratio, and that is Utility, with t = 2.57 and p = .018. The other t ratios and their respective probabilities are:
thousing = 1.68 with p = .109, ttransportation = 0.17 with p = .87, and
thealthcare = -0.64 with p = .53.
This model is very weak. Only the predictor, Utility, shows much promise in accounting for the grocery variability.
15.32 The output suggests that the procedure only went two steps.
At step 1, x1 entered the model yielding an R2 of .7539. At step 2,
x2 entered the model and x1 remained. The procedure stopped here with a final model of:
ŷ = 124.5 - 43.4 x1 + 1.36 x2
The R2 for this model was .8059 which indicates relatively strong predictability with two independent variables. Since there were four predictor variables, two of the variables did not enter the stepwise process.
15.33 Of the three predictors, x2 is an indicator variable. An examination of the stepwise regression output reveals that there were three steps and that all three predictors end up in the final model. Variable x3 is the strongest individual predictor of y and entered at step one resulting in an R2 of .8124. At step 2, x2 entered the process and variable x3 remained in the model. The R2 at this step was .8782. At step 3, variable x1 entered the procedure. Variables x3 and x2 remained in the model. The final R2 was .9407. The final model was:
ŷ = 87.89 + 0.071 x3 - 2.71 x2 - 0.256 x1
15.34 The R2 for the full model is .321. After dropping variable x3, the R2 is still .321. Variable x3 added virtually no information to the model. This is underscored by the fact that the p-value for the t test of the slope for x3 is .878, indicating no significance. The standard error of the estimate actually drops slightly after x3 is removed from the model.
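The adjusted R2 behavior described in 15.34 follows directly from the formula adjusted R² = 1 - (1 - R²)(n - 1)/(n - k - 1): with R² held fixed, fewer predictors means a smaller penalty. The check below assumes a sample size of n = 30, which is not stated in the problem.

```python
# Formula check for 15.34: dropping a useless predictor while R^2 stays
# at .321 must raise adjusted R^2. The sample size n = 30 is an assumed
# value for illustration; the problem does not state n.
n, r2 = 30, 0.321

def adjusted_r2(k):
    """Adjusted R^2 for a model with k predictors."""
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

print(round(adjusted_r2(3), 4), round(adjusted_r2(2), 4))  # with and without x3
```

Whatever the actual n, the two-predictor model's adjusted R² exceeds the three-predictor model's whenever R² is unchanged, which is the reasoning behind dropping x3.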
SA?"27b'VIG 6`!b'VIGd` !XJxڅP
@]1`
v
hj1E@ JiWV~}Fۛ9B]00 Ua
JqH)cW3gsK)Bj&YʕT5m`j'2/;
d+ 6Nr]}`4F?
@U
JGXCKS&IAIOWMiǌ&!46DdTP
SA?"288`!d` !XJxڅP
@]1`
v
"ETPXYZX[X=&XxJ@
QY9.kP"E8VCuEL)̟9]QUVb3Gw.R=)F' KD,Ù,;Z8'm.4?
@U
hJGXCKS&AAO7MqAcUI(
\F61TablenMSummaryInformation(~wDocumentSummaryInformation8~DCompObjj
8DP
\hpxChapter 13hapKen Blacken en Normala
Ken Black28 Microsoft Word 9.0@ګq@ͩ@ j@RciD
՜.+,0hp
#University of Houston  Clear Lake#T
Chapter 13Title
FMicrosoft Word Document
MSWordDocWord.Document.89q
i8@8NormalCJ_HaJmH sH tH RR Heading 1$<@&5CJ KH OJQJ\^JaJ TT Heading 2$<@& 56CJOJQJ\]^JaJNN Heading 3$<@&5CJOJQJ\^JaJBB Heading 4$<@&5CJ\aJ<A@<Default Paragraph Font,@,Header
!&)@&Page NumberD @DFooter
!7$8$H$CJOJQJaJ02"0List 2^`0320List 38^8`04B0List 4^`05R0List 5^`68b6
List Bullet 4
&F>Er>List Continue 2x^>F>List Continue 38x^8>G@>List Continue 4x^N>NTitle$<@&a$5CJ KHOJQJ\^JaJ @C@Body Text Indenthx^h>J>Subtitle$<@&a$OJQJ^J6@6
Normal Indent
^<<Short Return AddressRS3456JKL67)*+EF34RSTu v
1
2
T
U
}
)*BCDEOV%T
,

^
_
WXmnoUVqr?@{fghijGHpq=>?#"#^_56!!!:!;!u!v!!!!0"""##z##$$$#$$$1$2$T$U$$$''Y''''((J(p(q((( )!)O)P)Q)l)m))))*M******++++
,A,C,....//)0*0Z000006171_1`11111122C2D2p2q222W3X3Y3Z33333
44F4G4~4444H55555667M7N777777A8B8^9_9`99999::P:Q:::::<<<<<<==>>>`?a???q@r@@@ A
AAAAyBzBBBBBBCC'C(CCCCDDE EBECEqErEsEEEEEFF"F#FF"G$G%GHH(I)InIoIpIJIJJJJJJJKKXKYKKK'L(LqLLL
MMMPMQMMMM*N+NOOO/Q0QtQuQvQRSDSPSSS0000000@0@00000000000000@00000000000000000000000000000000@0@0000@0@0@0@0@0@0@0@0@000000@0000000000000000000000000000000000000@0@00000000000000000000000000000000000@0@00000000@000000000000000@00000000000000000@0@0@0@0000000000000@0@0@0@00000000000000000000000000000000@00000000@0@00000000@0@0@0@0@0@0@0@0@0@0@0@0@0@0@0@0@0@0@0@0@0@0@0@0@0@0@0@0@0@0@000@0@0@0@0@0@0@0@0@0@000000@0@0@0@0@0@0@0@0@0@0000@0000000000000000000000000000000000000000000000000000000000000000@0@000000000000000000000000000000000000@0@0@0
0MXXXX[u=!%)074@rDbGPURW3568:;=?ACDHIKNP+?!#(,/58>DBIN*RRW.012479<>@BEFGJLMOQW//
C
E
24$&9MO>>>d??}????r@@@@@@AAAAAA(C0CJCTCE$E0E9EEEFFFF=IBIKIPI]IeI(L0LGLVLqLLMNN
NNN3QJQKQLQNQSQRRS
SSSS.S.S3S4S5S7S9SCSESMSPSSSKP 2
A
U
`
*A
]
w

VpuVY>cd!_a_j61 @ ;!t!!!1"2""""###Y#{####$$$$$$Y'^'((()))**M**++
,,02../i/v/Z0]011 2"2M2O2z22P3U3333344N4P444H5J5<6K6;;I;<=>>a??r@@AACC(C0CEEFFGGYK[K(L0LqLLMMM)NFNPN0QsQRRS
SSSS.S.S3S4S5S7S9SCSESMSPSSS333333333333333333333333333333333333333333333333333333333333333333333/
F
5'9PH#$^_569!:!>`?d???q@r@@@ A
AAyBzBBBBCC'C(CCCCDE EBECEqEsEvEEEEFF"F#FF$G%GH(I)IoIJJJJJKXKYKKK'L(LqLLLMM
MPMMMMM*N+NOO/Q3QuQvQRRPSSS@'RS@@UnknownG:Times New Roman5Symbol3&:ArialC0Courier 10cpi"1h(jjfD/iD#!0dT2Q
Chapter 13
Ken Black Ken BlackRoot Entry} F&ّ^!Data
R:WordDocument)ObjectPool1@lkc }kc\
!"#$%&'()*+,./0123456789:;<=>?@ABCDEFGHIJKLMNOPQSTUVWXY_a`bdcegfh{zjklmnopqrstuvwxy~}1TablenMSummaryInformation(~wDocumentSummaryInformation8~CompObjj
8DP
\hpxChapter 13hapKen Blacken en Normala
Ken Black28 Microsoft Word 9.0@ګq@ͩ@ j@RciD
՜.+,D՜.+,Xhp
#University of Houston  Clear Lake#T
Chapter 13Title4 $,
FMicrosoft Word Document
MSWordDocWord.Document.89q