[From Bruce Abbott (2016.08.15.2325 EDT)]

Rick Marken (2016.08.15.1115) –

Bruce Abbott (2016.08.13.1915 EDT)

BA: To prove this fact, I have provided a table giving hypothetical observed values of V and R…

  V      R        D        R^(1/3)    D^(1/3)   V formula
 5.0   10.0   12.5000    2.154435   2.320794     5.0
 5.1    9.9   13.3991    2.147229   2.375154     5.1
 5.2    9.8   14.3478    2.139975   2.429935     5.2
 5.3    9.7   15.3481    2.132671   2.485146     5.3
 5.4    9.6   16.4025    2.125317   2.540797     5.4
 5.5    9.5   17.5132    2.117912   2.596898     5.5
 5.6    9.4   18.6826    2.110454   2.653457     5.6
 5.7    9.3   19.9132    2.102944   2.710486     5.7
 5.8    9.2   21.2078    2.095379   2.767996     5.8
 5.9    9.1   22.5691    2.087759   2.825996     5.9
 6.0    9.0   24.0000    2.080084   2.884499     6.0
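For anyone who wants to recompute the table, the derived columns follow from V and R alone. Here is a minimal sketch, assuming (as elsewhere in this thread) that D = V^3/R and that the "V formula" column is the product of the two cube roots, i.e. (R*D)^(1/3):

```python
# Reproduce the table's derived columns from the V and R columns.
# Assumption: D = V**3 / R, and "V formula" = (R * D)**(1/3).
V = [5.0 + 0.1 * i for i in range(11)]
R = [10.0 - 0.1 * i for i in range(11)]
D = [v**3 / r for v, r in zip(V, R)]
V_formula = [(r * d) ** (1.0 / 3.0) for r, d in zip(R, D)]

for v, r, d, vf in zip(V, R, D, V_formula):
    print(f"{v:4.1f}  {r:5.1f}  {d:8.4f}  {r**(1/3):.6f}  {d**(1/3):.6f}  {vf:4.1f}")
```

Note that the last column necessarily reproduces V, since (R * V^3/R)^(1/3) = V.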

BA: Below I have regressed V on R. Please note the linear (not power-law) relationship. It is not an artifact of the way either V or R are computed. It is an empirical relationship that could have been observed in the data.

RM: And here you go off the rails. But in a very helpful way. I’ve never had a reviewer prove himself wrong so elegantly.

RM: Power law researchers would analyze this movement data to see if it fits a power law by regressing log(V) on log(R). I did the regression and the result is the following regression equation:

log(V) = 2.43 - 1.73*log(R)

RM: with an R^2 value of .99. So power law researchers would conclude, based on this data, that a power law fits this movement but with a beta value of -1.73 rather than .33. This would be a very surprising result. But I could point out to these researchers that if they had included log(D) in their analysis they would have found this result:

log(V) = 0.0 + .33*log(D) + .33*log(R)

RM: with an R^2 of 1.0. So the surprising beta value of -1.73 is just an artifactual result of leaving log(D) out of the analysis. When you don't leave log(D) out of the "power law" regression analysis you find, for *any* movement (other than a perfectly straight line), that log(V) = .33*log(D) + .33*log(R).
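Both regressions can be checked against the eleven table rows with a short numpy sketch (assuming base-10 logs, which match the reported intercept of about 2.43):

```python
import numpy as np

# The table's observed columns, with D computed as V**3 / R.
V = np.linspace(5.0, 6.0, 11)
R = np.linspace(10.0, 9.0, 11)
D = V**3 / R

# Simple regression: log(V) on log(R) alone.
X1 = np.column_stack([np.ones_like(R), np.log10(R)])
b1, *_ = np.linalg.lstsq(X1, np.log10(V), rcond=None)
print(b1)  # intercept ~2.43, slope ~-1.73

# Multiple regression: log(V) on log(D) and log(R).
X2 = np.column_stack([np.ones_like(R), np.log10(D), np.log10(R)])
b2, *_ = np.linalg.lstsq(X2, np.log10(V), rcond=None)
print(b2)  # intercept ~0, both slopes ~0.333
```

The multiple regression recovers .33 and .33 exactly because log(V) = (1/3)log(D) + (1/3)log(R) holds identically when D = V^3/R.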

BA: The numbers I provided in my spreadsheet are but a sample from a large number of possible relationships between V(t) and R(t) that I could have chosen. The relationship in my illustration depicts a gradually tightening curve (the radius of curvature diminishing with t) and, at the same time, a gradually increasing tangential velocity. This is the opposite of the pattern one might expect of a racecar driver, who would be accelerating as the curvature lessened. To discover the nature of this relationship, one could submit these numbers to a linear regression analysis using R as the predictor variable and V as the criterion variable. What emerges is a linear relationship as follows:

V = 15.0 – 1.0*R, with R-sq equal to 1.0.

BA: The analysis thus reveals that V is a linear function of R with negative slope, and that variation in R accounts for *all* of the variation in V. This is the real relationship between the observed Vs and Rs in these data. It is not an artifact of the formula that researchers employ to compute R from V.

BA: If you wish, of course, you can take the logs of both sides of this equation to discover that this yields a fit in which Beta = -1.73. R-sq is reduced somewhat, to 0.998. Many posts ago I related the warning I received from a physical scientist I once worked for: just about any monotonic curve can be fit to a straight line when the data are plotted on log-log graph paper, and here is an example. The best fit is obtained using the untransformed data. It shows that the relationship is a linear one, not a power one.
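Both of these fits can be reproduced on the table's rows; a minimal sketch (using numpy, with base-10 logs for the log-log fit):

```python
import numpy as np

# The table's V and R columns.
V = np.linspace(5.0, 6.0, 11)
R = np.linspace(10.0, 9.0, 11)

# Untransformed fit: V on R. polyfit returns [slope, intercept].
slope, intercept = np.polyfit(R, V, 1)
resid = V - (intercept + slope * R)
r_sq_linear = 1.0 - resid.var() / V.var()
print(slope, intercept, r_sq_linear)  # ~ -1.0, 15.0, 1.0

# Log-log fit of the same rows.
b, a = np.polyfit(np.log10(R), np.log10(V), 1)
resid_log = np.log10(V) - (a + b * np.log10(R))
r_sq_log = 1.0 - resid_log.var() / np.log10(V).var()
print(b, r_sq_log)  # slope ~ -1.73, R-sq ~ 0.998
```

The untransformed fit is exact (R-sq = 1.0), while the log-log fit of the same linear data comes out at roughly 0.998, illustrating the warning about straight lines on log-log paper.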

BA: You claim that this is not the true relationship between V and R. Yet it quite clearly is – you can eyeball it in the data. Didn't you look at the data? Instead, you assert that the true relationship between V and R can be found only if an additional variable is included as a predictor. I assert that your method does nothing more than recover the formula from which you compute V from R and D, and tells you nothing about the actual relationship between V and R in the data.

BA: To see why this is true, I relate the Parable of the Rectangles.

BA: A scientist from Rectangleland collected some observations on the ubiquitous rectangles that grow profusely in her country. She measured the height and width of each observed rectangle and used these values to compute the area of each rectangle according to the formula A = H*W. Linear regression found that

log(H) = 0 + 0.333 log(A)

BA: In other words, Height is proportional to the cube root of Area.

BA: A critic objects, contending that the results are invalid because the Width variable has been left out of the regression. He does the regression with Width included and gets the following results:

log(H) = 1.0 log(A) – 1.0 log(W), R-sq = 1.0

BA: Thus, he says, in Rectangleland, height does NOT always change with the cube root of a rectangle's area as you concluded! You were fooled by a diabolical ILLUSION! In reality the height of a rectangle is equal to its area divided by its width!

BA: The original researcher was dumbfounded. Didn't the critic realize that his analysis had only revealed the equation for computing the area of a rectangle from its height and width? This, she said, is just a fact of geometry and says nothing about the relationship I investigated, which shows how rectangles grow in our Land. They always grow in such a way that their heights are equal to the cube root of their areas. Rectangles in another Land may follow a different rule (for example, their heights could equal the square roots of their areas); if so, my analysis would show that. Our critic's analysis would only show, once again, that H = A/W. But we already knew that!
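The two Rectangleland regressions can be simulated on hypothetical data. This sketch generates rectangles that grow with H near the cube root of A (a small random scatter in W, a labeled assumption here, keeps the critic's predictors from being perfectly collinear):

```python
import numpy as np

# Hypothetical Rectangleland data: rectangles grow so that H ~ A**(1/3),
# i.e. W ~ H**2 (since A = H*W), with a little scatter added to W.
rng = np.random.default_rng(0)
H = np.linspace(2.0, 5.0, 20)
W = H**2 * rng.uniform(0.95, 1.05, size=20)
A = H * W

# Original researcher: log(H) on log(A) alone.
X1 = np.column_stack([np.ones_like(A), np.log10(A)])
b1, *_ = np.linalg.lstsq(X1, np.log10(H), rcond=None)
print(b1)  # intercept ~0, slope ~0.333

# Critic: log(H) on log(A) and log(W).
X2 = np.column_stack([np.ones_like(A), np.log10(A), np.log10(W)])
b2, *_ = np.linalg.lstsq(X2, np.log10(H), rcond=None)
print(b2)  # intercept ~0, coefficients ~+1 and ~-1
```

The critic's coefficients come out at exactly +1 and -1 for *any* rectangles, because log(H) = log(A) - log(W) is an identity of A = H*W; only the first regression depends on how the rectangles actually grow.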

BA: I leave it to the reader to decide whether it is the scientist or the critic whose opinion is correct. By the same logic, Rick's method does nothing more than recover the equation used to compute R from V-cubed/D.

Bruce