The Instantaneous Intercept
Definition
Consider two points on the graph of f(x): (x_1, f(x_1)) and (x_2, f(x_2)). What is the equation of the line joining them? If it is y = Ax + B, then

A x_1 + B = f(x_1)
A x_2 + B = f(x_2)

Then by Cramer's Rule:
A = (f(x_1) - f(x_2)) / (x_1 - x_2)   : Newton's Ratio
B = (x_1 f(x_2) - x_2 f(x_1)) / (x_1 - x_2)   : Hellerstein's Ratio
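A quick numerical sanity check of these two ratios (a Python sketch; the function f and the sample points are just illustrative):

```python
# For f(x) = x^2, the secant line through (x1, f(x1)) and (x2, f(x2))
# should have slope A (Newton's Ratio) and intercept B (Hellerstein's Ratio).
def f(x):
    return x * x

x1, x2 = 1.0, 3.0
A = (f(x1) - f(x2)) / (x1 - x2)            # Newton's Ratio
B = (x1 * f(x2) - x2 * f(x1)) / (x1 - x2)  # Hellerstein's Ratio

# The line y = A*x + B must pass through both points:
assert abs(A * x1 + B - f(x1)) < 1e-12
assert abs(A * x2 + B - f(x2)) < 1e-12
print(A, B)  # slope 4.0, intercept -3.0
```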
As x_2 approaches x_1:

A approaches f'(x), pronounced "f-prime": it is the 'derivative', or 'instantaneous slope', or 'fluxion'.
B approaches f#(x), pronounced "f-grid"; it is the 'instantaneous intercept', or 'interceptual'.
The tangent line to the graph of y = f(x) at x = x_1 is

y = f'(x_1) x + f#(x_1)

so f(x_1) = f'(x_1) x_1 + f#(x_1)
so f#(x_1) = f(x_1) - f'(x_1) x_1
so f# = f - x f'

You can also derive this from the above limit definition.
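The limit can be seen numerically; a sketch with the illustrative choice f(x) = x^3, for which f# = f - x f' gives f#(x) = x^3 - 3x^3 = -2x^3:

```python
# As x2 -> x1, the secant intercept B approaches f#(x1) = f(x1) - x1*f'(x1).
def f(x):
    return x ** 3

x1, h = 2.0, 1e-6
x2 = x1 + h
B = (x1 * f(x2) - x2 * f(x1)) / (x1 - x2)   # Hellerstein's Ratio
exact = -2 * x1 ** 3                         # f#(x) = -2 x^3, so f#(2) = -16
assert abs(B - exact) < 1e-3
print(B)  # close to -16.0
```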
Table of Intercepts
f# = f - x f' = (f dx - x df) / dx = 2f - (x f)'
(a f(x) + b g(x))# = a f#(x) + b g#(x)
K# = K
(x^N)# = (1 - N) x^N
x# = 0
(x^2)# = - x^2
(x^3)# = - 2 x^3
(x^(-1))# = 2 x^(-1)
(ln(x))# = ln(x) - 1
( - x ln(x) )# = x
( x^N ln(x) )# = x^N ( (1-N) ln(x) - 1 )
( x f(ln(x)) )# = - x f'(ln(x))
( x cos(ln(x)) )# = x sin(ln(x))
( x sin(ln(x)) )# = - x cos(ln(x))
(e^x)# = (1 - x) e^x
(x e^x)# = - x^2 e^x
(x^2 e^x)# = (-1 - x) x^2 e^x
(x^3 e^x)# = (-2 - x) x^3 e^x
(x^N e^x)# = (1 - N - x) x^N e^x
(sin(x))# = sin(x) - x cos(x)
(cos(x))# = cos(x) + x sin(x)
(tan(x))# = tan(x) - x sec^2(x)
(sec(x))# = sec(x)(1 - x tan(x))
(arctan(x))# = arctan(x) - x/(1+x^2)
(f g)# = f# g + f g# - f g
(f/g)# = (f g + f# g - f g#) / g^2
(1/g)# = (2g - g#) / g^2
(g^2)# = 2 g g# - g^2
(g^3)# = 3 g^2 g# - 2 g^3
(g^N)# = N g^(N-1) g# - (N-1) g^N
(ln(g))# = ln(g) - 1 + g#/g
(e^g)# = (1 - g + g#) e^g
(f(g))# = f(g) - (f(g) - f#(g))(g - g#) / g
If F is the inverse function of f, then:

F# = F - x / f'(F) = F - x F / (x - f#(F))
Given an intercept f#, we can calculate the derivative f':

f' = (f - f#) / x
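A spot-check of a few table entries, computing f# = f - x f' with a central-difference derivative (a Python sketch; the test point x = 1.3 is arbitrary):

```python
import math

# f# = f - x*f', with f' approximated by a central difference.
def grid(f, x, h=1e-6):
    df = (f(x + h) - f(x - h)) / (2 * h)
    return f(x) - x * df

x = 1.3
checks = [
    (math.sin, math.sin(x) - x * math.cos(x)),  # (sin x)# = sin x - x cos x
    (math.exp, (1 - x) * math.exp(x)),          # (e^x)#  = (1 - x) e^x
    (math.log, math.log(x) - 1),                # (ln x)# = ln x - 1
]
for f, expected in checks:
    assert abs(grid(f, x) - expected) < 1e-6
print("table entries verified at x =", x)
```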
Intercept and Newton’s Method
Newton's Method of approximating roots of equations is the iteration

x_{N+1} = x_N - f(x_N) / f'(x_N)

This is the same as:

x_{N+1} = - f#(x_N) / f'(x_N)

and also:

x_{N+1} = x_N f#(x_N) / (f#(x_N) - f(x_N))

and also:

x_{N+1} = x_N ( 1 {-} f#(x_N) / f(x_N) )

where {-} is the reciprocal subtraction operator:

a {-} b = 1 / (1/a - 1/b)
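The intercept form of the iteration really is Newton's method; a sketch approximating the square root of 2 as a root of f(x) = x^2 - 2 (the function is chosen just for illustration):

```python
# Newton's method written with the intercept: x_{n+1} = -f#(x_n) / f'(x_n).
def f(x):
    return x * x - 2

def fprime(x):
    return 2 * x

def fgrid(x):                      # f# = f - x f'
    return f(x) - x * fprime(x)

x = 1.0
for _ in range(6):
    x = -fgrid(x) / fprime(x)      # identical to x - f(x)/f'(x)
print(x)  # ~1.4142135623730951
assert abs(x - 2 ** 0.5) < 1e-12
```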
The Anti-Intercept

F is an anti-intercept of f if F# = f. Note that (F + ax)# = F#; so any anti-intercept will have an anti-interception term ax.

Denote the general anti-intercept as ai(f). Then:

ai(x^N) = (1/(1-N)) x^N + ax   (for N ≠ 1)
ai(x) = - x ln(x) + ax
For instance:

ai(7x^3 - 5x^2 + 3) = (-7/2) x^3 + 5x^2 + 3 + ax
ai(5x^4 + 5x) = (-5/3) x^4 - 5x ln(x) + ax
In general, if F = ai(f), then:

F - x F' = f
F' - (1/x) F = - f / x
(1/x) F' - (1/x^2) F = - f / x^2
((1/x) F)' = - f / x^2
(1/x) F = ∫(-f/x^2) dx + a
F = - x ∫(f/x^2) dx + ax

So: ai(f) = - x ∫(f/x^2) dx + ax
Also: ∫ f dx = (-1/x) ai(x^2 f) + a
Also: ai(x f(ln(x))) = - x (∫f)(ln(x)) + ax
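We can verify the worked example above by checking the defining condition F - x F' = f numerically (a Python sketch; the anti-interception constant a = 4 and the sample points are arbitrary):

```python
# Check that F = (-7/2)x^3 + 5x^2 + 3 + ax is an anti-intercept of
# f = 7x^3 - 5x^2 + 3, i.e. that F# = F - x*F' equals f.
def f(x):
    return 7 * x**3 - 5 * x**2 + 3

def F(x, a=4.0):                   # a is an arbitrary constant
    return -3.5 * x**3 + 5 * x**2 + 3 + a * x

def grid(F, x, h=1e-6):            # F# = F - x*F' via central difference
    dF = (F(x + h) - F(x - h)) / (2 * h)
    return F(x) - x * dF

for x in (0.5, 1.0, 2.0):
    assert abs(grid(F, x) - f(x)) < 1e-5
print("F# = f verified")
```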
Interceptual Equations
If f# = r f, then f = a x^(1-r).
If f## = f, then f = a x^2 + b.
If f## - 5 f# + 6 f = 0, then f = a x^(-1) + b x^(-2).

In general, if a m^2 + b m + c = 0 has roots r and R, then

a f## + b f# + c f = 0

has solutions

f = A x^(1-r) + B x^(1-R)

This extends to complex roots. For instance, if f## = - f, then

f = x ( A cos(ln(x)) + B sin(ln(x)) )
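A sketch of the third example above, using (x^N)# = (1-N) x^N to reduce the interceptual equation to its characteristic polynomial in m = 1 - N:

```python
# For f = x^N we have f# = (1-N) x^N and f## = (1-N)^2 x^N, so
# f## - 5 f# + 6 f = 0 reduces to m^2 - 5m + 6 = 0 with m = 1 - N,
# giving m = 2, 3 and hence N = -1, -2.
def residual(N, x=1.7):            # left side of f## - 5 f# + 6 f = 0
    f = x ** N
    fg = (1 - N) * f               # f#
    fgg = (1 - N) ** 2 * f         # f##
    return fgg - 5 * fg + 6 * f

assert abs(residual(-1)) < 1e-12
assert abs(residual(-2)) < 1e-12
print("x^-1 and x^-2 solve f## - 5f# + 6f = 0")
```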
Applications?
We can proceed like this, reproducing calculus with derivative replaced by intercept. Even extrema can be redone within this parody of freshman calculus; for at a minimum or maximum value, f#(x) = f(x).
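A small numerical illustration of that last claim (f(x) = (x-1)^2 + 2 is an arbitrary function with a minimum at x = 1):

```python
# At an extremum f' = 0, so f# = f - x*f' = f.
def f(x):
    return (x - 1) ** 2 + 2

def grid(x, h=1e-6):               # f# = f - x*f' via central difference
    df = (f(x + h) - f(x - h)) / (2 * h)
    return f(x) - x * df

assert abs(grid(1.0) - f(1.0)) < 1e-9
print(grid(1.0), f(1.0))  # both 2.0 (up to rounding)
```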
The question is, what specific applications does the interceptual in itself have? What use has an anti-intercept? Or an interceptual equation?

Perhaps for calculating involutes and evolutes?