Things on this page are fragmentary and immature notes/thoughts of the author. Please read with your own judgement!
Comments
Sage includes pretty much everything from the open source world that you might want for doing mathematics, SymPy included. That also covers many libraries and systems useful for numerics, like Octave.
Sage adds a bit of a DSL on top of Python (the "preparser"). For example, you can type 1/2 without wrapping the integer literals, and it returns a rational instead of doing integer division.
x^2 gives x squared, not xor(x, 2). I'm not sure if it automatically defines variables for you by default. This means that things you do in an interactive Sage session might not translate directly to a Python script. On the other hand, this can be useful for interactive use (btw, SymPy also has isympy -I that does some similar things).
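A minimal sketch of what the preparser does, typed at the Sage prompt (preparse is the built-in helper that shows the rewritten Python source):

1/2              # a Rational, not 0 or 0.5
type(1/2)        # <class 'sage.rings.rational.Rational'>
preparse("1/2")  # 'Integer(1)/Integer(2)' -- the Python code actually executed
preparse("x^2")  # 'x**Integer(2)' -- ^ is rewritten to **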
Pre-defined Symbols
- x (a pre-defined variable)
- pi (a pre-defined constant for $\pi$)
- e (a pre-defined constant for $e$)
- oo (a pre-defined constant for positive infinity)
- -oo (a pre-defined constant for negative infinity)
x        # the pre-defined symbolic variable
n(x)     # numerical approximation; for the bare symbol x this raises a TypeError
pi
n(pi)    # 3.14159265358979
e
n(e)     # 2.71828182845905
oo
-oo
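n() also takes a digits keyword if the default precision is not enough:

n(pi, digits=50)   # pi to 50 significant digits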
Define a Variable
Define a variable u.
var("u")
Define a Function
f(x) = x^3 + 1   # a callable symbolic expression
f                # prints as x |--> x^3 + 1
f(2)             # 9
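These callable symbolic functions support the usual symbolic operations; a small sketch with differentiation:

f.diff(x)   # x |--> 3*x^2
f(x)        # the underlying expression, x^3 + 1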
Limits
lim(f, x=1)                        # the limit is 2 (f is continuous at 1)
lim((x ^ 2 - 1) / (x - 1), x=1)    # 2, even though the expression is undefined at x = 1
lim(f, x=1, dir="-")               # limit from the left
lim(f, x=1, dir="right")           # limit from the right; dir accepts "-"/"left" and "+"/"right"
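For this f the two one-sided limits agree, since f is continuous at 1. A case where the direction actually matters:

lim(1/x, x=0, dir="+")   # +Infinity
lim(1/x, x=0, dir="-")   # -Infinity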
Integral
integral(cos(x), (x, 0, pi / 2))   # 1
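Leaving out the bounds gives an antiderivative instead:

integral(cos(x), x)   # sin(x)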
Entropy
Calculate the entropy of the exponential distribution, whose density function is $\frac{1}{\mu} e^{-\frac{x}{\mu}}$ (u stands for $\mu$ in the code below).
var("u")
assume(u>0)
f(x) = 1/u * e ^ (-x/u)
integral(
-log(f(x)) * f(x), (x, 0, oo)
)
The above result, $\log\mu + 1$, shows that the entropy of the exponential distribution can be negative: this happens whenever $\mu < e^{-1}$. As a matter of fact, the entropy goes to $-\infty$ as the parameter $\mu$ goes to $0$.
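A quick numerical check of that claim, assuming the closed form log(u) + 1 computed above:

h = integral(-log(f(x)) * f(x), (x, 0, oo))   # log(u) + 1
n(h.subs(u=1/4))   # about -0.386, negative since 1/4 < e^-1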
Cross-entropy
var("u m")
assume(u>0)
assume(m>0)
f(x) = 1/u * e ^ (-x/u)
g(x) = 1/m * e ^ (-x/m)
integral(
-log(f(x)) * g(x), (x, 0, oo)
)
c(u, m) = u*(m*log(u)/u + m^2/u^2)/m   # the raw result above; it simplifies to log(u) + m/u
c2(u) = log(u) + 1/u                   # the cross-entropy with m = 1
plot(c2(u), (u, 0, 200))               # blows up near u = 0, grows slowly for large u
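As a sanity check, the cross-entropy for m = 1 should be minimized exactly at u = m = 1, where it equals the entropy log(1) + 1 = 1 (this is just the non-negativity of the KL divergence):

solve(diff(c2(u), u) == 0, u)   # [u == 1]
c2(1)                           # 1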