
Thursday, January 31, 2019

Partial Derivatives

Partial derivatives arise when taking the derivative of a function of two or more variables, e.g., $f(x,y)$. If we want the derivative of $f(x,y)$ with respect to one of the independent variables, we take it while holding the other independent variable(s) constant. This process is called partial differentiation and is defined in calculus terms as:

$$ \frac{\partial f(x,y)}{\partial x} = \lim_{x\rightarrow x_{o}} \frac{f\left(x,y_{c}\right)-f\left(x_{o},y_{c}\right)}{x-x_{o}}.$$

As mentioned above, we hold $y$ fixed at the constant value $y_{c}$ while taking the derivative of the function with respect to $x$. Higher-order partial derivatives behave like ordinary derivatives, and mixed partials are commutative (provided they are continuous, by Clairaut's theorem), for example:

$$\frac{\partial}{\partial x} \frac{\partial f(x,y)}{\partial y} = \frac{\partial}{\partial y} \frac{\partial f(x,y)}{\partial x}.$$
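This commutativity can be checked numerically with central finite differences. The test function $f(x,y)=x^2\sin(y)$ and the step size below are my own choices for illustration; both mixed partials should approximate $2x\cos(y)$:

```python
import math

def f(x, y):
    # Example function: f(x, y) = x^2 * sin(y)
    return x**2 * math.sin(y)

def mixed_partial(f, x, y, order, h=1e-5):
    # Central finite difference for the mixed second partial derivative
    if order == "xy":  # d/dx (df/dy)
        dfy = lambda x, y: (f(x, y + h) - f(x, y - h)) / (2 * h)
        return (dfy(x + h, y) - dfy(x - h, y)) / (2 * h)
    else:              # d/dy (df/dx)
        dfx = lambda x, y: (f(x + h, y) - f(x - h, y)) / (2 * h)
        return (dfx(x, y + h) - dfx(x, y - h)) / (2 * h)

fxy = mixed_partial(f, 1.0, 0.5, "xy")
fyx = mixed_partial(f, 1.0, 0.5, "yx")
print(fxy, fyx)  # both approximate 2x*cos(y) = 2*cos(0.5)
```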

In cases where we are taking partial derivatives of a function that depends on other functions, for example:

$$ f(g,h)=f(g(x,y),h(x,y)),$$

then the partial derivatives of $f$ with respect to $x$ and $y$ are given by the chain rule, so that:

$$ \frac{\partial f(x,y)}{\partial x} = \frac{\partial f}{\partial g}\frac{\partial g}{\partial x} + \frac{\partial f}{\partial h} \frac{\partial h}{\partial x}$$

and similarly for the partial derivative with respect to $y$. A common application of partial derivatives is found in vector calculus with the grad, div, and curl operators. Mathematically they are written as:

$$ \text{grad}\, f(x,y,z) = \nabla f = \frac{\partial f}{\partial x} \mathbf{a_x} + \frac{\partial f}{\partial y} \mathbf{a_y} + \frac{\partial f}{\partial z} \mathbf{a_z} $$

$$ \text{div}\, \mathbf{f}(x,y,z) = \nabla \cdot \mathbf{f} = \frac{\partial f_{x}}{\partial x} + \frac{\partial f_{y}}{\partial y}  + \frac{\partial f_{z}}{\partial z} $$

$$ \text{curl}\, \mathbf{f}(x,y,z) = \nabla \times \mathbf{f} = \left(\frac{\partial f_{z}}{\partial y}-\frac{\partial f_{y}}{\partial z}\right) \mathbf{a_x}+\left(\frac{\partial f_{x}}{\partial z}-\frac{\partial f_{z}}{\partial x}\right) \mathbf{a_y}+\left(\frac{\partial f_{y}}{\partial x}-\frac{\partial f_{x}}{\partial y}\right) \mathbf{a_z}$$

where the $\mathbf{a_{i}}$ are the Cartesian unit vectors. There is not much more to performing partial differentiation, but partial derivatives are ubiquitous in partial differential equations and vector calculus, which are essential mathematics for most engineers and scientists.
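As a numerical illustration (my own sketch, not part of the original post), the gradient and divergence can be approximated with the same central-difference idea. The fields $\phi = xy + z^2$ and $\mathbf{f} = (x^2, y^2, z^2)$ are arbitrary test choices:

```python
def gradient(f, p, h=1e-6):
    # Central-difference gradient of a scalar field f at point p = (x, y, z)
    grad = []
    for i in range(3):
        up = list(p); up[i] += h
        dn = list(p); dn[i] -= h
        grad.append((f(*up) - f(*dn)) / (2 * h))
    return grad

def divergence(F, p, h=1e-6):
    # Central-difference divergence of a vector field F at p
    div = 0.0
    for i in range(3):
        up = list(p); up[i] += h
        dn = list(p); dn[i] -= h
        div += (F(*up)[i] - F(*dn)[i]) / (2 * h)
    return div

phi = lambda x, y, z: x * y + z**2          # scalar field
F = lambda x, y, z: (x**2, y**2, z**2)      # vector field

print(gradient(phi, (1.0, 2.0, 3.0)))  # approximates (y, x, 2z) = (2, 1, 6)
print(divergence(F, (1.0, 2.0, 3.0)))  # approximates 2x + 2y + 2z = 12
```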

The selected quote comes from Alan Turing, the father of modern computer science.

"Science is a differential equation. Religion is a boundary condition."
-Alan Turing, 1954


Tuesday, January 15, 2019

Probability & Normal Distribution Function

Let us begin by writing down the mathematical definition of the mean for a sample set:

$$\bar{\mu} = \frac{\sum_{i=1}^{N} x_i}{N}$$

here $N$ is the number of data points in the sample set and $x_i$ is the value at data point $i$. The equation above is for a sample, i.e., a subset of a population. If we are dealing with a population made up of several samples, then we can write the mean as:

$$\mu = \frac{\sum_{i=1}^{M} N_i \bar{\mu}_i}{\sum_{i=1}^{M} N_i} $$

here $M$ is the number of sample sets that make up the population, $N_i$ is the size of sample $i$, and $\bar{\mu}_i$ is its sample mean. A quick reminder: a population is typically viewed as the complete representation of a data set, whereas a sample is just that, a sampling/subset of the population.

A statistical quantity we may want to recall is the median, which is the central value of a distribution, i.e., the middle value once the data are sorted (not to be confused with the midrange, the halfway point between the minimum and maximum values). Another quantity that may be useful is the mode, which is the most frequently occurring value in a distribution.
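Both quantities are available directly in Python's standard statistics module; the sample values below are illustrative only:

```python
import statistics

values = [1, 2, 2, 3, 4, 7, 9]
print(statistics.median(values))  # middle value of the sorted data -> 3
print(statistics.mode(values))    # most frequent value -> 2
```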

One of the most important quantities when dealing with data describes how each data point deviates from the mean of the dataset. To be more specific, we want to know the typical distance/deviation of a data point from the mean; this is termed the standard deviation. A related value that captures the spread of the data is the variance, which is simply the standard deviation squared. The standard deviation for a sample is given by:

$$\bar{\sigma} = \sqrt{\frac{\sum_{i=1}^{N}{\left(x_i-\bar{\mu}\right)^{2}}}{N-1}} $$

and thus the variance is,

$$\bar{\nu} = \bar{\sigma}^2$$
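The sample formulas above match Python's statistics module, which also uses the $N-1$ denominator; the data values here are made up for illustration:

```python
import math
import statistics

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]

mean = sum(data) / len(data)                               # sample mean
var = sum((x - mean)**2 for x in data) / (len(data) - 1)   # sample variance (N - 1)
std = math.sqrt(var)                                       # sample standard deviation

print(mean, std, var)
assert math.isclose(mean, statistics.mean(data))
assert math.isclose(std, statistics.stdev(data))  # stdev also divides by N - 1
```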

For a population, the standard deviation is:

$$\sigma = \sqrt{\frac{\sum_{i=1}^{M}{\left(\bar{\mu}_i-\mu \right)^{2}}}{M}}$$

$$\nu = \sigma^2$$

If the sample size, $N$, is large and we draw many samples, then the central limit theorem states that the sample means follow a normal distribution, regardless of the parent population's distribution. More specifically, this implies that the mean of the sample means equals the population mean, while their standard deviation (the standard error) shrinks with the square root of the sample size:

$\mu = \bar{\mu}$ and $\sigma_{\bar{\mu}} = \frac{\bar{\sigma}}{\sqrt{N}}$
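A quick simulation (my own illustration) shows this behavior: drawing many samples from a uniform parent population, the sample means cluster around the parent mean with a spread close to $\bar{\sigma}/\sqrt{N}$:

```python
import random
import statistics

random.seed(42)
N = 100   # sample size
M = 2000  # number of samples drawn

# Draw M sample means from a Uniform(0, 1) parent population
means = [statistics.mean(random.random() for _ in range(N)) for _ in range(M)]

parent_sigma = (1 / 12) ** 0.5    # standard deviation of Uniform(0, 1)
print(statistics.mean(means))     # close to the parent mean, 0.5
print(statistics.stdev(means))    # close to parent_sigma / sqrt(N) ~ 0.0289
```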

When dealing with probability distributions, the most recognizable is the normal or Gaussian distribution. This is a continuous distribution characterized by the mean, $\mu$, and variance, $\sigma^2$. The probability density function, which gives the relative likelihood that a value $x$ will occur given $\mu$ and $\sigma^2$, is written as:

$$ P(x) = \frac{1}{\sigma \sqrt{2\pi}} e^{\frac{-\left(x-\mu\right)^2}{2\sigma^2}}$$

The cumulative distribution function, which informs about the probability a value is less than $x$, is given by integrating $P(x)$. When the mean is located at 0 and the variance is 1, the normal distribution is referred to as being standard.  An example of a standard normal distribution curve is given below with the 68% and 95% confidence intervals (i.e. the number of standard deviations from the mean).
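Both the density and its cumulative form can be written with only the standard library, since math.erf gives the CDF in closed form. The 68% and 95% intervals mentioned above correspond to the probability of landing within one and two standard deviations of the mean:

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    # Gaussian probability density function
    return math.exp(-(x - mu)**2 / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

def normal_cdf(x, mu=0.0, sigma=1.0):
    # Cumulative distribution function via the error function
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

# Probability of falling within one and two standard deviations of the mean
within_1 = normal_cdf(1) - normal_cdf(-1)
within_2 = normal_cdf(2) - normal_cdf(-2)
print(round(within_1, 4), round(within_2, 4))  # 0.6827 0.9545
```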


That's all I have for this blog. If there are any comments or additional details you would like to add to this post, please feel free to leave them in the comment box below. And now for my selected quote:

"There are three types of lies: lies, damned lies, and statistics."
 -Benjamin Disraeli
