# Category Archives: Math

## A Math-History Timeline

The history of mathematics can be divided into three periods:

1. The Measurement and Shapes period (<77,000 BC – 600 AD): Mathematics first rose to preeminence with the agricultural revolution (~8,000 BC) as a practical tool for organizing economics and civilization (trade, accounting, taxes, etc.). Only with the Greeks (~600 BC) did mathematics become a pure subject pursued for the purpose of “understanding”. Unfortunately, the Roman Empire did not share the Greeks’ interest in “pure knowledge”, and Europe forgot much of what it had learned for the next 1000 years.

## Kinetics #6: Pulse-Chase Experiments

In our sixth post on Understanding Kinetics, we consider pulse-chase experiments, a common method for studying biosynthetic pathways. In pulse-chase experiments:

1. A “pulse” of labeled metabolite (P) is added to the culture media.
2. The downstream intermediates (P1, P2, P3, etc.) are measured (the “chase”) by mass spectrometry or radiation.
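The flow of label through such a pathway can be sketched numerically. Below is a minimal simulation, assuming a hypothetical linear pathway P → P1 → P2 with first-order steps; the rate constants are illustrative placeholders, not measured values from the post.

```python
# Minimal pulse-chase sketch: label starts in P and flows through a
# linear pathway P -> P1 -> P2, each step first-order.
# k1, k2 are illustrative rate constants (arbitrary units).
def simulate_pulse_chase(k1=0.5, k2=0.2, dt=0.01, t_max=20.0):
    P, P1, P2 = 1.0, 0.0, 0.0   # the "pulse": all label starts in P
    t = 0.0
    trace = []
    while t <= t_max:
        trace.append((t, P, P1, P2))
        dP  = -k1 * P * dt                 # P is consumed
        dP1 = (k1 * P - k2 * P1) * dt      # P1 is made from P, consumed to P2
        dP2 = k2 * P1 * dt                 # P2 accumulates
        P, P1, P2 = P + dP, P1 + dP1, P2 + dP2
        t += dt
    return trace

trace = simulate_pulse_chase()
```

As expected for a pulse-chase, label drains exponentially from P, transiently accumulates in the intermediate P1, and ends up in P2, with total label conserved throughout.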

## Kinetics #5: Molecular Complexes (e.g. drug-target)

In our fifth post on Understanding Kinetics, we consider the speed at which molecular complexes form. This is the fundamental mechanism underlying drug action (i.e. drugs inhibit their targets) and cellular signalling (i.e. ligands activate their receptors), and is probably the most important “kinetic effect” to consider in experimental design. Here again we use previously derived mathematical models1 to define some simple rules for the timescales/half-lives and magnitudes of these reactions (figure above).
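One standard rule of this kind can be sketched directly: when the ligand is in excess (pseudo-first order), the approach to binding equilibrium relaxes with rate k_obs = k_on·[L] + k_off. The specific k_on, k_off, and concentration below are illustrative values, not numbers from the post.

```python
import math

# Pseudo-first-order approach to binding equilibrium for a drug at
# concentration L (in excess) binding its target.
def binding_timescale(k_on, k_off, L):
    k_obs = k_on * L + k_off          # observed relaxation rate (1/s)
    half_life = math.log(2) / k_obs   # time to close half the gap to equilibrium
    Kd = k_off / k_on                 # dissociation constant (M)
    occupancy = L / (L + Kd)          # equilibrium fractional occupancy
    return half_life, occupancy

# e.g. a tight binder: k_on = 1e6 /M/s, k_off = 1e-3 /s, 100 nM drug
t_half, occ = binding_timescale(1e6, 1e-3, 100e-9)
```

Note the intuitive consequence: more drug means both faster equilibration (larger k_obs) and higher occupancy.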

## Kinetics #4: Reversible On-Off States

In our fourth post in the Understanding Kinetics series, we consider the speed at which proteins can turn off (A) or on (B). The dynamics of such processes are important to consider when designing experiments (i.e. How long should I wait to take a measurement?) and when understanding Network Motifs in signalling cascades. Luckily, we can use exact mathematical models (equations below1) for such processes to define some intuitive rules for the timescales/half-lives and magnitudes of these reactions (figure above).

## What is Entropy??

Entropy (S) can best be understood as “the effect of probability on a physical or chemical process”. This relationship is famously described by the Boltzmann entropy formula, which relates the probability of a particular state (P1) to the chemical or mechanical work (ΔG) required to obtain that state.
Entropy changes (ΔS) are not probabilities per se but rather a conceptual bridge between probability and energy. In this equation, k is the Boltzmann constant, T is temperature, P is the probability of the considered state, ΔS is the entropy change, and ΔG is the free energy change.
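Using the symbols defined above, the relation being described can be written out as follows (one common convention; signs depend on how the probability P is defined):

```latex
\Delta S = k \ln P, \qquad \Delta G = -T\,\Delta S = -kT \ln P
```

That is, the less probable a state, the more free energy (work) is required to reach it.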

## Receptor # Threshold for Cell-Cell Adhesion

How do two cells that can adhere decide whether or not they should? Typically, potentially adherent cells become adherent by increasing their adhesion-receptor expression levels (R1/R2) past a certain “threshold” or “EC50” (see figure above). A classic 1984 paper defined this EC50 as a function of R1/R2’s binding constant Ksoln, as illustrated above for two average eukaryotic cells.1,2 This equilibrium model for cellular adhesion is described in more detail below.

## The Maximal “Rate” of Evolution

The timescales over which organisms “mutate” or “evolve” vary dramatically, from months (viruses) to millions of years (animals). The figure above plots the average genome size in base pairs (x-axis) versus the average mutation rate (y-axis) for various organisms. In addition, a second y-axis (“minimum time to 1% mutation”) illustrates the approximate timescale corresponding to each rate.
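The second y-axis is just a rescaling of the first, which can be sketched as a back-of-the-envelope calculation: the minimum time for 1% of sites to mutate is 0.01 divided by the per-site mutation rate. The rates below are rough illustrative orders of magnitude, not the values plotted in the figure.

```python
# Back-of-the-envelope: minimum years for a genome to accumulate
# mutations at 1% of its sites, given a per-site per-year rate.
def years_to_one_percent(rate_per_site_per_year):
    return 0.01 / rate_per_site_per_year

# illustrative orders of magnitude only:
fast_virus = years_to_one_percent(1e-4)   # fast-mutating virus
slow_animal = years_to_one_percent(1e-9)  # slow-mutating animal
```

The ~5-order-of-magnitude spread in rates translates directly into the months-to-millions-of-years spread in evolutionary timescales.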

## Choosing a Statistical Test

The number and type of possible statistical tests in experimental science can be bewildering. Luckily, choosing the right test can be made a lot easier by first considering the question: Is your data categorical or continuous? Categorical data is typically percentage or frequency data binned into two or more categories or names. Continuous data is typically measurement data where there is a well-defined relationship between the values (i.e. numerical data).1,2
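The categorical-vs-continuous fork can be sketched as a tiny decision function. The test names returned are common defaults for each branch, offered as a sketch rather than an exhaustive recommendation.

```python
# Minimal decision sketch for the first fork in choosing a test.
def suggest_test(data_type, n_groups=2):
    if data_type == "categorical":
        # counts/frequencies: compare observed vs expected proportions
        return "chi-squared test (or Fisher's exact for small counts)"
    if data_type == "continuous":
        # measurements: compare group means
        return "t-test" if n_groups == 2 else "ANOVA"
    raise ValueError("data_type must be 'categorical' or 'continuous'")
```

In practice further questions follow (paired or unpaired? normal or not?), but this first split does most of the narrowing.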

## Understanding and Comparing Error Bars

Often, when I look at error bars in figures I am rather confused: “Overlap = bad and no-overlap = good, right?” If this is true, what is the difference between standard deviation bars, standard error bars, and 95% confidence intervals? Chad recently found a paper that does a great job of answering this question (at least for us).1 Below we try to summarize some of the major points, along with a few from other sources.1-3
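To make the relationship between the three bar types concrete, here is how all three are computed from the same sample: SEM = SD/√n, and the 95% CI is roughly ±2 SEM (1.96 for large n; wider for small samples). The data values are made up for illustration.

```python
import math
import statistics

# The three common error bars, computed from one sample.
def error_bars(values):
    n = len(values)
    sd = statistics.stdev(values)    # sample standard deviation
    sem = sd / math.sqrt(n)          # standard error of the mean
    ci95 = 1.96 * sem                # large-n 95% confidence half-width
    return sd, sem, ci95

sd, sem, ci95 = error_bars([9.8, 10.1, 10.0, 10.3, 9.9, 10.2, 10.1, 9.6])
```

This also shows why the bar types behave so differently: SD bars do not shrink with more data, while SEM and CI bars shrink as 1/√n.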

## A Complete Aerobic Model of Marathon Performance

The equation above is one of the simplest and most accurate predictors of marathon performance (having been shown to account for ~70% of the variability in individual marathon performance).1 In the figures below I delve more deeply into each of these terms to gain a better understanding of the molecular determinants of racing performance.

## Understanding Reactivity with Hard-Soft Acid-Base Theory

Hard-Soft Acid-Base (HSAB) theory is one of the most useful rules of thumb for explaining and predicting chemical reactivity trends. Hard molecules tend to be small, non-polarizable, and charged, while soft molecules tend to be large, polarizable, and uncharged. Both acids/electrophiles and bases/nucleophiles can be hard or soft, and the defining reactivity rule of HSAB theory is:

## Understanding Aromaticity based on Molecular Orbital Theory

Interestingly, once you understand the relative energies of linear pi molecular orbitals, the concept of “aromaticity” becomes a lot simpler to understand. For example, cyclizing the frontier molecular orbitals (FMOs) of butadiene gives you the anti-aromatic orbitals of cyclobutadiene. The “geometric arrangement” of these aromatic orbitals is a result of alternating stabilization (in green) or destabilization (in red) due to symmetry match or mismatch, respectively.1

## The Energies of Linear Frontier Molecular Orbitals

The Woodward-Hoffmann rules are some of the most useful rules in organic chemistry. Unfortunately, because these rules are symmetry-based, they mostly ignore the relative energies of the molecular orbitals they consider. Luckily, Hückel theory (on which the Woodward-Hoffmann rules were based) gives a simple, geometric handle on the energies of these orbitals. Understanding these energies is critical for (1) rationalizing non-pericyclic reactivity trends and (2) answering the question: What exactly is aromaticity?
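The Hückel energies in question have a closed form for a linear chain of N p-orbitals: E_j = α + 2β·cos(jπ/(N+1)) for j = 1..N. Here is that formula in code, reporting energies in units of β relative to α (since β < 0, larger cosine values are more stable):

```python
import math

# Huckel pi-MO energies for a linear polyene with N p-orbitals,
# in units of beta relative to alpha: E_j = 2*cos(j*pi/(N+1)).
def huckel_linear(N):
    return [2 * math.cos(j * math.pi / (N + 1)) for j in range(1, N + 1)]

butadiene = huckel_linear(4)   # the four pi MOs of butadiene
```

For butadiene this gives the familiar symmetric ladder of two bonding and two antibonding orbitals (±1.618β, ±0.618β), which sums to zero as Hückel theory requires.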

## Estimating Metabolite Concentrations at Steady State

In a follow-up to our post on sequential biochemical pathways, we next wanted to present a method to approximate the concentration of a metabolic intermediate in a biosynthetic pathway. Under steady-state conditions, the concentration of a metabolite can be estimated from the ratio of the Vmax of the upstream rate-determining enzyme to the rate of decay of that metabolite. A more complete equation is detailed below and further discussed in our post on Estimating Protein/Metabolite Levels from RNAseq data.
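The estimate described above follows from balancing production against first-order decay at steady state. A minimal sketch, with illustrative placeholder values:

```python
# At steady state, production by the upstream rate-determining enzyme
# (taken here at Vmax) balances first-order decay:
#   Vmax = k_decay * [M]   =>   [M]_ss = Vmax / k_decay
def steady_state_conc(vmax, k_decay):
    return vmax / k_decay

# e.g. Vmax = 1e-6 M/s, decay rate constant = 0.01 /s  ->  [M]_ss in M
m_ss = steady_state_conc(vmax=1e-6, k_decay=0.01)
```

The intuition: a metabolite that is made quickly but degraded slowly accumulates to a high steady-state level, and vice versa.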

## Kinetics #3: Branch Points in Biochemical Pathways

In a follow-up to our posts on Intuiting Enzyme Kinetics and Sequential Biochemical Pathways, we next wanted to consider the kinetic curves of branch points in biochemical pathways (or, equivalently, kinetic competitions). Luckily, exact mathematical models exist for competitive first-order processes (see below), and we can use these to develop intuitive rules (figure above) if we consider these pathways to be approximately pseudo-first order (which is often true in the context of biosynthetic pathways).
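The key results for a competitive first-order branch point are compact enough to sketch directly: for A → B (k1) in parallel with A → C (k2), A decays with the summed rate, and the branch ratio is fixed by the rate constants. The rate constants below are illustrative.

```python
import math

# Competitive (pseudo-)first-order branch point:
#   A -> B with rate constant k1,  A -> C with rate constant k2.
def branch_point(k1, k2):
    k_total = k1 + k2
    half_life_A = math.log(2) / k_total  # A disappears with the combined rate
    fraction_B = k1 / k_total            # yield of branch B (time-independent)
    return half_life_A, fraction_B

t_half, f_B = branch_point(0.3, 0.1)   # 3:1 competition favoring B
```

Two intuitive rules fall out: adding a competing branch always speeds up the disappearance of A, and the product ratio B:C equals k1:k2 at all times.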