# Category Archives: Computing

## Conditional Flow: “pit-stops” and “detours”

Analogies are often the best tools to connect a program’s “functional logic” with the “abstract tools” of the programming language. Here, we outline the logical differences between common if-statement structures using a “Driving Analogy” drawn in parallel to a “Flow-Chart” and “Model Python Code.” We choose Python because it is one of the most intuitive and commonly used programming languages in biology.
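The “Driving Analogy” can be sketched directly in Python. This is a minimal illustration (the function and its arguments are made up for this example): an `if` without an `else` is a “pit-stop” that may add a step before the trip continues, while an `if`/`else` is a “detour” where exactly one of two branches is taken.

```python
def drive(fuel_level, road_closed):
    route = ["start"]
    # "pit-stop": a lone if -- we may add a step, but always continue onward
    if fuel_level < 0.25:
        route.append("gas station")
    # "detour": if/else -- exactly one of the two branches is taken
    if road_closed:
        route.append("side streets")
    else:
        route.append("highway")
    route.append("destination")
    return route

print(drive(fuel_level=0.1, road_closed=False))
# ['start', 'gas station', 'highway', 'destination']
```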

## Understanding Binary with Poker Chips

Computers store information with transistors, or “switches,” that have an on (“1”) or off (“0”) state. As a result, they must perform mathematical and logical operations using a binary (base-2) numbering system. For example, as seen above, computers represent the decimal (base-10) number “113” with the binary (base-2) number “1110001”.
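The 113 → 1110001 conversion can be reproduced with the standard repeated-division-by-2 method (the helper function here is just for illustration): divide by 2, keep the remainders, and read them in reverse order.

```python
def to_binary(n):
    """Convert a non-negative decimal integer to its binary string."""
    bits = ""
    while n > 0:
        bits = str(n % 2) + bits  # remainders accumulate right-to-left
        n //= 2
    return bits or "0"

print(to_binary(113))     # '1110001'
print(int("1110001", 2))  # 113 -- Python's built-in converts back to decimal
```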

## Understanding Boolean Logic Gates

Boolean algebra/digital logic is how computer hardware performs computations with a binary numbering system. In addition, Boolean operators (and, or, not, etc.) are critical components of all programming languages. The figure above conceptually summarizes the major Boolean Logic Gates using Venn Diagrams to represent the inputs (top) and the outputs (bottom) that result.
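The same gates can be tabulated with Python’s own Boolean operators. One small wrinkle: Python has no dedicated XOR keyword for Booleans, so `!=` is used here (true exactly when the inputs differ).

```python
# Print the truth table for the major logic gates.
for a in (False, True):
    for b in (False, True):
        print(a, b,
              "AND:", a and b,
              "OR:", a or b,
              "XOR:", a != b,
              "NAND:", not (a and b))
print("NOT True:", not True)  # NOT takes a single input
```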

## Introduction to Super-resolution Microscopy

While microscopy methods with protein- and atom-scale resolution exist that don’t need to break the diffraction barrier (e.g. electron microscopy, atomic force microscopy, X-ray crystallography), they tend to be more practically difficult than fluorescence-based microscopy (see posts on Fluorescent Probes and Fluorescent Antibodies). Therefore, to give fluorescence microscopy nanometer resolution, several super-resolution techniques have been developed that combine clever (1) optical/photophysical and (2) computational-processing tricks to “clean up” the “blurred data.”
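As a rough sketch of the computational half of one such trick (the emitter position, grid size, and centroid estimator below are all illustrative assumptions, not any specific instrument’s pipeline): in localization-based methods, a single fluorophore’s blurred spot spans many pixels, yet its *center* can be estimated to a small fraction of a pixel.

```python
import numpy as np

# Simulate one fluorophore's diffraction-blurred image: a 2-D Gaussian
# spot centered at (12.3, 8.7) on a 21x21 pixel grid (values are made up).
yy, xx = np.mgrid[0:21, 0:21]
true_y, true_x, sigma = 12.3, 8.7, 2.0
spot = np.exp(-((yy - true_y) ** 2 + (xx - true_x) ** 2) / (2 * sigma ** 2))

# Intensity-weighted centroid: recovers the emitter's position far more
# precisely than the ~4-pixel-wide blur would suggest.
total = spot.sum()
est_y = (yy * spot).sum() / total
est_x = (xx * spot).sum() / total
print(est_y, est_x)  # close to 12.3 and 8.7
```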

## What is Principal Component Analysis?

Principal component analysis (PCA) attempts to find true trends hidden in complex data by filtering out noise and redundancy. It does this by treating the data as an n-dimensional shape (where n is the number of measurements in your study), fitting n one-dimensional lines, called “principal components,” to that shape, and ranking these lines by the percentage of data variation that each captures.
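A toy version of this procedure can be written in a few lines of NumPy (the simulated data below is an illustrative assumption: two measurements that mostly track one underlying trend). Centering the data and eigendecomposing its covariance matrix yields the principal components as eigenvectors and the variance each captures as eigenvalues.

```python
import numpy as np

# Simulate 200 samples of two correlated measurements plus a little noise.
rng = np.random.default_rng(0)
t = rng.normal(size=200)
data = np.column_stack([t, 0.5 * t + 0.1 * rng.normal(size=200)])

# 1. Center the data; 2. eigendecompose its covariance matrix.
centered = data - data.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(centered.T))

# Rank components by the fraction of total variation they capture.
order = np.argsort(eigvals)[::-1]
explained = eigvals[order] / eigvals.sum()
print(explained)  # the first component captures nearly all the variation
```

Because the two simulated measurements are redundant (both follow `t`), the first principal component captures almost all the variance and the second is essentially noise, which is exactly the redundancy-filtering behavior described above.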