If you're reading this, I think it's safe to assume you already know how to count... (1, 2, 3, whatever) so what's the big deal about counting?

When we say counting in this context, we mean counting sequences of decisions. For example, we might want to find the total number of ways to choose toppings on a pizza.

There are two main types of counting problems: those where order matters and those where it doesn't.

The First Rule of Counting: When the order matters

Here's a sample problem: let's try to figure out the total number of unique 5-character strings we can make with the letters 'A' through 'E', where repetition is allowed. For instance, 'ABCDE' and 'DABBA' are both valid.

Lots of these types of problems can be visualized using slots, where each slot is one character or option:

To get the total number of ways to fill the slots, we multiply together the number of choices for each individual slot. In our example, each of the 5 slots can hold any of 5 letters, so there are 5 × 5 × 5 × 5 × 5 = 5^5 = 3125 strings.
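As a sanity check, here's a small sketch that computes the product-rule answer and verifies it by brute force (the variable names are just for illustration):

```python
from itertools import product

letters = "ABCDE"
slots = 5

# Product rule: each of the 5 slots can be filled 5 ways.
total = len(letters) ** slots

# Brute-force check: enumerate every 5-character string over 'A'..'E'.
assert total == sum(1 for _ in product(letters, repeat=slots))

print(total)  # 3125
```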

The Second Rule: When the order doesn't matter

Stars and Bars

When we need to split items into groups, it's sometimes nice to add bars that separate the items. This works well when the items are identical (interchangeable) rather than unique; if they're unique, just use slots.

The idea is to treat each item as a star, and to separate the stars with bars that mark where one group ends and the next begins. Splitting n identical items into k groups means placing k − 1 bars among the n stars, which gives C(n + k − 1, k − 1) arrangements.
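A minimal sketch of the formula, with a brute-force check (the function name and the pizza-slice scenario are just for illustration):

```python
from math import comb

def stars_and_bars(n, k):
    """Ways to split n identical items into k (possibly empty) ordered groups."""
    # Choose positions for the k-1 bars among n + k - 1 total symbols.
    return comb(n + k - 1, k - 1)

# e.g. dividing 7 identical pizza slices among 3 friends
print(stars_and_bars(7, 3))  # 36

# Brute-force check: count pairs (a, b) with a + b <= 7; the third
# friend gets the remaining 7 - a - b slices.
assert stars_and_bars(7, 3) == sum(1 for a in range(8) for b in range(8 - a))
```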

The Inclusion-Exclusion Principle

Used to calculate the size of a union of sets (or the probability of a union of events) when the sets overlap. Adding the sizes of the individual sets double-counts the overlap, so we subtract it back out. For two sets: |A ∪ B| = |A| + |B| − |A ∩ B|.
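A quick worked example, assuming a classic setup: counting the integers from 1 to 100 divisible by 2 or by 3. Counting each set alone double-counts multiples of 6, so inclusion-exclusion subtracts them once:

```python
# Integers in 1..100 divisible by 2 or 3, via inclusion-exclusion.
div2 = 100 // 2   # 50 multiples of 2
div3 = 100 // 3   # 33 multiples of 3
div6 = 100 // 6   # 16 multiples of 6, counted twice above
union = div2 + div3 - div6

# Brute-force check over the whole range.
assert union == sum(1 for n in range(1, 101) if n % 2 == 0 or n % 3 == 0)

print(union)  # 67
```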
