Posts

Quantum Computers are Getting Closer To Cracking RSA Encryption

Recent work by Isaac Chuang, an MIT physicist and electrical engineer, illustrates the progress being made toward a quantum computer that could crack the RSA algorithm. Read the article here. Recall that RSA is based on the difficulty of factoring the product of two large primes; the correctness of the scheme rests on Fermat's Little Theorem and Euler's Theorem, which we talked about in class. Quantum computers use qubits instead of ordinary 0-or-1 bits. A qubit can be a mix of both 0 and 1 simultaneously, a state referred to as a superposition. Peter Shor, an MIT math professor, came up with an algorithm to factor large numbers with a quantum computer back in 1994, but testing it has been slow because of the difficulty of building a stable quantum computer. Although I tried to understand how the algorithm works, I think I would need a better understanding of physics to fully grasp it. If you want to take a look, read
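To make the link between RSA and factoring concrete, here is a minimal toy-RSA sketch in Python (my own illustration, not from the article; the primes are tiny and the scheme is deliberately stripped down, so it is in no way secure):

from math import gcd

# Toy RSA with tiny primes, purely to show why factoring n = p*q breaks the scheme.
p, q = 61, 53                # the two secret primes (real RSA uses primes hundreds of digits long)
n = p * q                    # public modulus
phi = (p - 1) * (q - 1)      # Euler's totient of n, easy to compute only if you know p and q
e = 17                       # public exponent, chosen coprime to phi
assert gcd(e, phi) == 1
d = pow(e, -1, phi)          # private exponent: e*d is congruent to 1 mod phi
m = 42                       # a message, encoded as a number smaller than n
c = pow(m, e, n)             # encryption: c = m^e mod n
assert pow(c, d, n) == m     # decryption recovers m, by Euler's theorem

Anyone who can factor \(n\) recovers \(p\) and \(q\), hence \(\varphi(n)\) and the private exponent \(d\); doing that factorization quickly is exactly what Shor's algorithm promises on a large, stable quantum computer.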

What can we say about \(\varphi (mn) \) for \(m\) and \(n\) in general?

In class we proved that if \(m\) and \(n\) are coprime (i.e. \(gcd(m,n)=1\)), then \(\varphi(mn) = \varphi(m)\varphi(n)\). In this article we will discuss what happens for \(m\) and \(n\) in general. In other words, given \(gcd(m,n)=d\) for some \(d\in\mathbb{N}\), what can we say about \(\varphi(mn)\)? Let's first write out the prime factorizations of \(m\) and \(n\): $$m = p_1^{e_1}p_2^{e_2}\cdots p_k^{e_k}$$ $$n = q_1^{r_1}q_2^{r_2}\cdots q_j^{r_j}$$ Consider \(gcd(m,n)=d \geq 2\). Then \(m\) and \(n\) have at least one common prime factor. Assume the factorizations above are written so that the common prime factors of \(m\) and \(n\) come first and line up, i.e. \(p_i = q_i\) for all \(1\leq i \leq c\), for some \(c\leq \min(j,k)\). Then the prime factors of the common divisor \(d\) are exactly the \(p_i\) with \(1\leq i \leq c\). Using the product formula \(\varphi(N)=N\prod_{p\mid N}\left(1-\frac{1}{p}\right)\), and noting that each common prime \(p_i\) is counted once in the product for \(mn\) but twice in the products for \(m\) and \(n\) taken separately, we get $$\begin{eqnarray} \varphi(mn) &=& mn\prod_{p\mid mn}\left(1-\frac{1}{p}\right)\\ &=& \varphi(m)\,\varphi(n)\prod_{i=1}^{c}\left(1-\frac{1}{p_i}\right)^{-1}\\ &=& \varphi(m)\,\varphi(n)\cdot\frac{d}{\varphi(d)} \end{eqnarray}$$ since \(\varphi(d)=d\prod_{i=1}^{c}\left(1-\frac{1}{p_i}\right)\). So in general, \(\varphi(mn)=\varphi(m)\varphi(n)\cdot\frac{d}{\varphi(d)}\) where \(d=gcd(m,n)\), which reduces to the familiar \(\varphi(mn)=\varphi(m)\varphi(n)\) when \(d=1\).
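As a quick sanity check (an addition of mine, not part of the original post), the following short Python snippet verifies \(\varphi(mn)\,\varphi(d)=\varphi(m)\,\varphi(n)\,d\) for all small \(m\) and \(n\), using a naive totient written just for this test:

from math import gcd

def phi(n):
    # naive Euler totient: count the integers in [1, n] that are coprime to n
    return sum(1 for a in range(1, n + 1) if gcd(a, n) == 1)

for m in range(1, 51):
    for n in range(1, 51):
        d = gcd(m, n)
        # cross-multiplied form of phi(mn) = phi(m) phi(n) d / phi(d), to stay in integers
        assert phi(m * n) * phi(d) == phi(m) * phi(n) * d

print("identity checked for all m, n up to 50")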

"Pyramid Numbers" Fun Fact!

Similar to the set-up for the triangular numbers, the \(n\)th "pyramid number," \(P_n\), is the number of dots needed to build a pyramid with a triangular base that has \(n\) dots per side. And here is a fun fact about \(P_n\): $$\sum_{n=1}^{\infty}{\frac{1}{P_n}} = \frac{3}{2}$$ To prove this equality, we first need some kind of formula for \(P_n\) so that we can do something with the LHS. As you may have found in the homework, the pyramid numbers are the sequence sitting right next to the triangular numbers in Pascal's triangle. Therefore we know a general formula for \(P_n\): $$P_n = {n+2\choose 3} = \frac{(n+2)\cdot (n+1)\cdot n}{3!}$$ Now we can rewrite the LHS and try to split the sum: $$\begin{eqnarray} \sum_{n=1}^{\infty}{\frac{1}{P_n}} &=& \sum_{n=1}^{\infty}{\frac{1}{\left(\frac{(n+2)\cdot (n+1)\cdot n}{3!}\right)}}\\ &=&3!\sum_{n=1}^{\infty}{\frac{1}{(n+2)\cdot (n+1)\cdot n}}\\ &=&3!\cdot\frac{1}{2}\sum_{n=1}^{\infty}\left(\frac{1}{n(n+1)}-\frac{1}{(n+1)(n+2)}\right)\\ &=&3!\cdot\frac{1}{2}\cdot\frac{1}{1\cdot 2}\\ &=&\frac{3}{2} \end{eqnarray}$$ The third line uses the partial-fraction split \(\frac{1}{n(n+1)(n+2)}=\frac{1}{2}\left(\frac{1}{n(n+1)}-\frac{1}{(n+1)(n+2)}\right)\), and the resulting sum telescopes: each term cancels against part of the next, leaving only the very first term \(\frac{1}{1\cdot 2}\).
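Just to see the convergence numerically (my own quick check, not part of the original post), here is a short Python snippet that computes a partial sum of the series:

from math import comb

def P(n):
    # nth pyramid (tetrahedral) number: the entry C(n+2, 3) in Pascal's triangle
    return comb(n + 2, 3)

partial = sum(1 / P(n) for n in range(1, 10001))
print(partial)   # prints roughly 1.49999997, closing in on 3/2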

Parallelogram Identity Proof (Using Stars and Bars Technique)

The parallelogram identity in Pascal's triangle states that, given an entry in Pascal's triangle, that entry minus 1 equals the sum of the entries contained in the rectangle (parallelogram) that sits directly above it and has the apex of Pascal's triangle as its opposite corner. Writing the entry as \({n+1 \choose k+1}\), this equality can be expressed as a double sum as follows: $${n+1 \choose k+1}-1=\sum_{i=0}^{k}\sum_{j=1}^{n-k}{n-i-j \choose k-i}$$ We can prove this using the stars and bars technique that we are familiar with. To make the explanation easier, let's first rewrite the double sum above in another (but mathematically equivalent) form. Setting \(r = n-k\), we have $${r+k+1 \choose k+1}-1=\sum_{i=0}^{k}\sum_{j=1}^{r}{(k-i)+(r-j) \choose k-i}$$ Now suppose we have \(k+1\) stars and \(r\) bars. As we already know, \({r+k+1 \choose k+1}\) counts the number of different star-bar sequences. This suggests that we can try to interpret the RHS as another way to count almost all of these sequences, with exactly one sequence left over to account for the \(-1\) on the LHS.
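As a quick numerical sanity check (my addition, not part of the original argument), the following Python snippet verifies the double-sum identity for all small \(n\) and \(k\):

from math import comb

# check  C(n+1, k+1) - 1 == sum over i in [0, k], j in [1, n-k] of C(n-i-j, k-i)
for n in range(1, 15):
    for k in range(n):
        rhs = sum(comb(n - i - j, k - i)
                  for i in range(k + 1)
                  for j in range(1, n - k + 1))
        assert comb(n + 1, k + 1) - 1 == rhs

print("parallelogram identity verified for all n < 15")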

Common counting techniques

Before we go too far with proving various clever binomial coefficient identities, let's try counting some things with them.

Stars and bars

We know that there are "n choose k" ways to pick k children out of a group of n children (assuming we can tell the children apart). But what if we add an indistinguishable element to the story? Unlike our distinguishable children, let's say we have some identical candies to hand out. One natural question is: How many ways are there to hand out k indistinguishable candies to n distinguishable children? Here's where the stars and bars come in. Let's say we have 3 children and 8 candies. We can think of the candies as the "stars." The 3 children can be thought of as candy receptacles (sorry, kids), distinguished only by their order from left to right. So here's one way we could distribute the candy: ***|****|* In this case, the first child/candy receptacle gets 3 candies, the second gets 4, and the third gets 1.
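To tie the picture to an actual count (a small illustration of my own, not from the original post), here is a Python snippet that brute-forces the number of ways to hand out 8 identical candies to 3 children and compares it with the stars-and-bars count \({8+3-1 \choose 3-1}\):

from math import comb
from itertools import product

candies, children = 8, 3

# brute force: try every way of assigning 0..8 candies to each child and keep the ones that use all 8
brute = sum(1 for split in product(range(candies + 1), repeat=children) if sum(split) == candies)

print(brute)                                         # 45
print(comb(candies + children - 1, children - 1))    # 45, i.e. C(10, 2), the stars-and-bars count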

Common combinatorial proof set-ups

One good way to think about combinatorial proofs in general is something like this:

Goal: Prove that quantity A and quantity B are equal.
Step 1: Think of some question to which A is obviously the answer.
Step 2: Show that B also gives the answer to the question.

Or perhaps like this:

Goal: Prove that quantity A and quantity B are equal.
Step 1: Think of some set of size A. Think of some different set of size B.
Step 2: Find a bijective (one-to-one) correspondence between the two sets.

In both set-ups, most of the creativity is in that second step. That's the hard part, but it's also the part that gives us all the insight. Coming up with an explanation of why B counts the same thing A does is exactly what helps us understand why the two quantities are the same. Let's spend a little more time with binomial coefficients to gain some more practice with combinatorial proofs. Here are a few identities. Take some time to think about them.
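As a small illustration of both set-ups (my own example, not one of the identities from the post), consider the symmetric identity $${n \choose k} = {n \choose n-k}$$ Using the first set-up: ask "how many ways are there to choose \(k\) children out of \(n\)?" The left-hand side is obviously the answer, and the right-hand side answers the same question because choosing which \(k\) children to include is the same as choosing which \(n-k\) children to leave out. Using the second set-up: the \(k\)-element subsets of an \(n\)-element set form a set of size \({n \choose k}\), the \((n-k)\)-element subsets form a set of size \({n \choose n-k}\), and taking complements gives a bijection between the two.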

Combinatorial proofs: intro and overview

Combinatorics is an extremely broad field, applications of which appear all over the place. Some fields, like discrete probability and computer science, simply *could not even exist* without combinatorics. In putting together a single-semester combinatorics course, I wanted to have some kind of narrative throughout so that we weren't just talking about a random selection of cute problems in the field. As a result, what we'll be focusing on is what is called "combinatorial proof," also known as "proof by two-way counting," "counting proof," and a variety of other aliases. The idea of a counting proof is that it counts something. This might sound silly, so here's an example of the difference between a combinatorial proof and a non-combinatorial proof of the same thing.

Binomial coefficients

You might have seen binomial coefficients before. They look like this: $${n \choose k}$$ (read "n choose k") and they represent the number of ways to choose \(k\) objects out of a collection of \(n\) objects.
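As a concrete check of that definition (a small addition of mine, not from the post), here is a tiny Python snippet that computes \({n \choose k}\) both from the factorial formula and by literally counting the \(k\)-element subsets:

from math import factorial, comb
from itertools import combinations

n, k = 7, 3
formula = factorial(n) // (factorial(k) * factorial(n - k))   # the algebraic formula n! / (k!(n-k)!)
counted = sum(1 for _ in combinations(range(n), k))           # count the 3-element subsets of a 7-element set
print(formula, counted, comb(n, k))                           # all three print 35

The two computations agreeing is, in miniature, the spirit of two-way counting: the same quantity reached once by algebra and once by counting.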