
Number of Primes

Anderson's Theorem

(a) The number of primes in [1,n] is no more than 2+floor(n/2).

The probability of n being prime when n is not even is 1/2 - see Dasgupta, Papadimitriou, Vazirani, "Algorithms", page 26. Therefore, E(pi(n)) is n/2.

(b) There is no set of adjacent primes other than {1,2,3}.

5: 2 + floor(5/2) = 2 + 2 = 4 => {1,2,3,5} : 4 <= 4
7: 2 + floor(7/2) = 2 + 3 = 5 => {1,2,3,5,7} : 5 <= 5
11: 2 + floor(11/2) = 2 + 5 = 7 => {1,2,3,5,7,11} : 6 <= 7
26: 2 + floor(26/2) = 2 + 13 = 15 => {1,2,3,5,7,11,13,17,19,23} : 10 <= 15
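The checks above can be run numerically. This is a sketch, not part of the original post; trial_is_prime is a hypothetical helper that follows the post's convention of counting 1 as prime:

```python
def trial_is_prime(p):
    # Trial division; the post counts 1 as prime, so it is accepted here too.
    if p in (1, 2, 3):
        return True
    return all(p % j for j in range(2, int(p ** 0.5) + 1))

def anderson_bound(n):
    # The bound from part (a): 2 + floor(n/2)
    return 2 + n // 2

for n in (5, 7, 11, 26):
    count = sum(trial_is_prime(i) for i in range(1, n + 1))
    print(n, ":", count, "<=", anderson_bound(n))
```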

Lagrange's Theorem is Inaccurate

Lagrange's theorem about primes concerns pi(x), the number of primes <= x. It says that pi(x) is approximately x/ln(x), and he postulated that the limit of pi(x)/(x/ln(x)) as x -> infinity is 1. This is incorrect. If the number of primes is bounded by n/2, then substituting that bound and reducing Lagrange's Theorem results in the limit of ln(x)/2 as x approaches infinity. This is always infinity.

Lagrange's theorem on some tests:

5: 5 / ln(5) = 3.11, incorrect
7: 7 / ln(7) = 3.6, incorrect
11: 11 / ln(11) = 4.58, incorrect
26: 26 / ln(26) = 7.98, incorrect
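Those estimates come straight from x/ln(x); a quick sketch to reproduce them with the standard library (rounding may differ slightly from the truncated figures above):

```python
import math

# Lagrange-style estimate x / ln(x) for the same test values as above
for n in (5, 7, 11, 26):
    print(n, ":", round(n / math.log(n), 2))
```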

Using Anderson's Theorem, the number of primes in [1,74] is 2 + floor(74/2) = 39. Lagrange says there are only int(74/ln(74)) = 17.

Fun With Primes

Let's make some primes to test my theory against Lagrange's. The goal is to find all of the primes between 1 and N, using Lagrange's estimate as the stopping criterion versus mine.

import sys

import numpy as np


def is_prime(p):
    # The post counts 1 as prime, so 1, 2, and 3 are accepted outright.
    if p == 1 or p == 2 or p == 3:
        return True
    for j in range(2, int(np.sqrt(p) + 0.5) + 1):
        if p % j == 0:  # same test as the original p - (p // j) * j == 0
            return False
    return True


I know you Cambridge Maths nerds say that '1' is not prime, but I disagree, so I am putting it into the list of primes. You can hate on that later.

if __name__ == "__main__":
    n_max = int(sys.argv[1])     # renamed from max to avoid shadowing the builtin
    n_primes = n_max // 2 + 2    # Anderson's bound: 2 + floor(n/2)
    primes = []
    for i in range(1, n_max + 1):
        if is_prime(i):
            primes.append(i)
            if len(primes) == n_primes:
                break

    print("found", len(primes), "primes. Lagrange predicted", int(n_max / np.log(n_max)), ", I predicted", n_primes)
    print(primes)

Now run that in python and you get the list of prime numbers between 1 and the first command line argument, which is N for argument's sake.

"python mkprime.py 26"

found 10 primes. Lagrange predicted 7 , I predicted 15
me: [1, 2, 3, 5, 7, 11, 13, 17, 19, 23]
Lagrange: [1, 2, 3, 5, 7, 11, 13 ]

Obviously, Lagrange underestimated the number of primes, and we missed some crucial primes up to 26. Primes are very important numbers, so we don't want to miss any. They are also very expensive to compute, so we don't want to compute more than we know exist.

Suppose we have a candidate number and the last known prime adjacent to it. How do we know that the candidate is really prime? Is there a rule to follow? I really don't know, but what I did find is that there is a pattern to the placement of primes on the number line. They are not random.

    diffs = [0]  # the first prime has no predecessor
    for i in range(1, len(primes)):
        diffs.append(primes[i] - primes[i - 1])
    print(diffs)

Add that to your python, and run it again:

found 10 primes. Lagrange predicted 7 , I predicted 15
[1, 2, 3, 5, 7, 11, 13, 17, 19, 23]
[0, 1, 1, 2, 2, 4, 2, 4, 2, 4]

Hmmm. After '3' the diff is 2 or 4 from the last adjacent prime. Let's shoot it farther out ...

"python mkprime.py 2000"

the diff:
[0, 1, 1, 2, 2, 4, 2, 4, 2, 4, 6, 2, 6, 4, 2, 4, 6, 6, 2, 6, 4, 2, 6, 4, 6, 8, 4, 2, 4, 2, 4, 14, 4, 6, 2, 10, 2, 6, 6, 4, 6, 6, 2, 10, 2, 4, 2, 12, 12, 4, 2, 4, 6, 2, 10, 6, 6, 6, 2, 6, 4, 2, 10, 14, 4, 2, 4, 14, 6, 10, 2, 4, 6, 8, 6, 6, 4, 6, 8, 4, 8, 10, 2, 10, 2, 6, 4, 6, 8, 4, 2, 4, 12, 8, 4, 8, 4, 6, 12, 2, 18, 6, 10, 6, 6, 2, 6, 10, 6, 6, 2, 6, 6, 4, 2, 12, 10, 2, 4, 6, 6, 2, 12, 4, 6, 8, 10, 8, 10, 8, 6, 6, 4, 8, 6, 4, 8, 4, 14, 10, 12, 2, 10, 2, 4, 2, 10, 14, 4, 2, 4, 14, 4, 2, 4, 20, 4, 8, 10, 8, 4, 6, 6, 14, 4, 6, 6, 8, 6, 12, 4, 6, 2, 10, 2, 6, 10, 2, 10, 2, 6, 18, 4, 2, 4, 6, 6, 8, 6, 6, 22, 2, 10, 8, 10, 6, 6, 8, 12, 4, 6, 6, 2, 6, 12, 10, 18, 2, 4, 6, 2, 6, 4, 2, 4, 12, 2, 6, 34, 6, 6, 8, 18, 10, 14, 4, 2, 4, 6, 8, 4, 2, 6, 12, 10, 2, 4, 2, 4, 6, 12, 12, 8, 12, 6, 4, 6, 8, 4, 8, 4, 14, 4, 6, 2, 4, 6, 2, 6, 10, 20, 6, 4, 2, 24, 4, 2, 10, 12, 2, 10, 8, 6, 6, 6, 18, 6, 4, 2, 12, 10, 12, 8, 16, 14, 6, 4, 2, 4, 2, 10, 12, 6, 6, 18, 2, 16, 2, 22, 6, 8, 6, 4, 2]

Whoa. Past the first few entries, the difference between two adjacent primes is always even. Let's histogram that, and also let's make that first element '1' instead of '0', because 1 is '1' away from 0, just to make you Cambridge Maths nerds even more frustrated. Here's the histogram for the primes up to 26.

1 : 3
2 : 4
4 : 3
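The tally above can be produced with collections.Counter; a sketch, using the prime list from the earlier run:

```python
from collections import Counter

# primes up to 26, with the post's convention that 1 is prime
primes = [1, 2, 3, 5, 7, 11, 13, 17, 19, 23]
# first diff set to 1 (1 is '1' away from 0), as described above
diffs = [1] + [primes[i] - primes[i - 1] for i in range(1, len(primes))]
hist = Counter(diffs)
for gap in sorted(hist):
    print(gap, ":", hist[gap])
```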

How about the primes up to 10,000? Here is the histogram of the differences between adjacent primes:

1 : 3
2 : 205
4 : 202
6 : 299
8 : 101
10 : 119
12 : 105
14 : 54
16 : 33
18 : 40
20 : 15
22 : 16
24 : 15
26 : 3
28 : 5
30 : 11
32 : 1
34 : 2
36 : 1

Lemma: The difference between any two primes that are larger than 3 is always a multiple of 2.

In fact, the most common gap between adjacent primes is 6. This suggests a nifty algorithm for finding primes: starting with the first prime you find, just check every other number. What's even more interesting is the distribution of prime gaps in [1,1e6], shown in the following figure.
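Since every gap past 3 is even, a search only ever needs to test odd candidates. Here's a sketch of that check-every-other-number idea (primes_up_to is a hypothetical helper, keeping the post's convention that 1 is prime):

```python
def primes_up_to(n):
    # Start with the post's convention that 1 counts as prime.
    found = [p for p in (1, 2, 3) if p <= n]
    candidate = 5
    while candidate <= n:
        # Odd candidates only need odd trial divisors.
        if all(candidate % j for j in range(3, int(candidate ** 0.5) + 1, 2)):
            found.append(candidate)
        candidate += 2  # gaps between odd primes are even, so skip the evens
    return found

print(primes_up_to(26))
```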

This figure looks almost like a Poisson probability mass function with lambda=4.

What it really looks like is the nearest-neighbor routing probability distribution function in Jung, Haejoon & Lee, In-Ho (2018), "Performance Analysis of Millimeter-Wave Multi-hop Machine-to-Machine Networks Based on Hop Distance Statistics", Sensors (Basel, Switzerland), 18, doi:10.3390/s18010204. See equation 16 and Figure 3 in that paper. The red line in that paper's Figure 3 matches the shape of the prime-gap distribution almost exactly.

Every n-digit prime number, where n > 1, ends in 1, 3, 7, or 9. The likelihood of each of these digits is almost even: 1 and 9 are equally likely, and 3 and 7 are equally likely, with 3 and 7 having a slightly higher chance in the [10,1e8] number range.

1 : 1440298 - 24.99 %
3 : 1440473 - 25.0 %
7 : 1440494 - 25.0 %
9 : 1440186 - 24.99 %
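Here's a sketch of how that last-digit tally can be reproduced. To keep it quick it scans [10, 10000) rather than the post's [10,1e8], so the counts differ from the table above, but the near-even split is already visible:

```python
from collections import Counter

def trial_is_prime(p):
    # Standard trial division; the scan below starts at 10, so 1 never comes up.
    if p < 2:
        return False
    return all(p % j for j in range(2, int(p ** 0.5) + 1))

# Tally the last digit of every multi-digit prime in the range.
last = Counter(str(p)[-1] for p in range(10, 10000) if trial_is_prime(p))
total = sum(last.values())
for d in "1379":
    print(d, ":", last[d], "-", round(100 * last[d] / total, 2), "%")
```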

When I ran up to 1e8, the distance graph turned up a quirk: the distance of 132 appeared exactly 132 times. The following table shows the distribution of distances between adjacent primes in [1,1e8].



The Second Hardy-Littlewood Conjecture

https://en.wikipedia.org/wiki/Second_Hardy%E2%80%93Littlewood_conjecture

I learned about this conjecture today. It states that the number of primes in [1,A] plus the number of primes in [1,B] is at least the number of primes in [1,A+B], i.e., pi(A) + pi(B) >= pi(A+B). Using my formula to find the number of primes, let's test this conjecture. The following is a rudimentary demonstration, not necessarily a proof, because there is no proof that my theorem is accurate. It's more accurate than Lagrange's method, though. Someone else agrees that this conjecture is true: https://arxiv.org/abs/2101.03283
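The conjecture can be checked numerically for small A and B. This sketch uses a sieve and the standard pi, which does not count 1; it is an illustration of the inequality, not a proof:

```python
def prime_pi_table(limit):
    # Sieve of Eratosthenes, then a running count of primes up to each n
    # (1 is not counted, matching the standard definition of pi).
    sieve = [True] * (limit + 1)
    sieve[0] = sieve[1] = False
    for p in range(2, int(limit ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p::p] = [False] * len(sieve[p * p::p])
    table, count = [], 0
    for flag in sieve:
        count += flag
        table.append(count)
    return table

pi = prime_pi_table(400)
# Look for any (A, B) with pi(A) + pi(B) < pi(A + B).
violations = [(a, b) for a in range(2, 201) for b in range(2, 201)
              if pi[a + b] > pi[a] + pi[b]]
print(len(violations))  # no violations in this range
```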










