PRNG + LCG chapters #1016

Draft · wants to merge 1 commit into base: main
3 changes: 3 additions & 0 deletions SUMMARY.md
@@ -23,6 +23,9 @@
* [Box Muller Transform](contents/box_muller/box_muller.md)
* [How costly is rejection sampling?](contents/box_muller/box_muller_rejection.md)
* [Probability Distributions](contents/probability_distributions/distributions.md)
* [Random Number Generation](contents/RNG/RNG.md)
* [Diehard Tests](contents/diehard/diehard.md)
* [Linear Congruential Generators](contents/LCG/LCG.md)
* [Tree Traversal](contents/tree_traversal/tree_traversal.md)
* [Euclidean Algorithm](contents/euclidean_algorithm/euclidean_algorithm.md)
* [Monte Carlo](contents/monte_carlo_integration/monte_carlo_integration.md)
Empty file added contents/LCG/LCG.md
Empty file.
61 changes: 61 additions & 0 deletions contents/RNG/RNG.md
@@ -0,0 +1,61 @@
# Random Number Generation

Quick, try to think of a number between 1 and 10.

Ok, do you have your number?

Was it 7?

Probably not, right? I mean, there is no magic here.
I put down 7 (seemingly at random), and you chose your number seemingly at random.
If we were both choosing uniformly at random, there would be about a 10% chance that we picked the same number; however, that is not what the statistics show.

For some reason, we naturally choose 7 more than any other integer. The exact probability of choosing 7 is currently up for debate, ranging from roughly 25% {{ "RNG_reddit" | cite }} to 45% {{ "RNG_Numberphile" | cite }}, and there are some interesting papers on related topics you might want to check out as well {{ "Shepard197582" | cite }} {{ "navarro2008latent" | cite }}.

My point is that we, as humans, are really bad at coming up with random numbers.
Computers, on the other hand, are some of the most sophisticated pieces of technology we have ever built.
They sometimes have trillions of transistors, all working to provide answers to fundamental questions in science and technology.
Surely they are good at creating random numbers, right?

Yeah, not really. Like, they are fine, but they are machines built by people who want precise answers.
Randomness is not usually something people want when trying to add $$2+2$$.

So then what do we do?
Well, white noise (static) is truly random [CITE] and comes about all the time in nature, so we could just put a microphone somewhere and collect random numbers from the world around us.
The problem is that nature is kinda slow at doing this and we want really fast simulations, so it's not in our best interest to just chill by a pond fishing for numbers.
So instead let's come up with complex algorithms that *simulate* randomness as best as we can.
From here on, rather than talking about generating truly random numbers, we will discuss creating "random enough" *pseudo* random numbers.
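
To get a feel for what such an algorithm looks like, here is a minimal sketch of one of the simplest PRNGs, a linear congruential generator, which gets its own chapter later on. The class name and the specific modulus, multiplier, and increment below are illustrative choices, not a recommendation for serious use.

```python
# A minimal linear congruential generator. Every value comes from the
# recurrence state = (a * state + c) mod m, so the same seed always
# produces the same "random looking" sequence.
class TinyLCG:
    def __init__(self, seed=12345):
        self.m = 2**32          # modulus
        self.a = 1664525        # multiplier (example constant)
        self.c = 1013904223     # increment (example constant)
        self.state = seed % self.m

    def next_int(self):
        # Advance the internal state deterministically.
        self.state = (self.a * self.state + self.c) % self.m
        return self.state

    def next_float(self):
        # Scale the integer into [0, 1) so it looks like a random fraction.
        return self.next_int() / self.m


if __name__ == "__main__":
    rng = TinyLCG(seed=42)
    print([round(rng.next_float(), 3) for _ in range(5)])
```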

In the literature, you might see Random Number Generation shortened to RNG and Pseudo Random Number Generation shortened to PRNG.
The two are often used interchangeably, but there is a small distinction to be made.
Namely, RNG refers to the entire process of generating random numbers, while PRNG refers to the subset of methods that use a computer to do so.
In certain fields, other terms are used, but these will be covered on a case-by-case basis.

In these chapters, we'll do our best to cover a bunch of different algorithms and their specific use cases.
Before doing that, we'll just talk about two things:
1. Why we want randomness to begin with
2. How to test for randomness

## Why do we want random numbers?

Random numbers show up all over the place; a few of the big use cases are:

* Monte Carlo methods and simulations of natural processes
* Computer graphics, including iterated function systems (IFS)
* Cryptographic hashing

Note that these applications need different qualities of RNG; a cryptographic key has far stricter requirements than a quick simulation. As a taste of the first use case, a small Monte Carlo sketch is shown below.
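
To make the Monte Carlo use case concrete, here is a minimal sketch that estimates $$\pi$$. It uses Python's built-in `random` module as a stand-in for whatever PRNG a real simulation would use; the sample count is an arbitrary choice.

```python
import random


def estimate_pi(samples=1_000_000):
    # Throw points uniformly into the unit square and count how many land
    # inside the quarter circle of radius 1 centered at the origin.
    inside = 0
    for _ in range(samples):
        x = random.random()
        y = random.random()
        if x * x + y * y <= 1.0:
            inside += 1
    # The quarter circle covers pi/4 of the square, so scale the hit
    # fraction by 4 to recover an estimate of pi.
    return 4 * inside / samples


if __name__ == "__main__":
    print(estimate_pi())  # roughly 3.14; more samples gives a better estimate
```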


## Testing for Randomness

One of the simplest checks is a histogram test: generate a large sample of numbers, sort them into bins, and check whether every bin receives roughly its expected share, as sketched below.
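
Here is a minimal sketch of such a test, again using Python's built-in `random` module as the generator under inspection; the bin count and sample size are arbitrary choices.

```python
import random
from collections import Counter


def histogram_test(generator, samples=100_000, bins=10):
    # Draw values in [0, 1), assign each one to a bin, and tally the hits.
    counts = Counter(int(generator() * bins) for _ in range(samples))
    expected = samples / bins
    for b in range(bins):
        # For a good uniform generator, every bin should land close to the
        # expected count; large, systematic deviations are a red flag.
        print(f"bin {b}: {counts[b]} (expected ~{expected:.0f})")


if __name__ == "__main__":
    histogram_test(random.random)
```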

### Bibliography

{% references %} {% endreferences %}

<script>
MathJax.Hub.Queue(["Typeset",MathJax.Hub]);
</script>

14 changes: 14 additions & 0 deletions contents/diehard/diehard.md
@@ -0,0 +1,14 @@
# Diehard Tests

## Birthday spacings
## Overlapping permutations
## Ranks of matrices
## Monkey tests
## Count the 1s
## Parking lot test
## Minimum distance test
## Random spheres test
## The squeeze test
## Overlapping sums test
## Runs test
## The craps test
40 changes: 40 additions & 0 deletions literature.bib
@@ -541,3 +541,43 @@ @misc{box_muller_wiki
url={https://en.wikipedia.org/wiki/Box%E2%80%93Muller_transform},
year={2022}
}

#------------------------------------------------------------------------------#
# RNG
#------------------------------------------------------------------------------#

@misc{RNG_reddit,
title={Asking over 8500 students to pick a random number from 1 to 10 [OC]},
url={https://www.reddit.com/r/dataisbeautiful/comments/acow6y/asking_over_8500_students_to_pick_a_random_number/},
year={2023}
}

@misc{RNG_Numberphile,
title={Random Numbers - Numberphile},
url={https://www.youtube.com/watch?v=SxP30euw3-0},
year={2013}
}

@article{Shepard197582,
title = {The internal representation of numbers},
journal = {Cognitive Psychology},
volume = {7},
number = {1},
pages = {82--138},
year = {1975},
issn = {0010-0285},
doi = {10.1016/0010-0285(75)90006-7},
url = {https://www.sciencedirect.com/science/article/pii/0010028575900067},
author = {Roger N Shepard and Dan W Kilpatric and James P Cunningham},
}

@article{navarro2008latent,
title={Latent features in similarity judgments: A nonparametric Bayesian approach},
author={Navarro, Daniel J and Griffiths, Thomas L},
journal={Neural computation},
volume={20},
number={11},
pages={2597--2628},
year={2008},
publisher={MIT Press}
}