submitted 1 year ago* (last edited 1 year ago) by van2z@programming.dev to c/programming@programming.dev

I recently came up with an algorithm inspired by the natural selection of DNA mutations. Being unclear about its time complexity, I decided to write it up and test it out. The result was counterintuitive.

First, a simpler variant. Imagine the following sorting algorithm, given an array of numbers: randomly shuffle the numbers until they are sorted.

What is the expected number of times you end up shuffling the array before it is sorted?

The answer is n!, where n is the length of the array. This is known as Bogosort.
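A minimal sketch of that full-shuffle approach in Python, just to make the procedure concrete (the helper names `is_sorted` and `bogosort` are mine):

```python
import random

def is_sorted(arr):
    """Return True if arr is in non-decreasing order."""
    return all(arr[i] <= arr[i + 1] for i in range(len(arr) - 1))

def bogosort(arr):
    """Shuffle the whole array until it happens to be sorted.

    Returns the number of shuffles it took.
    """
    shuffles = 0
    while not is_sorted(arr):
        random.shuffle(arr)
        shuffles += 1
    return shuffles
```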

Now, a slightly more advanced version:

Assume you can tell which numbers are already in their correctly sorted position, and which aren't. Only randomly shuffle the numbers which are not yet sorted. Keep all the others in place.

What is the expected number of times you end up doing this kind of shuffling before the whole array is sorted?

I'll be revealing the answer tomorrow.

Edit: The answer is that you end up shuffling only n times.
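Here is a rough simulation of this keep-the-sorted-ones variant, assuming we are allowed to compare against the fully sorted target (exactly the extra knowledge acknowledged in the note below); the function name is mine and the trial counts are arbitrary:

```python
import random

def selective_shuffle_sort(arr):
    """Repeatedly shuffle only the elements that are not yet in their
    final sorted position, and count how many shuffles it takes.

    Note the cheat: we compare against the fully sorted target, which is
    the extra knowledge this thought experiment assumes.
    """
    target = sorted(arr)
    shuffles = 0
    while arr != target:
        # Positions whose current value differs from the sorted target.
        wrong = [i for i in range(len(arr)) if arr[i] != target[i]]
        values = [arr[i] for i in wrong]
        random.shuffle(values)
        for i, v in zip(wrong, values):
            arr[i] = v
        shuffles += 1
    return shuffles

# Rough check: the average shuffle count grows roughly linearly with n.
for n in (5, 10, 20, 40):
    runs = [selective_shuffle_sort(random.sample(range(n), n)) for _ in range(200)]
    print(n, sum(runs) / len(runs))
```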

Addressing some concerns: this is not a real sorting algorithm, because it assumes you already know which elements are in their correct sorted position.

It shows me how "random chance" in DNA mutations can actually produce results much faster than we might expect. When a better gene helps an organism survive, it sticks around, while all of the other, less useful genes keep shuffling randomly until they become useful too. This way, organisms with a long DNA code can still evolve rather quickly, even if it is by random chance.

Arete@lemmy.world 1 point 1 year ago

This is interesting. Naively I'd expect that each number in a shuffled array of length n has a 1/n chance of ending up in the correct position, so there ought to be on average 1 correctly placed number regardless of the array length. I might be neglecting a correlation here, where each incorrectly placed number decreases the odds for the remaining numbers. Assuming the above though, the whole problem becomes recursive, since we'd be left with an unsorted array of length n-1 on average. The expected number of sorts would then just be n. For the time complexity, we'd have O(n) for the original shuffle, plus O(n-1) for the next, and so on, yielding O(n^2) overall.
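For what it's worth, the "one correctly placed number on average" part holds exactly: by linearity of expectation, a uniformly random shuffle leaves exactly one element in place on average, correlations notwithstanding. A quick sketch to check that numerically (the function name is mine):

```python
import random

def average_fixed_points(n, trials=100_000):
    """Estimate how many elements a uniformly random shuffle of
    range(n) leaves in their original position, on average."""
    total = 0
    for _ in range(trials):
        perm = list(range(n))
        random.shuffle(perm)
        total += sum(1 for i, v in enumerate(perm) if i == v)
    return total / trials

print(average_fixed_points(10))  # prints a value close to 1.0
```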

