[-] uis@lemm.ee 1 points 4 months ago

Basically, you can say that the time it takes never goes above the graph of some function scaled by a constant.

A fun side effect of this is that you can call your O(1) algorithm an O(n!) algorithm and be technically correct.
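
Something like this toy check (a sketch with made-up step counts, not a real benchmark) shows why that claim holds:

```python
import math

def steps_constant(n):
    # Pretend our O(1) algorithm always takes exactly 5 "steps",
    # no matter how big the input is.
    return 5

# Pick c = 5 and n0 = 1: for every n >= 1, we have 5 <= 5 * n!,
# so the constant-time algorithm is (uselessly but correctly) O(n!).
c, n0 = 5, 1
for n in range(n0, 10):
    assert steps_constant(n) <= c * math.factorial(n)
print("5 <= 5 * n! holds for every n checked, so O(1) is also O(n!)")
```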

[-] Mikina@programming.dev 2 points 4 months ago* (last edited 4 months ago)

Here is a picture that may help a little bit. The n is the input size, and f(n) is how long the algorithm runs (i.e., how many instructions it takes) for an input of size n. For example, for finding the smallest element in an array, n would be the number of elements in the array. g(n) is then the function you have in the O, so if you have an O(n^2) algorithm, then g(n) = n^2.

Basically, you are looking at how quickly f(n) grows for extreme values of n, while also disregarding constants. The graph representation probably isn't too useful for figuring out the O value, but it can help a little bit with understanding it - you want to find a g(n) where, from one point onward (n0), f(n) stays under g(n) scaled by a constant all the way to infinity.
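
To make that concrete, here is a small sketch (hypothetical Python, counting comparisons by hand) for the find-smallest-element example: f(n) is the number of comparisons, g(n) = n, and with c = 1 and n0 = 1 the bound holds for every n, so the algorithm is O(n).

```python
import random

def find_min_comparisons(arr):
    """Find the smallest element and return how many comparisons it took: f(n) = n - 1."""
    smallest = arr[0]
    comparisons = 0
    for x in arr[1:]:
        comparisons += 1
        if x < smallest:
            smallest = x
    return comparisons

# g(n) = n with c = 1 and n0 = 1: f(n) = n - 1 <= 1 * n for every n >= 1.
c, n0 = 1, 1
for n in range(n0, 1000):
    arr = [random.randint(0, 10**6) for _ in range(n)]
    assert find_min_comparisons(arr) <= c * n
print("f(n) stays under c * g(n) for every n checked, so find-min is O(n)")
```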
