If you want to know how computers work, do electrical engineering. If you want to know how electricity works, do physics. If you want to know how physics works, do mathematics. If you want to know how mathematics works, too bad, best you can do is think about the fact it works in philosophy.
all roads lead to philosophy
Everything is philosophy until it becomes science. Unless it's anything to do with politics then it just remains philosophy forever.
Science is a subdiscipline of philosophy.
If you want to know how philosophy works, do sociology...
It's kind of like a horseshoe with philosophy and math at the ends.
If you want to no longer want to know how anything works, do biochemistry
Too real
tbf all good programmers are good at math. Not classic arithmetic necessarily, but at the very least applied calculus. It's a crime how many people use a mathematical discipline every day but don't think they're "good at math" because of how laser-focused the world is on algebra, geometry and trig as being all that "math" is.
Serious question; how does Calculus apply to programming? I’ve never understood.
PID control is the classic example, but at a far enough abstraction any looping algorithm can be argued to be an implementation of the concepts underpinning calculus. If you're ever doing any statistical analysis or anything in game design having to do with motion, those are both calculus too. Data science is pure calculus, ground up and injected into your eyeballs, and any string manipulation or Regex is going to be built on lambda calculus (though a very correct argument can be made that literally all computer science is built on lambda calculus, so that might be cheating to include it)
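To put some (very rough) code to the PID example: in a discrete control loop the I term is just a running sum of the error over time and the D term is the change in error between steps, i.e. integration and differentiation done numerically. Quick Python sketch, with made-up gains and a toy plant, just to show the shape of it:

```python
# Minimal discrete PID loop: the "I" term accumulates error over time (integration),
# the "D" term is the per-step change in error (differentiation).
# Gains, setpoint, and the toy plant below are made-up illustration values.

def pid_step(error, prev_error, integral, dt, kp=1.0, ki=0.1, kd=0.05):
    integral += error * dt                   # running sum of error
    derivative = (error - prev_error) / dt   # rate of change of error
    output = kp * error + ki * integral + kd * derivative
    return output, integral

setpoint, measurement = 10.0, 0.0
integral, prev_error, dt = 0.0, 0.0, 0.1
for _ in range(100):
    error = setpoint - measurement
    control, integral = pid_step(error, prev_error, integral, dt)
    prev_error = error
    measurement += control * dt              # toy "plant": control directly drives the state
print(round(measurement, 2))                 # settles close to the setpoint
```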
Lambda calculus has no relation to calculus calculus, though.
Data science is pure calculus, ground up and injected into your eyeballs
Lol, I like that. I mean, there's more calculus-y things, but it's kind of unusual in that you can't really interpret the non-calculus aspects of a neural net.
Graphics programming is the most obvious one and it uses it plenty, but really any application that can be modeled as a series of discrete changes will most likely be using calculus.
Time series data is the most common form of this, where derivatives are the rate of change from one time step to the next and integrals are summing the changes across a range of time.
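If it helps to see that in code, here's a toy sketch (the sample numbers are invented): the discrete "derivative" of a time series is just the difference between consecutive samples, and "integrating" those differences back up, starting from the initial value, recovers the series.

```python
# Toy time series (made-up numbers).
values = [3, 5, 9, 8, 12]

# Discrete derivative: rate of change from one time step to the next.
changes = [b - a for a, b in zip(values, values[1:])]
print(changes)       # [2, 4, -1, 4]

# Discrete integral: sum the changes across the range, starting from the first value.
rebuilt, total = [values[0]], values[0]
for c in changes:
    total += c
    rebuilt.append(total)
print(rebuilt)       # [3, 5, 9, 8, 12] -- the original series, recovered from its changes
```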
But it can even be more abstract than that. For example, there's a recent-ish paper on applying signal processing techniques (which use calculus themselves, btw) to databases for the purposes of achieving efficient incremental view maintenance: https://arxiv.org/abs/2203.16684
The idea is that a database is a sequence of transactions that apply a set of changes to said database. Integrating gets you the current state of the database by applying all of the changes.
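As a toy version of that idea (just a sketch, not the actual machinery from the linked paper): treat each transaction as a delta that maps a row to +1 (insert) or -1 (delete); "integrating", i.e. summing the deltas in order, gives the current state of the table.

```python
# Toy illustration: database state as the "integral" of its transaction deltas.
# Each delta maps a row key to +1 (insert) or -1 (delete); rows are made up for illustration.
from collections import Counter

transactions = [
    {"alice": +1, "bob": +1},   # txn 1: insert alice and bob
    {"bob": -1},                # txn 2: delete bob
    {"carol": +1},              # txn 3: insert carol
]

state = Counter()
for delta in transactions:      # integrate: apply all of the changes in order
    state.update(delta)

print([row for row, n in state.items() if n > 0])   # ['alice', 'carol']
```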
that can't be right. maybe they meant lambda calculus? programmers are definitely good at applied logic, graph theory, certain kinds of discrete math etc. but you're not whipping out integrals to write a backend.
Any function that relies on change over a domain is reliant on concepts that are fundamentally calculus. Control systems, statistical analysis, data science, absolutely everything in networking that doesn't involve calling people on the phone to convince them to give you their password, that is all calculus.
A senior firmware engineer said to the group that we just have to integrate the acceleration of an IMU to get velocity. I said “plus a constant.” I was fired for it.
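For anyone who missed the joke: summing IMU acceleration samples only gives you the change in velocity over the window; the "plus a constant" is the initial velocity, which the accelerometer alone can't tell you. Tiny sketch with made-up numbers:

```python
# Numerically integrating accelerometer samples (made-up data) only yields the
# *change* in velocity; the constant of integration is the initial velocity.
accel = [0.5, 0.5, 0.0, -0.2, -0.2]      # m/s^2, one sample per time step
dt = 0.01                                 # seconds between samples

delta_v = sum(a * dt for a in accel)      # change in velocity over the window
v0 = 2.0                                  # "plus a constant": the initial velocity
print(round(v0 + delta_v, 3))             # 2.006 m/s
```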
Depends on the context. When my company proposes me to a client for work, I am; but oddly, during my yearly performance review I'm just some schmuck who programs.
You are right man
I’m something of a scientist myself
I mean, nowadays you need to be very smart and educated to google efficiently and avoid all the AI traps, misinformation, stackoverflow mods tripping, reading reddit threads on an issue with half the comments deleted because of the APIcalypse etc... sooo you could argue that you're somewhat of a scientist yourself
Had a discussion with my 8yo niece the other day… turned out the lesson was, sometimes it can be worse to know the wrong thing than to know nothing at all.
Had a graduate Dev who did not have a fucking clue about anything computer related. How tf he passed his degree I have no idea.
Basic programming principles? No clue. Data structures? Nope.
We were once having a discussion about the limitations of transistors and dude's like "what's a transistor?" ~_~#
Tbh, as a dev knowledge of transistors is about as essential as knowledge about screws for a car driver.
It's common knowledge and in general maybe a little shameful to not know, but it's really not in any way relevant for the task at hand.
Maybe for dev knowledge, but computer science? The science of computers?
What kind of CS degree did you get where you learned about electrical circuits? The closest to hardware I've learned is logic circuit diagrams and Verilog.
I mean, I graduated over 20 years ago now, but I had to take a number of EE courses for my CS major. Guess that isn't a thing now, or in a lot of places? Just assumed some level of EE knowledge was required for a CS degree this whole time.
I got my BS in CSci about 15 years ago and it was 100% about programming in java. We didn't learn a fucking thing about hardware and my roommate was an EE major and we had none of the same classes except for calculus.
By the time I graduated java was basically dead. Thanks state college.
Java isn't dead, though
My CS program had virtually no programming outside a couple of courses where C was used to implement concepts. Had one applications type course where mostly Java was used.
CS is and should be a specialized math curriculum IMO. Teaching specific programming languages is time that would be better spent teaching theory that can't be taught by dev docs or code bootcamps, as exemplified by your anecdote. Unfortunately nowadays people tend to see degrees as glorified job training programs.
Well, computer science is not the science of computers, is it? It's about using computers (in the sense of programming them), not about making computers. Making computers is electrical engineering.
We all know how great we IT people are at naming things ;)
Computational theory would be a better name, but it overlaps with a more specific subset of what is normally called CS.
We could also just call it Software Engineering. That's at least the job everyone gets with a Computer Science degree.
I've met people like that too.
It's called cheating, lots of people do it.
Most worthless dev I've met was a comp sci graduate who couldn't hold a candle to a guy who did a dev boot camp.
The best dev I've met so far didn't even have any credentials whatsoever, second next best did 2yr associates.
Tie for 3rd best with associate's and 4yr degree.
looks weird without the cleavage
"Engineer of Information", please 😎
If a C- is enough to pass Analysis of Algorithms, then a Computer Science degree can make me a Computer Scientist. :P
You need C++ for computer science, though!
Be me, a computer scientist who still struggles with XOR.
I literally have no idea what this picture means, and at this point I'm too afraid to ask.
I have been coding since I was 10 years old. I have a CS degree and have been in professional IT for like 30 years. Started as a developer but I’m primarily hardware and architecture now. I have never ever said I was a computer scientist. That just sounds weird.
IT stooge != science. Sorry fellas.
Surely you must be a master of linear algebra and Euclidean geometry
I mean, I am applying various kinds of science, but I'm not actually doing any science, so I don't think of myself as a scientist. What I do is solve problems - I'm an engineer.