The algorithms coded into computers are not in base 2, though. Only operating functions of the computer itself are in base 2.
You don’t code in binary
Everything you code is binary. You may write ‘15’, but the code your computer runs will use ‘00001111’. The base-10 source code is only like that for human readability. All mathematical operations are done in binary, and all numbers are stored in binary. The only time they are not is the exact moment they are converted to text to display to the user.
What do you mean algorithms are not in base 2? What else are they in?
Just because you have human readable code which uses base 10 doesn’t mean it isn’t all translated to binary down the line. If that’s what you’re referring to, of course. Under the hood it’s all binary, and always has been.
Because calculations happen in the form the calculation is written in. The math is done in whatever base the algorithm is told to process in.
Okay, walk me through how you think the code an algorithm is written in gets processed by the computer step by step, please. How do you think a computer operates and is programmed?
Let’s say you have code to tell the computer to calculate 3 + 5, in, say, C, because that’s close to assembly. What happens on the technical level?
I’m sorry, but you seem to be missing the fundamental distinction between what we are talking about.
Then explain to me the fundamentals you are referring to.
Because if you are wondering about how a computer processes information, I can tell you. I can then back it up with sources, and how code gets decoded into assembly, and how assembly is interpreted (spoiler alert, it’s binary, as it’s a direct representation of the digital circuits on the hardware level, which operate based on binary states, on and off). You just have to ask.
I’m really not. Nor am I wondering about how computers run algorithms.
I’m sure you’re excited to talk about CompSci 201 but I don’t need the help, thanks.
…then please stop spreading misinformation about how computers work.
I’m not tho.
Classes are tough but you’ll get there