Message from C, C++ discussions

November 2019

— A human being couldn't tell time accurately below 100 ms


What I mean to express is that computer science is one of the very few fields where you literally have to deal with relativistic effects. As such, never underestimate time

— You're right, but at some point you start overengineering things

— Most of the time you work with some kind of analogy.
For example, you want to measure some kind of distance between some things; where on earth would you need ultra-perfect precision to, I don't know, the 50th digit?

— At some point you may have to say which of two transactions that arrived within roughly the same nanosecond happened first, and thus who gets to buy something while it's cheap, leaving the other in the dirt

— Practically, it won't matter whether you use pi to 50 digits of accuracy or 100

— If you're doing some kind of manufacturing

— To quote Andrei Alexandrescu: "Write your code like you are going to defend it in court, because that may happen some day"

— Well, floating-point arithmetic has fixed relative precision by design, so its absolute precision degrades once you leave the range -1 < x < 1

— That's not what I'm talking about

— Sometimes accuracy is important

— There are for sure cases where you actually need a number accurate to the 50th digit