- cross-posted to:
- technology@lemmy.world
Biden administration calls for developers to embrace memory-safe programming languages and move away from those that cause buffer overflows and other memory access vulnerabilities.
I think that’s the point. You can’t trust the average developer to do things safely. And remember, half of all programmers are even worse than average.
Maybe even more!
Wouldn’t that be the median programmer instead of average?
The word “average” can mean many things, for example, mean, median, mode, or even things like “within 1 standard deviation from the mean”.
I was using it strictly as the mean which divides the population exactly in half.
The median is the one that splits a data set in half and picks the middle.
You’re right of course, that was a stupid mistake on my part.
Half of all programmers constitute the so-called “average” group
Yea! I’m one of them!
Which half am I in?
If you have to ask
You know
Yes. And 75% of car drivers believe they are above average as well…
99% of devs believe they are in the top 1%
Bell curves don’t work to make this point. A bell curve is symmetrical, so half of developers will always be below average on a bell curve. But yes, it is true that for other types of distributions, more or less than half of the developers could be below average. What the person above you was looking for, in the general case, would be the median.
The mean is in the center of the bell curve, so I’m not sure what your point is.
What? How would you define “average”? His statement is technically correct.
Average is the mean (i.e. sum of all “skill” divided by the amount of programmers)
What they were thinking of is the median (50th percentile = 0.5 quantile), which splits the group in two equal sized groups.
For a bell curve, they are the same value. But think of the example of average incomes: 9 people have an income of $10, one has an income of $910. The average income is $100 ((10*9+910)/10). The median, however, is $10.
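The income example above can be sketched in a few lines of Rust (a hypothetical illustration, not code from the thread; the `mean` and `median` helpers are my own):

```rust
/// Arithmetic mean: sum of all values divided by the count.
fn mean(xs: &[f64]) -> f64 {
    xs.iter().sum::<f64>() / xs.len() as f64
}

/// Median: the middle value of the sorted data
/// (average of the two middle values when the count is even).
fn median(xs: &[f64]) -> f64 {
    let mut sorted = xs.to_vec();
    sorted.sort_by(|a, b| a.partial_cmp(b).unwrap());
    let mid = sorted.len() / 2;
    if sorted.len() % 2 == 0 {
        (sorted[mid - 1] + sorted[mid]) / 2.0
    } else {
        sorted[mid]
    }
}

fn main() {
    // Nine incomes of $10 and one of $910, as in the comment above.
    let mut incomes = vec![10.0; 9];
    incomes.push(910.0);
    println!("mean   = {}", mean(&incomes));   // prints 100
    println!("median = {}", median(&incomes)); // prints 10
}
```

One outlier drags the mean up tenfold while the median doesn't move, which is exactly why the two disagree for skewed distributions like income.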
The distribution of skill in humans, for various tasks and abilities, can often be approximated by a normal distribution. In that case, as you know, the mean is equal to the median.
Yeah, fair enough
Actually, in order to test that assumption, you’d need to quantitatively measure skill, which is already problematic in itself, but you’d also need to run a statistical test to confirm the distribution is normal/Gaussian. People always forget the latter and often produce incorrect statistical inferences.
Literally.
Or rather a Dunning-Kruger issue: seniors who have spent significant time architecting and debugging complex applications tend to be big proponents of things like Rust.
Guys, C++ is gonna be dead in a couple of years now. Remember this comment…
…and read it again in ten years.
Are you the guy who has been posting this same comment every 10 years over the last half century?
(Edit: is joke)
Where’s /remindme when I need it?
Biden administration has fallen into the Rewrite it in Rust agenda.
Pretty crazy to recommend Java as a secure alternative.
Why? What’s wrong with safe, managed and fast languages?
Java’s runtime has had a large number of CVEs in the last few years, so that’s probably a decent reason to be concerned.
Yep, but:
- it’s one runtime, so patching a CVE patches it for all programs (vs. patching each and every program individually)
- GraalVM is taking care of enabling Java to run on Java
Nothing…
Only that description doesn’t include Java
Nothing really, the JVM has a pretty troubled history that would really make me hesitate to call it “safe”. It was originally built before anyone gave much thought to security, and that fact plagues it to the present day.
and how much of this troubled history is linked to Java applets/native browser extensions, and how much of it is relevant today?
Written in C++
There’s a difference between writing code on a well-tested and broadly used platform implemented in C++ vs. writing new C++.
As you wish. Time to start learning D and D++
Hey girl, would you like my D or D++?
Is that nottheonion?
You mean like android running java which is why everyone and their mom bought Israel’s Pegasus spyware toolkit?
When was the last time you’ve heard of a memory safety issue in Java code? Not the runtime or some native library, raw dogged Java.
Memory safety isn’t a silver bullet, but it practically erases an entire category of bugs.
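To illustrate the category of bugs being erased, here’s a minimal Rust sketch (my own hypothetical example, not from the thread): an out-of-bounds read that would be undefined behavior in C either shows up as a `None` from `get`, or becomes a deterministic panic with indexing.

```rust
fn main() {
    let buf = [0u8; 4]; // four-byte buffer

    // In C, `buf[7]` would silently read past the end of the array
    // (undefined behavior: a classic buffer over-read).
    // Safe Rust bounds-checks every access: `get` returns an Option
    // instead of garbage memory, and `buf[7]` would panic at runtime
    // rather than corrupt or leak adjacent data.
    match buf.get(7) {
        Some(byte) => println!("in bounds: {byte}"),
        None => println!("out-of-bounds access refused"), // this branch runs
    }
}
```

Either way, the program can never read memory it doesn’t own, which is the whole category of bugs (over-reads, over-writes, use-after-free) being referred to.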
Fair point, even log4j was running java code, not literally hijacking the stack or heap.
That being said, I’m poking fun because C and C++ have low-level capabilities for which only Rust offers a complete alternative. Most everything else is safe because it comes packaged with a garbage collector, which affects performance and viability. I think Go technically counts if you set the GC allocation to 0 and use pointers for everything, but might as well use Rust or C at that point.
I guess I’m just complaining that, out of all the issues ONCD could point out, they went after the very broad “memory-safe is always better” when most of the people using C and C++ need the performance. They only offered Rust as a potential alternative in the report, with nothing else, which everyone already knows. Would be nice to see them make a real statement, like telling megacorps to stop using unencrypted SCADA on the internet.
The apps are (sometimes) Java, but the OS is a mix of languages, mostly C and C++. The Java runtime itself is C++.
I love that Android chose Java so they could run it on different processor architectures, but in the end one architecture won out so Java wasn’t necessary any more. I guess they didn’t know at the time, but they’d claw back a tonne of efficiency if they dropped the Java VM.
Java also made it very accessible to the vast majority of existing Java developers.
Way more Java developers than Objective C developers at the time.
I wasn’t a fan of learning Objective C when I started learning just as swift was coming out but too new to use.
Meanwhile the report does not really single out C/C++
What are you talking about? Did you read the report? On page 7 They directly say that C/C++ “lack traits associated with memory safety”.
That’s because government products use many unsafe languages shittier than C(++), like Ada, Fortran, and Cobol. It wouldn’t surprise me if most of the code running on products for government use wasn’t written in C or C++.
When all the talented programmers are gay communists and your entire state exists to murder gay communists. Still can’t forget how Alan Turing, a gay man whose inventions were a gigantic help in winning WW2, KYS’d because they still treated him like garbage even after the fact.
Also like it’s the only source of vulnerabilities… in addition a lot of the trendy python libs are developed in C; do we also ditch those?
It is one of the main sources. Like, actually a very substantial fraction is memory related. I think it was more than 50%, granted those are estimates.
Microsoft and Google both claim around 70%.
https://www.zdnet.com/article/microsoft-70-percent-of-all-security-bugs-are-memory-safety-issues/
Nice. Now I’m waiting for all the Rust (or whatever “safe” language) environments for embedded systems to fall from the sky. And please, some that actually work on small processors with little memory.
Rewrite it in Rust
That’s probably a good idea but I can see some proper longevity issues in that one
People are talking about Java, but the majority of programming languages are memory safe nowadays. Go satisfies this requirement, for example.
Shut up Brandon, you can’t even code. This is just propaganda from Big Rust.
I wanted you to know that I laughed and enjoyed this comment, ignore the haters 💛